All of the techniques above assume a perfect model of the system. In practice, uncertainties in the model and the measurements can result in erroneous control calculations and undesired system behavior.
with measurements
$$ Y = CX . $$
In the presence of uncertainty, the measurements instead take the form
$$ y = C X + g(X,u) . $$
Bayesian inference is based on Bayes' theorem, and gives the probability of an event based on features that may be related to that event.
Say we have a test for cancer that is 99% accurate, i.e. given that a person has cancer, the test returns a positive result 99% of the time. This is also referred to as the true-positive rate, or sensitivity. Further, if a person does not have cancer, the test returns a negative result 98% of the time; that is, the true-negative rate, or specificity, is 98%. Knowing that cancer is a rare disease and only 1.5% of the population has it, if the test comes back positive for some individual, what is the probability that the individual actually has cancer? Bayes' rule is given by,
$$ P(A|B) = \frac{P(B|A) P(A)}{P(B|A) P(A) + P(B| \neg A) P(\neg A)} . $$
We wish to determine \( P(C|+) \), where C stands for cancer, NC stands for no cancer, − stands for a negative test result, and + stands for a positive test result. The given information about the cancer and the test can now be written as,
$$ P(+|C) = 0.99, \quad P(-|NC) = 0.98, \quad P(C) = 0.015 , $$
so that \( P(+|NC) = 1 - P(-|NC) = 0.02 \) and \( P(NC) = 1 - P(C) = 0.985 \).
Using Bayes' rule,
$$ P(C|+) = \frac{P(+|C) P(C)}{P(+|C) P(C) + P(+| NC) P(NC)} . $$
Note: the process of multiplying functions and adding them up is also called convolution.
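Substituting the given rates (sensitivity \( P(+|C) = 0.99 \), false-positive rate \( P(+|NC) = 0.02 \), prevalence \( P(C) = 0.015 \), and \( P(NC) = 0.985 \)) into Bayes' rule gives

$$ P(C|+) = \frac{0.99 \times 0.015}{0.99 \times 0.015 + 0.02 \times 0.985} = \frac{0.01485}{0.03455} \approx 0.43 . $$

That is, even after a positive test, the probability that the individual actually has cancer is only about 43%, because the disease is rare and false positives from the healthy majority outnumber true positives.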
The normal (Gaussian) probability distribution is given by,
$$ f(X| \mu, \sigma^2) = \frac{1}{\sqrt{2 \pi \sigma^2}} e^{ - \frac{(X-\mu)^2}{2 \sigma^2}} , $$
where \( \mu \) is the mean and \( \sigma \) is the standard deviation of the distribution.
clc; close all; clearvars
X = -10:0.01:10;    % evaluation grid
mu = 0;             % mean
sigma = 1.25;       % standard deviation
f_x = 1/sqrt(2*pi*sigma^2) * exp( -(X-mu).^2/(2*sigma^2) );   % normal pdf
figure;
plot(X,f_x)
axis([-10 10 0 .4])
xlabel('X')
ylabel('Probability distribution')
Consider two normal probability distributions given by,
$$ f(X| \mu_1, \sigma_1^2) = \frac{1}{\sqrt{2 \pi \sigma_1^2}} e^{ - \frac{(X-\mu_1)^2}{2 \sigma_1^2}} $$
$$ f(X| \mu_2, \sigma_2^2) = \frac{1}{\sqrt{2 \pi \sigma_2^2}} e^{ - \frac{(X-\mu_2)^2}{2 \sigma_2^2}} $$
The Bayesian probability is given by,
$$ f((X| \mu_1, \sigma_1^2)|(X| \mu_2, \sigma_2^2)) = \frac{f(X| \mu_1, \sigma_1^2) f(X| \mu_2, \sigma_2^2)}{\int_{-\infty}^{\infty}f(X| \mu_1, \sigma_1^2) f(X| \mu_2, \sigma_2^2)\,dX} , $$
with numerator
$$ f(X| \mu_1, \sigma_1^2) f(X| \mu_2, \sigma_2^2) = \frac{1}{\sqrt{2 \pi \sigma_1^2}} e^{ - \frac{(X-\mu_1)^2}{2 \sigma_1^2}} \frac{1}{\sqrt{2 \pi \sigma_2^2}} e^{ - \frac{(X-\mu_2)^2}{2 \sigma_2^2}} . $$
After significant algebra, the final conditional probability can be represented as,
$$ f(X| \mu_1, \sigma_1^2) | f(X| \mu_2, \sigma_2^2) = f(X| \mu_{12}, \sigma_{12}^2) = \frac{1}{\sqrt{2 \pi \sigma_{12}^2}} e^{ - \frac{(X-\mu_{12})^2}{2 \sigma_{12}^2}} , $$where,
$$ \sigma_{12}^2 = \frac{\sigma_1^2 \sigma_2^2}{\sigma_1^2+\sigma_2^2} $$
and
$$ \mu_{12} = \frac{\sigma_1^2 \mu_2 + \sigma_2^2 \mu_1}{\sigma_1^2+\sigma_2^2} . $$
For example, with \( \mu_1 = 0.8 \), \( \sigma_1 = 2 \), \( \mu_2 = 0.1 \), and \( \sigma_2 = 4 \),
$$ \mu_{12} = \frac{\sigma_1^2 \mu_2 + \sigma_2^2 \mu_1}{\sigma_1^2+\sigma_2^2} = 0.66 . $$
clc; close all; clearvars
X = -10:0.01:10;
mu1 = .8;  sigma1 = 2;   % first distribution
mu2 = .1;  sigma2 = 4;   % second distribution
mu12 = (sigma1^2*mu2 + sigma2^2*mu1)/(sigma1^2 + sigma2^2);   % combined mean
sigma12 = sqrt((sigma1^2*sigma2^2)/(sigma1^2 + sigma2^2));    % combined standard deviation
display(['\mu_{12} = ' num2str(mu12) ', \sigma_{12} = ' num2str(sigma12)] )
f1 = 1/sqrt(2*pi*sigma1^2) * exp( -(X-mu1).^2/(2*sigma1^2) );
f2 = 1/sqrt(2*pi*sigma2^2) * exp( -(X-mu2).^2/(2*sigma2^2) );
f12 = 1/sqrt(2*pi*sigma12^2) * exp( -(X-mu12).^2/(2*sigma12^2) );
figure;
plot(X,f12,X,f1,'g--',X,f2,'g--');axis([-10 10 0 .4])
xlabel('X');ylabel('Probability distribution')
\mu_{12} = 0.66, \sigma_{12} = 1.7889
As another example, with \( \mu_1 = -4 \), \( \mu_2 = 4 \), and \( \sigma_1 = \sigma_2 = 2 \),
$$ \mu_{12} = \frac{\sigma_1^2 \mu_2 + \sigma_2^2 \mu_1}{\sigma_1^2+\sigma_2^2} = 0 . $$
clc; close all; clearvars
X = -10:0.01:10;
mu1 = -4;  sigma1 = 2;   % first distribution
mu2 = 4;   sigma2 = 2;   % second distribution, equal variance
mu12 = (sigma1^2*mu2 + sigma2^2*mu1)/(sigma1^2 + sigma2^2);   % combined mean
sigma12 = sqrt((sigma1^2*sigma2^2)/(sigma1^2 + sigma2^2));    % combined standard deviation
display(['\mu_{12} = ' num2str(mu12) ', \sigma_{12} = ' num2str(sigma12)] )
f1 = 1/sqrt(2*pi*sigma1^2) * exp( -(X-mu1).^2/(2*sigma1^2) );
f2 = 1/sqrt(2*pi*sigma2^2) * exp( -(X-mu2).^2/(2*sigma2^2) );
f12 = 1/sqrt(2*pi*sigma12^2) * exp( -(X-mu12).^2/(2*sigma12^2) );
figure;
plot(X,f12,X,f1,'g--',X,f2,'g--');axis([-10 10 0 .4])
xlabel('X');ylabel('Probability distribution')
\mu_{12} = 0, \sigma_{12} = 1.4142
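Note that in both examples the combined variance is smaller than either original variance. From the formula for \( \sigma_{12}^2 \),

$$ \sigma_{12}^2 = \frac{\sigma_1^2 \sigma_2^2}{\sigma_1^2+\sigma_2^2} < \min(\sigma_1^2, \sigma_2^2) , $$

so combining two Gaussian distributions always yields a narrower, more certain distribution; e.g. with \( \sigma_1 = \sigma_2 = 2 \), \( \sigma_{12} = \sqrt{16/8} = \sqrt{2} \approx 1.4142 \), matching the output above.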
Multivariate distributions involve multiple, possibly correlated, Gaussian random variables. The multivariate normal probability distribution over \( K \) variables is given by,
$$ f(\mathbf{X}| \boldsymbol{\mu}, \mathbf{\Sigma}) = \frac{1}{\sqrt{(2 \pi)^K |\mathbf{\Sigma}|}} e^{ - \frac{1}{2} (\mathbf{X}-\boldsymbol{\mu})^T \mathbf{\Sigma}^{-1} (\mathbf{X}-\boldsymbol{\mu})} , $$
where \( \boldsymbol{\mu} \) is the mean vector and \( \mathbf{\Sigma} \) is the \( K \times K \) covariance matrix. In the special case when \( K = 2 \), we have
$$ \mathbf{\Sigma} = \left[ \begin{array}{cc} \sigma_X^2 & \sigma_{X,Y} \\ \sigma_{X,Y} & \sigma_Y^2 \end{array} \right] $$
where \( \sigma_{X,Y} \) is the covariance (second cross moment) between the X and Y variables.
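As an illustrative sketch, a bivariate (\( K = 2 \)) normal density with a chosen covariance matrix can be evaluated and plotted on a grid. The numbers below are hypothetical, and `mvnpdf` assumes the Statistics and Machine Learning Toolbox is available:

```matlab
% Illustrative bivariate normal (K = 2); the numeric values are hypothetical
mu    = [0 0];          % mean vector
Sigma = [4 1; 1 9];     % [sigma_X^2 sigma_XY; sigma_XY sigma_Y^2]
[X1,X2] = meshgrid(-10:0.1:10, -10:0.1:10);
F = mvnpdf([X1(:) X2(:)], mu, Sigma);   % evaluate the density on the grid
F = reshape(F, size(X1));
figure;
surf(X1, X2, F, 'EdgeColor', 'none')
xlabel('X'); ylabel('Y'); zlabel('Probability distribution')
```

The off-diagonal entry \( \sigma_{X,Y} \) tilts the elliptical contours of the surface, while the diagonal entries set the spread along each axis.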
The linear regression problem can be reframed as follows,