Mathematical expectation & decision making
This post covers introductory probability material from Statistics for Engineers and Scientists by William Navidi.
Basic Ideas
Expectation
The mean of $X$ is sometimes called the expectation, or expected value, of $X$ and
may also be denoted by $E(X)$ or by $\mu$.
A conditional expectation is an expectation, or mean, calculated using a conditional probability mass function or conditional probability density function. The conditional expectation of $Y$ given $X = x$ is denoted $E(Y \mid X = x)$ or $\mu_{Y \mid X=x}$.
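As a quick illustration, here is a minimal Python sketch that computes a conditional expectation from a small joint probability mass function; the pmf values are made up for the example:

```python
# Minimal sketch: conditional expectation from a (hypothetical) joint pmf.
joint_pmf = {
    (0, 0): 0.10, (0, 1): 0.20,
    (1, 0): 0.30, (1, 1): 0.40,
}

def marginal_x(x):
    """p_X(x): sum of p(x, y) over all y."""
    return sum(p for (xi, y), p in joint_pmf.items() if xi == x)

def cond_expectation_y_given_x(x):
    """E(Y | X = x): sum of y * p(x, y) / p_X(x) over all y."""
    px = marginal_x(x)
    return sum(y * p / px for (xi, y), p in joint_pmf.items() if xi == x)

print(cond_expectation_y_given_x(0))  # 0.20 / 0.30 ≈ 0.667
print(cond_expectation_y_given_x(1))  # 0.40 / 0.70 ≈ 0.571
```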
Proof that \(\sigma^2_{aX+b} = a^2\sigma^2_{X}\)
We will use the notation $E(X)$ interchangeably with $\mu_{X}$, $E(Y)$ interchangeably with $\mu_{Y}$, and so forth. Let $Y = aX + b$. Then $\mu_Y = a\mu_X + b$, so
\(\begin{align*} \sigma^2_{aX+b} &= E[(Y - \mu_Y)^2]\\ &= E[(aX + b - a\mu_X - b)^2]\\ &= E[a^2(X - \mu_X)^2]\\ &= a^2E[(X - \mu_X)^2]\\ &= a^2\sigma^2_X\\ \end{align*}\)
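The scaling property is easy to sanity-check by simulation; in the sketch below, $a = 3$, $b = 5$, and the distribution of $X$ are arbitrary choices for the demo:

```python
import random

# Sanity check of Var(aX + b) = a^2 Var(X) by simulation.
random.seed(0)
a, b = 3.0, 5.0
xs = [random.gauss(10.0, 2.0) for _ in range(100_000)]  # X ~ N(10, 2^2)

def var(sample):
    m = sum(sample) / len(sample)
    return sum((v - m) ** 2 for v in sample) / len(sample)

print(var([a * x + b for x in xs]))  # close to a^2 * Var(X) = 9 * 4 = 36
print(a**2 * var(xs))                # same quantity, computed directly
```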
Proof that $\sigma^2_{aX+bY} = a^2\sigma^2_X + b^2\sigma^2_Y + 2ab Cov(X,Y)$
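Expanding the square and using linearity of expectation, with $\mu_{aX+bY} = a\mu_X + b\mu_Y$:
\(\begin{align*} \sigma^2_{aX+bY} &= E[(aX + bY - a\mu_X - b\mu_Y)^2]\\ &= E[a^2(X - \mu_X)^2 + b^2(Y - \mu_Y)^2 + 2ab(X - \mu_X)(Y - \mu_Y)]\\ &= a^2E[(X - \mu_X)^2] + b^2E[(Y - \mu_Y)^2] + 2abE[(X - \mu_X)(Y - \mu_Y)]\\ &= a^2\sigma^2_X + b^2\sigma^2_Y + 2ab Cov(X,Y)\\ \end{align*}\)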
Proof that $E[(X - \mu_X)(Y - \mu_Y)] = \mu_{XY} - \mu_X\mu_Y$
\(\begin{align*} E[(X - \mu_X)(Y - \mu_Y)] &= E(XY - X\mu_Y - Y\mu_X + \mu_X\mu_Y)\\ &= E(XY) - E(X\mu_Y) - E(Y\mu_X) + E(\mu_X\mu_Y)\\ &= E(XY) - \mu_Y E(X) - \mu_X E(Y) + \mu_X\mu_Y\\ &= \mu_{XY} - \mu_Y\mu_X - \mu_X\mu_Y + \mu_X\mu_Y\\ &= \mu_{XY} - \mu_X\mu_Y\\ \end{align*}\)
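The identity can also be checked numerically. In this sketch the dependence $Y = X + \text{noise}$ is an arbitrary choice; on sample data the direct and shortcut computations agree up to floating-point error:

```python
import random

random.seed(1)
n = 100_000
xs = [random.gauss(0.0, 1.0) for _ in range(n)]
ys = [x + random.gauss(0.0, 0.5) for x in xs]  # Y depends on X

mean = lambda s: sum(s) / len(s)
mx, my = mean(xs), mean(ys)

# E[(X - mu_X)(Y - mu_Y)] computed directly...
cov_direct = mean([(x - mx) * (y - my) for x, y in zip(xs, ys)])
# ...and via the shortcut E(XY) - mu_X * mu_Y
cov_shortcut = mean([x * y for x, y in zip(xs, ys)]) - mx * my

print(cov_direct, cov_shortcut)  # both close to Var(X) = 1
```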
Proof that if $X$ and $Y$ are independent then $X$ and $Y$ are uncorrelated
Let $X$ and $Y$ be independent random variables. We will show that $\mu_{XY} = \mu_X \mu_Y$, from which it will follow that $Cov(X,Y) = 0$ and therefore $\rho_{X,Y} = 0$.
We will assume that $X$ and $Y$ are jointly discrete. Since they are independent, the joint probability mass function is equal to the product of the marginals: $p(x,y) = p_X(x)p_Y(y)$. Then
\(\begin{align*} \mu_{XY} &= \sum_x \sum_y xy\,p(x,y)\\ &= \sum_x \sum_y xy\,p_X(x)p_Y(y)\\ &= \left(\sum_x x\,p_X(x)\right)\left(\sum_y y\,p_Y(y)\right)\\ &= \mu_X\mu_Y\\ \end{align*}\)
Therefore $Cov(X,Y) = \mu_{XY} - \mu_X\mu_Y = 0$, and since $\rho_{X,Y} = Cov(X,Y)/(\sigma_X\sigma_Y)$, it follows that $\rho_{X,Y} = 0$ as well.
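A simulation sketch of the same fact: when two samples are generated independently, the sample covariance should be near zero. The particular distributions (uniform and exponential) are arbitrary choices for the demo:

```python
import random

random.seed(2)
n = 200_000
xs = [random.uniform(0.0, 1.0) for _ in range(n)]
ys = [random.expovariate(1.0) for _ in range(n)]  # generated independently of X

mean = lambda s: sum(s) / len(s)
cov = mean([x * y for x, y in zip(xs, ys)]) - mean(xs) * mean(ys)
print(cov)  # close to 0, since mu_XY = mu_X * mu_Y for independent X, Y
```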