Conditional Expectations and Variances

Here we look at an important concept that extends from Bayes' Theorem, which we discussed briefly.

The conditional expectation identity (law of total expectation) says \mathbb{E}[X] = \mathbb{E}[\mathbb{E}[X|Y]].
The conditional variance identity (law of total variance) says Var(X) = Var(\mathbb{E}[X|Y]) + \mathbb{E}[Var(X|Y)].

Here \mathbb{E}[X|Y] and Var(X|Y) are both functions of Y and are therefore random variables themselves.
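To see these identities in action, here is a minimal simulation sketch; the particular two-stage model, Y \sim \text{Bernoulli}(0.3) with X|Y=0 \sim N(0,1) and X|Y=1 \sim N(5,4), is an assumption chosen purely for illustration.

import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Two-stage model: Y ~ Bernoulli(0.3); X | Y=0 ~ N(0, 1), X | Y=1 ~ N(5, 4)
y = rng.binomial(1, 0.3, size=n)
mean_given_y = np.where(y == 1, 5.0, 0.0)   # E[X | Y], a function of Y
var_given_y = np.where(y == 1, 4.0, 1.0)    # Var(X | Y), a function of Y
x = rng.normal(mean_given_y, np.sqrt(var_given_y))

# Law of total expectation: E[X] vs E[E[X|Y]]
print(x.mean(), mean_given_y.mean())
# Law of total variance: Var(X) vs Var(E[X|Y]) + E[Var(X|Y)]
print(x.var(), mean_given_y.var() + var_given_y.mean())

Both printed pairs should agree up to Monte Carlo error.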

With this, we start by considering a random sum of random variables. Let W = X_1 + X_2 + \ldots + X_N, where the X_i's are IID with mean \mu_x and variance {\sigma_x}^2, and N is also a random variable, independent of the X_i's.

\mathbb{E}[W]
= \mathbb{E}[\mathbb{E}[W|N]]
= \mathbb{E}[\mathbb{E}[\sum_{i=1}^N X_i | N]]
= \mathbb{E}[N \mu_x]
= \mu_x \mathbb{E}[N]
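For example, if N \sim \text{Poisson}(\lambda), then \mathbb{E}[N] = \lambda and the random sum (a compound Poisson sum) has \mathbb{E}[W] = \lambda \mu_x.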

Var(W)
= Var(\mathbb{E}[W|N]) + \mathbb{E}[Var(W|N)]
= Var(\mu_x N) + \mathbb{E}[N {\sigma_x}^2]
= {\mu_x}^2 Var(N) + {\sigma_x}^2 \mathbb{E}[N]
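As a rough numerical check on both formulas, here is a minimal simulation sketch; the choices N \sim \text{Poisson}(2) and X_i \sim \text{Exponential} with mean 3 are assumptions made purely for illustration.

import numpy as np

rng = np.random.default_rng(1)
trials = 200_000
lam, mu_x = 2.0, 3.0          # E[N] = Var(N) = lam for a Poisson N
sigma_x2 = mu_x ** 2          # variance of an Exponential with mean mu_x

# Draw N for each trial, then sum N IID Exponential(mean mu_x) draws
n = rng.poisson(lam, size=trials)
w = np.array([rng.exponential(mu_x, size=k).sum() for k in n])

# E[W] vs mu_x * E[N]
print(w.mean(), mu_x * lam)
# Var(W) vs mu_x^2 * Var(N) + sigma_x^2 * E[N]
print(w.var(), mu_x ** 2 * lam + sigma_x2 * lam)

The simulated mean and variance of W should match \mu_x \mathbb{E}[N] and {\mu_x}^2 Var(N) + {\sigma_x}^2 \mathbb{E}[N] up to Monte Carlo error.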
