champdean
09 มีนาคม 2010, 17:53
Let X be a discrete random variable which takes k possible values from {x1,x2,...,xk}.
Let Y be a discrete random variable which takes j possible values from {y1,y2,...,yj}.
The joint probability function of X and Y is given by P(x,y).
Q1: Let g(X) and h(Y) be functions that depend on X and Y respectively. Prove that
$E[g(X)+h(Y)]$ = $E[g(X)]+E[h(Y)]$
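A sketch of Q1 (not necessarily the intended solution): expand the expectation over the joint distribution, then collapse to the marginals $p(x)=\sum_y P(x,y)$ and $p(y)=\sum_x P(x,y)$:

```latex
\begin{align*}
E[g(X)+h(Y)] &= \sum_x \sum_y \bigl(g(x)+h(y)\bigr)P(x,y) \\
  &= \sum_x \sum_y g(x)P(x,y) + \sum_x \sum_y h(y)P(x,y) \\
  &= \sum_x g(x)\sum_y P(x,y) + \sum_y h(y)\sum_x P(x,y) \\
  &= \sum_x g(x)p(x) + \sum_y h(y)p(y) \\
  &= E[g(X)] + E[h(Y)].
\end{align*}
```

The key step is that summing the joint probability P(x,y) over one variable yields the marginal distribution of the other.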
Q2: Let a and b be two constant numbers. Construct a new random variable W as
$$W = aX + bY$$
Prove that
$$\sigma ^2_w= a^2\sigma ^2_x + b^2\sigma ^2_y + 2abCov(X,Y)$$
where $\sigma ^2_w, \sigma ^2_x$, and $\sigma ^2_y$ are the variances of W, X, and Y respectively.
Given that $$E[g(X)]=\sum_x g(x)p(x)$$, $$Var(X) = \sum_x (x-\mu )^2p(x)$$. Any help would be appreciated.
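A sketch of Q2, assuming the Q1 result (linearity of expectation), so that $\mu_w = E[W] = a\mu_x + b\mu_y$:

```latex
\begin{align*}
\sigma^2_w &= E\bigl[(W-\mu_w)^2\bigr]
            = E\Bigl[\bigl(a(X-\mu_x)+b(Y-\mu_y)\bigr)^2\Bigr] \\
  &= a^2 E\bigl[(X-\mu_x)^2\bigr] + b^2 E\bigl[(Y-\mu_y)^2\bigr]
     + 2ab\, E\bigl[(X-\mu_x)(Y-\mu_y)\bigr] \\
  &= a^2\sigma^2_x + b^2\sigma^2_y + 2ab\,Cov(X,Y).
\end{align*}
```

The middle step expands the square and applies linearity (Q1) to each of the three terms; the cross term is the definition of $Cov(X,Y)$.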