Exercise Session 12, Solutions, December 1st, 2006
Mathematics for Economics and Finance
Prof: Norman Schürhoff
TAs: Zhihua Chen (Cissy), Natalia Guseva

Problem 1 OLS estimator in finite sample. Recall $b = (X'X)^{-1}X'Y$.
1. Find the sampling error. (Hint: $b - \beta$.)
2. Show the OLS estimator is unbiased. (Hint: $E(b \mid X) = \beta$.)
3. Find the variance of $b$ for given $X$.

Solution
1. The sampling error:
$$b - \beta = (X'X)^{-1}X'Y - \beta = (X'X)^{-1}X'(X\beta + \varepsilon) - \beta = \underbrace{(X'X)^{-1}X'X}_{=I_k}\beta + (X'X)^{-1}X'\varepsilon - \beta = (X'X)^{-1}X'\varepsilon.$$
2. Showing $E(b \mid X) = \beta$ is equivalent to showing $E(b - \beta \mid X) = 0$. Set $A = (X'X)^{-1}X'$; then
$$E(b - \beta \mid X) = E(A\varepsilon \mid X) = A\,E(\varepsilon \mid X) = 0,$$
where the second equality holds because $A$ is $X$-measurable, and the third by the assumption $E(\varepsilon \mid X) = 0$.
3.
$$\begin{aligned}
Var(b \mid X) &= Var(b - \beta \mid X) && \text{since } \beta \text{ is the true value: } Var(\beta) = 0,\ Cov(\beta, b) = 0\\
&= Var(A\varepsilon \mid X)\\
&= A\,Var(\varepsilon \mid X)\,A' && A \text{ is } X\text{-measurable}\\
&= A\,E(\varepsilon\varepsilon' \mid X)\,A' && \text{note: } E(\varepsilon \mid X) = 0\\
&= A\,\sigma^2 I_n\,A' && \text{by the homoskedasticity assumption } E(\varepsilon\varepsilon' \mid X) = \sigma^2 I_n\\
&= \sigma^2\,(X'X)^{-1}X'\,[(X'X)^{-1}X']'\\
&= \sigma^2\,(X'X)^{-1}\underbrace{X'X\,[(X'X)']^{-1}}_{=I_k}\\
&= \sigma^2\,(X'X)^{-1}.
\end{aligned}$$
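Both results can be checked by simulation: hold $X$ fixed, redraw $\varepsilon$ many times, and compare the empirical mean and covariance of $b$ with $\beta$ and $\sigma^2(X'X)^{-1}$. The sketch below uses assumed toy values for $n$, $k$, $\beta$, and $\sigma$.

```python
import numpy as np

rng = np.random.default_rng(0)
n, k, sigma = 200, 3, 2.0                 # assumed sample size, regressors, noise sd
X = rng.normal(size=(n, k))               # fixed design matrix
beta = np.array([1.0, -0.5, 2.0])         # assumed true coefficients

# Redraw epsilon repeatedly for the same X and collect the OLS estimates b
reps = 5000
draws = np.empty((reps, k))
for i in range(reps):
    y = X @ beta + sigma * rng.normal(size=n)
    draws[i] = np.linalg.solve(X.T @ X, X.T @ y)   # b = (X'X)^{-1} X'y

print(draws.mean(axis=0))                  # approx beta (unbiasedness)
print(np.cov(draws.T))                     # approx sigma^2 (X'X)^{-1}
print(sigma**2 * np.linalg.inv(X.T @ X))
```

The empirical mean of the draws approaches $\beta$ and their covariance approaches $\sigma^2(X'X)^{-1}$ as the number of replications grows.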
Problem 2 Let $(\Omega, \mathcal{F}, P)$ be a probability space on which two random variables $X$ and $Y$ are defined. If we can observe $X$, what can we say about $Y$? One approach is to minimize the mean square error, $MSE = E(Y - g(X))^2 = \int \{Y(\omega) - g(X(\omega))\}^2\,dP(\omega)$. Show that $g^*(X) = E(Y \mid X)$ is the best solution.

Solution
$$\begin{aligned}
MSE &= E[Y - g(X)]^2 = E[Y - E(Y \mid X) + E(Y \mid X) - g(X)]^2\\
&= E[Y - E(Y \mid X)]^2 + E[E(Y \mid X) - g(X)]^2 + \underbrace{2E\{[Y - E(Y \mid X)][E(Y \mid X) - g(X)]\}}_{(3)}.
\end{aligned}$$
For the cross term,
$$\begin{aligned}
(3) &= 2E\{E[(Y - E(Y \mid X))(E(Y \mid X) - g(X)) \mid X]\} && \text{law of iterated expectations}\\
&= 2E\{(E(Y \mid X) - g(X))\,E[Y - E(Y \mid X) \mid X]\} && E(Y \mid X) - g(X) \text{ is } X\text{-measurable: take out what is known}\\
&= 2E\{(E(Y \mid X) - g(X))\underbrace{(E(Y \mid X) - E(Y \mid X))}_{=0}\}\\
&= 0.
\end{aligned}$$
Therefore
$$MSE = E[Y - E(Y \mid X)]^2 + \underbrace{E[E(Y \mid X) - g(X)]^2}_{\ge 0} \ \ge\ E[Y - E(Y \mid X)]^2.$$
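The claim of Problem 2, that $g^*(X) = E(Y \mid X)$ attains the smallest mean square error, can be checked numerically under an assumed toy model in which $E(Y \mid X) = X^2$; every rival predictor produces a larger empirical MSE.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000
x = rng.uniform(0.0, 1.0, size=n)            # assumed toy distribution for X
y = x**2 + rng.normal(scale=0.3, size=n)     # so E(Y | X) = X**2

def mse(pred):
    return float(np.mean((y - pred) ** 2))

best = mse(x**2)                              # g*(X) = E(Y | X)
rivals = [mse(x), mse(np.full(n, y.mean())), mse(x**2 + 0.5)]
print(best, rivals)                           # best is the smallest
```

The conditional mean's MSE is close to the irreducible noise variance $0.3^2 = 0.09$, while each rival adds the positive term $E[E(Y \mid X) - g(X)]^2$ on top.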
Hence $g^*(X) = E(Y \mid X)$ minimizes the MSE.

Problem 3 Variable $X$ is normally distributed with mean $\mu$ and variance $\sigma^2$.
1. Assume $\sigma^2 = 80$. The observed value of the sample mean $\bar{X}$ of a random sample of size 20 is 81.2. Find a 95% confidence interval for $\mu$.
2. Assume $\sigma^2 = 9$. Find $n$ such that $\Pr(\bar{X} - 1 < \mu < \bar{X} + 1) = 0.9$ approximately.

Solution If $X \sim N(\mu, \sigma^2)$, then $\bar{X} \sim N(\mu, \sigma^2/n)$. Let $F$ denote the standard normal cdf.
1. $$CI_{95\%}(\mu) = \left(\bar{X} - F^{-1}(0.975)\frac{\sigma}{\sqrt{n}},\ \bar{X} + F^{-1}(0.975)\frac{\sigma}{\sqrt{n}}\right) = \left(81.2 - 1.96\frac{\sqrt{80}}{\sqrt{20}},\ 81.2 + 1.96\frac{\sqrt{80}}{\sqrt{20}}\right) = (77.28,\ 85.12).$$
2. $$\Pr(\bar{X} - 1 < \mu < \bar{X} + 1) = 0.9 \implies CI_{90\%}(\mu) = \left(\bar{X} - F^{-1}(0.95)\frac{\sigma}{\sqrt{n}},\ \bar{X} + F^{-1}(0.95)\frac{\sigma}{\sqrt{n}}\right) = (\bar{X} - 1,\ \bar{X} + 1)$$
$$\implies F^{-1}(0.95)\frac{\sigma}{\sqrt{n}} = 1 \implies \sqrt{n} = F^{-1}(0.95)\,\sigma = 1.645 \times 3 = 4.935 \implies n = 4.935^2 \approx 24.35, \text{ i.e. } n \approx 24.$$
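Both intervals can be reproduced with Python's standard library; `statistics.NormalDist` supplies the normal quantiles $F^{-1}(0.975) \approx 1.96$ and $F^{-1}(0.95) \approx 1.645$.

```python
from math import sqrt
from statistics import NormalDist

z = NormalDist()                       # standard normal

# Part 1: 95% CI for mu with sigma^2 = 80, n = 20, xbar = 81.2
half = z.inv_cdf(0.975) * sqrt(80 / 20)
print(81.2 - half, 81.2 + half)        # approx (77.28, 85.12)

# Part 2: n such that the 90% interval is xbar +/- 1, with sigma = 3
n_req = (z.inv_cdf(0.95) * 3) ** 2
print(n_req)                           # approx 24.35, rounded to n ~ 24 above
```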
Problem 4 Show for any two random variables $X$ and $Y$,
$$Var X = E(Var(X \mid Y)) + Var(E(X \mid Y)).$$

Solution By definition, we have
$$\begin{aligned}
Var X &= E(X - EX)^2 = E[\underbrace{X - E(X \mid Y) + E(X \mid Y)}_{\text{add and subtract } E(X \mid Y)} - EX]^2\\
&= E\{(X - E(X \mid Y))^2 + (E(X \mid Y) - EX)^2 + 2(X - E(X \mid Y))(E(X \mid Y) - EX)\}\\
&= \underbrace{E(X - E(X \mid Y))^2}_{(1)} + \underbrace{E(E(X \mid Y) - EX)^2}_{(2)} + \underbrace{2E[(X - E(X \mid Y))(E(X \mid Y) - EX)]}_{(3)} && \text{by linearity.}
\end{aligned}$$
For the cross term,
$$\begin{aligned}
(3) &= 2E\{E[(X - E(X \mid Y))(E(X \mid Y) - EX) \mid Y]\} && \text{law of iterated expectations}\\
&= 2E\{(E(X \mid Y) - EX)\,E[X - E(X \mid Y) \mid Y]\} && EX \text{ constant, } E(X \mid Y)\ Y\text{-measurable: take out what is known}\\
&= 2E\{(E(X \mid Y) - EX)\,[E(X \mid Y) - \underbrace{E(E(X \mid Y) \mid Y)}_{=E(X \mid Y)}]\} && \text{by linearity and taking out what is known}\\
&= 2E\{(E(X \mid Y) - EX)\underbrace{[E(X \mid Y) - E(X \mid Y)]}_{=0}\}\\
&= 0.
\end{aligned}$$
For the first term,
$$\begin{aligned}
(1) &= E[E((X - E(X \mid Y))^2 \mid Y)] && \text{law of iterated expectations}\\
&= E\{E[X^2 - 2X\,E(X \mid Y) + (E(X \mid Y))^2 \mid Y]\}\\
&= E[E(X^2 \mid Y) - 2(E(X \mid Y))^2 + (E(X \mid Y))^2] && \text{by linearity and taking out what is known}\\
&= E[E(X^2 \mid Y) - (E(X \mid Y))^2] = E[Var(X \mid Y)].
\end{aligned}$$
For the second term,
$$\begin{aligned}
(2) &= E[(E(X \mid Y))^2 + (EX)^2 - 2\,EX\,E(X \mid Y)]\\
&= E[(E(X \mid Y))^2] + (EX)^2 - 2\,EX\underbrace{E(E(X \mid Y))}_{=EX}\\
&= E[(E(X \mid Y))^2] - (E(E(X \mid Y)))^2 = Var(E(X \mid Y)).
\end{aligned}$$
Therefore $Var X = E(Var(X \mid Y)) + Var(E(X \mid Y))$.
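The identity can be verified numerically on a toy mixture (assumed here: $Y$ uniform on $\{0,1,2\}$ and $X \mid Y = y \sim N(y, (1+y)^2)$). With population-style (ddof=0) variances and group weights equal to sample proportions, the within/between decomposition holds exactly in the sample.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 1_000_000
y = rng.integers(0, 3, size=n)          # Y uniform on {0, 1, 2} (assumed)
x = rng.normal(loc=y, scale=1.0 + y)    # X | Y = y ~ N(y, (1+y)^2)

p = np.array([(y == j).mean() for j in range(3)])                 # group weights
within = sum(p[j] * x[y == j].var() for j in range(3))            # E[Var(X|Y)]
between = sum(p[j] * (x[y == j].mean() - x.mean())**2 for j in range(3))  # Var(E(X|Y))
print(x.var(), within + between)        # identical up to float error
```

Here the population values are $E(Var(X \mid Y)) = (1 + 4 + 9)/3 = 14/3$ and $Var(E(X \mid Y)) = 2/3$, which the two sample terms approximate.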
Problem 5 Suppose the distribution of $Y$ conditional on $X = x$ is $N(x, x^2)$ and the marginal distribution of $X$ is uniform on $(0, 1)$. Find $EY$, $Var Y$, and $Cov(X, Y)$.

Solution Note the pdf of $X$ is $f_X(x) = 1$ on $(0, 1)$, because $X$ is uniform on $(0, 1)$.

By the law of iterated expectations,
$$EY = E(\underbrace{E(Y \mid X)}_{=X}) = EX = \int_0^1 x\,f_X(x)\,dx = \int_0^1 x\,dx = \frac{x^2}{2}\Big|_0^1 = \frac{1}{2}.$$

Using the Problem 4 result,
$$\begin{aligned}
Var Y &= E(\underbrace{Var(Y \mid X)}_{=X^2}) + Var(\underbrace{E(Y \mid X)}_{=X})\\
&= EX^2 + Var X = EX^2 + EX^2 - (EX)^2 = 2\,EX^2 - (EX)^2\\
&= 2\int_0^1 x^2 f_X(x)\,dx - \frac{1}{4} = 2\,\frac{x^3}{3}\Big|_0^1 - \frac{1}{4} = \frac{2}{3} - \frac{1}{4} = \frac{5}{12}.
\end{aligned}$$

Finally,
$$\begin{aligned}
Cov(X, Y) &= E(XY) - EX\,EY = E(E(XY \mid X)) - \frac{1}{2}\cdot\frac{1}{2} && \text{law of iterated expectations}\\
&= E(X\underbrace{E(Y \mid X)}_{=X}) - \frac{1}{4} && \text{take out what is known}\\
&= EX^2 - \frac{1}{4} = \int_0^1 x^2 f_X(x)\,dx - \frac{1}{4} = \frac{1}{3} - \frac{1}{4} = \frac{1}{12}.
\end{aligned}$$
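All three moments can be confirmed by Monte Carlo, drawing $X$ uniform on $(0,1)$ and then $Y \mid X = x \sim N(x, x^2)$ (only the simulation size is an assumption here).

```python
import numpy as np

rng = np.random.default_rng(3)
n = 2_000_000                          # assumed simulation size
x = rng.uniform(0.0, 1.0, size=n)
y = rng.normal(loc=x, scale=x)         # Y | X = x ~ N(x, x^2)

print(y.mean())                        # approx 1/2
print(y.var())                         # approx 5/12
print(np.cov(x, y)[0, 1])              # approx 1/12
```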