Mathematics for Economics and Finance
Practice Questions III. Solutions.
Instructor: Norman Schürhoff
Teaching Assistants: Zhihua (Cissy) Chen, Natalia Guseva
Session: Fall 2006
Date: December 4, 2006

Problem 1 Suppose a monopolist's marginal revenue is $6 - 2x$, where $x$ denotes sales. Find the total revenue and the demand function faced by the monopolist.

Solution
Total revenue: $R = \int (6 - 2x)\,dx = 6x - x^2$.

Demand function: Total revenue = price $\times$ sales, so

$p \cdot x = 6x - x^2 = x(6 - x) \;\Rightarrow\; \text{Price}(x) = 6 - x \;\Rightarrow\; \text{Demand}(p) = 6 - p.$
Problem 2 Three machines A, B, C produce resp. 50%, 30%, 20% of the total number of items of a factory. The percentages of defective items produced by the machines A, B, C are resp. 3%, 4%, and 5%.
1. If an item is selected at random, find the probability that the item is defective.
2. Suppose an item is selected at random and is found to be defective. Find the probability that the item was produced by machine A.

Solution

$P(D) = P(D \cap A) + P(D \cap B) + P(D \cap C)$
$= P(D \mid A)P(A) + P(D \mid B)P(B) + P(D \mid C)P(C)$
$= 3\% \cdot 50\% + 4\% \cdot 30\% + 5\% \cdot 20\% = 3.7\%.$

By Bayes' rule,

$P(A \mid D) = \frac{P(D \mid A)P(A)}{P(D \mid A)P(A) + P(D \mid B)P(B) + P(D \mid C)P(C)} = \frac{1.5\%}{3.7\%} = 40.54\%.$
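The two probabilities can be verified exactly with rational arithmetic (a minimal sketch using Python's standard library `fractions` module):

```python
from fractions import Fraction as F

# Machine shares and defect rates from the problem statement.
share = {"A": F(50, 100), "B": F(30, 100), "C": F(20, 100)}
defect = {"A": F(3, 100), "B": F(4, 100), "C": F(5, 100)}

# Law of total probability: P(D) = sum of P(D|machine) * P(machine).
p_d = sum(defect[m] * share[m] for m in share)
assert p_d == F(37, 1000)            # 3.7%

# Bayes' rule: probability the defective item came from machine A.
p_a_given_d = defect["A"] * share["A"] / p_d
assert p_a_given_d == F(15, 37)      # ~40.54%
```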
Problem 3 Pick a card from a deck of 52. Let A be the event that "the card is an ace" and let B be the event that "the card is a spade". Are events A and B independent?

Solution $P(A) = P(\text{the card is an ace}) = \frac{4}{52} = \frac{1}{13}$; $P(B) = P(\text{the card is a spade}) = \frac{13}{52} = \frac{1}{4}$;
$P(A \cap B) = P(\text{the card is the ace of spades}) = \frac{1}{52}$. Since $P(A \cap B) = P(A)P(B)$, the events are independent.
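Independence can also be confirmed by brute-force enumeration of the deck (a sketch with illustrative names, standard library only):

```python
from fractions import Fraction as F
from itertools import product

ranks = ["A", "2", "3", "4", "5", "6", "7", "8", "9", "10", "J", "Q", "K"]
suits = ["spades", "hearts", "diamonds", "clubs"]
deck = list(product(ranks, suits))
assert len(deck) == 52

def prob(event):
    # Exact probability under the uniform distribution on the deck.
    return F(sum(1 for c in deck if event(c)), len(deck))

p_ace = prob(lambda c: c[0] == "A")
p_spade = prob(lambda c: c[1] == "spades")
p_both = prob(lambda c: c[0] == "A" and c[1] == "spades")

assert p_ace == F(1, 13) and p_spade == F(1, 4)
assert p_both == p_ace * p_spade      # exact equality => A and B independent
```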
Problem 4 The pdf of the normal distribution with mean $\mu$ and variance $\sigma^2$ is given by

$f(x; \mu, \sigma^2) = \frac{1}{\sigma\sqrt{2\pi}}\, e^{-\frac{(x-\mu)^2}{2\sigma^2}}, \quad -\infty < x < \infty.$

1. If $X \sim N(\mu, \sigma^2)$, show that the random variable $Z = \frac{X-\mu}{\sigma}$ is standard normal.
2. Show $\int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}}\, e^{-\frac{z^2}{2}}\,dz = 1$.

Solution (1)

$P(Z \le z) = P\left(\frac{X-\mu}{\sigma} \le z\right) = P(X \le \sigma z + \mu) = \int_{-\infty}^{\sigma z + \mu} \frac{1}{\sigma\sqrt{2\pi}}\, e^{-\frac{(x-\mu)^2}{2\sigma^2}}\,dx.$

Set $t = \frac{x-\mu}{\sigma}$; then $dt = \frac{1}{\sigma}\,dx$, so

$P(Z \le z) = \int_{-\infty}^{z} \frac{1}{\sqrt{2\pi}}\, e^{-\frac{t^2}{2}}\,dt,$

which is the standard normal cdf. Moreover,

$E(Z) = \int_{-\infty}^{\infty} z\, \frac{1}{\sqrt{2\pi}}\, e^{-\frac{z^2}{2}}\,dz = -\frac{1}{\sqrt{2\pi}}\, e^{-\frac{z^2}{2}} \Big|_{-\infty}^{\infty} = 0,$

$Var(Z) = \int_{-\infty}^{\infty} z^2\, \frac{1}{\sqrt{2\pi}}\, e^{-\frac{z^2}{2}}\,dz \overset{\text{(integration by parts)}}{=} \underbrace{-\frac{z}{\sqrt{2\pi}}\, e^{-\frac{z^2}{2}} \Big|_{-\infty}^{\infty}}_{=\,0 \text{ (L'Hopital's rule)}} + \underbrace{\int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}}\, e^{-\frac{z^2}{2}}\,dz}_{=\,1 \text{ (shown in (2))}} = 0 + 1 = 1.$

(2) In Exercise Session 9, Exercise 1, point (b) we showed that $\int_{0}^{\infty} e^{-\frac{z^2}{2}}\,dz = \sqrt{\frac{\pi}{2}}$. The function $e^{-\frac{z^2}{2}}$ is symmetric, hence $\int_{-\infty}^{0} e^{-\frac{z^2}{2}}\,dz = \int_{0}^{\infty} e^{-\frac{z^2}{2}}\,dz$, and

$\int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}}\, e^{-\frac{z^2}{2}}\,dz = \frac{1}{\sqrt{2\pi}} \left( \int_{-\infty}^{0} e^{-\frac{z^2}{2}}\,dz + \int_{0}^{\infty} e^{-\frac{z^2}{2}}\,dz \right) = \frac{1}{\sqrt{2\pi}} \left( \sqrt{\frac{\pi}{2}} + \sqrt{\frac{\pi}{2}} \right) = 1.$
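The three standard-normal identities (total mass 1, mean 0, variance 1) can be checked numerically by trapezoidal integration on a wide interval, where the Gaussian tails are negligible (a sketch, standard library only):

```python
import math

def trapezoid(f, a, b, n):
    # Composite trapezoidal rule on [a, b] with n subintervals.
    h = (b - a) / n
    return h * (0.5 * (f(a) + f(b)) + sum(f(a + i * h) for i in range(1, n)))

phi = lambda z: math.exp(-z * z / 2) / math.sqrt(2 * math.pi)   # standard normal pdf

total = trapezoid(phi, -10, 10, 200_000)                        # should be ~1
mean = trapezoid(lambda z: z * phi(z), -10, 10, 200_000)        # should be ~0
var = trapezoid(lambda z: z * z * phi(z), -10, 10, 200_000)     # should be ~1

assert abs(total - 1) < 1e-6
assert abs(mean) < 1e-6
assert abs(var - 1) < 1e-6
```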
Problem 5 Standard quantitative models of the stock market assume the stock return follows a log-normal distribution. If $\log X \sim N(\mu, \sigma^2)$, $0 < X < \infty$, $-\infty < \mu < \infty$, $\sigma^2 > 0$:
1. Find the pdf of $X$.
2. Compute $E(X)$, $Var(X)$.

Solution (1) Set $Y = g(X) = \log X$; then $X = g^{-1}(Y)$, and we have:

$F_X(x) = P(X \le x) = P(g^{-1}(Y) \le x) \overset{\text{increasing monotone transform}}{=} P(g(g^{-1}(Y)) \le g(x)) = P(Y \le g(x)) = F_Y(g(x)).$

For the pdf, by the chain rule:

$f_X(x) = \frac{d}{dx} F_X(x) = \frac{d}{dx} F_Y(g(x)) = f_Y(g(x)) \cdot \frac{d}{dx} g(x) = f_Y(\log x) \cdot \frac{1}{x} = \frac{1}{\sigma x \sqrt{2\pi}}\, e^{-\frac{(\log x - \mu)^2}{2\sigma^2}}.$

(2) Using the moment generating function of $Y \sim N(\mu, \sigma^2)$:

$E(X) = E(e^{\log X}) = E(e^{Y}) = e^{\mu + \frac{\sigma^2}{2}},$

$Var(X) = E(X^2) - (EX)^2 = E(e^{2 \log X}) - (EX)^2 = E(e^{2Y}) - (EX)^2 = e^{2\mu + 2\sigma^2} - e^{2\mu + \sigma^2}.$
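The closed form $E(X) = e^{\mu + \sigma^2/2}$ can be checked against a direct numerical evaluation of $E(e^Y) = \int e^{y} f_Y(y)\,dy$ (a sketch with illustrative parameter values, standard library only):

```python
import math

mu, sigma = 0.0, 0.5    # illustrative parameters for log X ~ N(mu, sigma^2)

def integrand(y):
    # e^y times the N(mu, sigma^2) density.
    pdf = math.exp(-(y - mu) ** 2 / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))
    return math.exp(y) * pdf

a, b, n = -10.0, 10.0, 400_000
h = (b - a) / n
numeric_mean = h * (0.5 * (integrand(a) + integrand(b))
                    + sum(integrand(a + i * h) for i in range(1, n)))

closed_form = math.exp(mu + sigma ** 2 / 2)    # e^{0.125}
assert abs(numeric_mean - closed_form) < 1e-6
```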
Problem 6 Let $f(x) = 1/3$ for $-1 < x < 2$ and zero elsewhere be the pdf of a random variable $X$. Find the pdf and cdf of the random variable $Y = X^2$.

Solution The cdf of $X$ is given by $F_X(x) = \int_{-1}^{x} \frac{1}{3}\,du = \frac{1}{3}(x + 1)$ for $-1 < x < 2$.

$y < 0$: $F_Y(y) = 0$.

$y \in [0, 1)$:

$F_Y(y) = P(Y \le y) = P(X^2 \le y) = P(-\sqrt{y} \le X \le \sqrt{y}) = F_X(\sqrt{y}) - F_X(-\sqrt{y}) = \frac{1}{3}(\sqrt{y} + 1) - \frac{1}{3}(-\sqrt{y} + 1) = \frac{2}{3}\sqrt{y}.$

$y \in [1, 4)$:

$F_Y(y) = P(Y \le y) = P(X^2 \le y) \overset{\text{with } y \in [1,4),\ x > -1}{=} P(-1 \le X \le \sqrt{y}) = F_X(\sqrt{y}) - \underbrace{F_X(-1)}_{0} = \frac{1}{3}(\sqrt{y} + 1).$

Summary:

$F_Y(y) = \begin{cases} 0 & y < 0 \\ \frac{2}{3}\sqrt{y} & y \in [0, 1) \\ \frac{1}{3}(\sqrt{y} + 1) & y \in [1, 4) \\ 1 & y \in [4, \infty) \end{cases}; \qquad f_Y(y) = \begin{cases} \frac{1}{3\sqrt{y}} & y \in (0, 1) \\ \frac{1}{6\sqrt{y}} & y \in [1, 4] \\ 0 & \text{otherwise} \end{cases}$
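A Monte Carlo check of the derived cdf (a sketch, standard library only, with a fixed seed so the run is reproducible): draw $X$ uniform on $(-1, 2)$, set $Y = X^2$, and compare empirical probabilities with the piecewise formula.

```python
import math
import random

random.seed(42)
n = 200_000
ys = [random.uniform(-1.0, 2.0) ** 2 for _ in range(n)]   # samples of Y = X^2

def F_Y(y):
    # Piecewise cdf derived above.
    if y < 0:
        return 0.0
    if y < 1:
        return 2 * math.sqrt(y) / 3
    if y < 4:
        return (math.sqrt(y) + 1) / 3
    return 1.0

for y in [0.25, 0.5, 1.0, 2.0, 3.5]:
    empirical = sum(v <= y for v in ys) / n
    assert abs(empirical - F_Y(y)) < 0.01     # MC standard error ~ 0.001
```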
Problem 7 Suppose that the random variable $X$ has cdf

$F_X(x) = \begin{cases} 0 & \text{if } x < 0 \\ 1/2 & \text{if } x = 0 \\ (x + 1)/2 & \text{if } 0 < x \le 1 \\ 1 & \text{if } x > 1 \end{cases}$

Is $X$ a discrete or continuous variable? Calculate $E[X]$ and $Var(X)$.

Solution Note that the cdf is a mixture of a continuous and a step function, thus $X$ is neither a continuous nor a discrete r.v. The "pdf" (a point mass at 0 plus a density) is given by

$f_X(x) = \begin{cases} 1/2 & x = 0 \\ 1/2 & 0 < x \le 1 \\ 0 & \text{otherwise} \end{cases}$

$E(X) = \frac{1}{2} \cdot 0 + \int_{0}^{1} x \cdot \frac{1}{2}\,dx = 1/4; \qquad Var(X) = \frac{1}{2} \cdot 0 + \int_{0}^{1} x^2 \cdot \frac{1}{2}\,dx - (1/4)^2 = \frac{1}{6} - \frac{1}{16} = 5/48.$
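The mixed moments can be verified exactly: the point mass contributes $\frac{1}{2} \cdot 0^k$ and the continuous part contributes $\int_0^1 x^k \frac{1}{2}\,dx = \frac{1}{2(k+1)}$ (a sketch using exact rational arithmetic, standard library only):

```python
from fractions import Fraction as F

def moment(k):
    # k-th moment of the mixed distribution: mass 1/2 at 0 plus density 1/2 on (0, 1].
    point_mass_part = F(1, 2) * 0 ** k if k > 0 else F(1, 2)
    continuous_part = F(1, 2 * (k + 1))     # = integral of x^k * (1/2) over (0, 1]
    return point_mass_part + continuous_part

assert moment(0) == 1                       # total probability
E = moment(1)
Var = moment(2) - E ** 2
assert E == F(1, 4)
assert Var == F(5, 48)
```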
Problem 8 Consider the pdf defined by $f_X(x) = \frac{2}{x^3}$, $x > 1$, and zero elsewhere. Show: (a) it's correctly defined, (b) its expected value, (c) its variance.

Solution
a). $f_X(x) \ge 0$, and $\int_{1}^{\infty} \frac{2}{x^3}\,dx = \left[-\frac{1}{x^2}\right]_1^{\infty} = 1$.
b). $E(X) = \int_{1}^{\infty} x \cdot \frac{2}{x^3}\,dx = \left[-\frac{2}{x}\right]_1^{\infty} = 2$.
c). $E(X^2) = \int_{1}^{\infty} x^2 \cdot \frac{2}{x^3}\,dx = \int_{1}^{\infty} \frac{2}{x}\,dx = \infty$, so the second moment diverges and $Var(X)$ does not exist (is infinite).
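The divergence of the second moment in part (c) can be illustrated numerically: the truncated integral $\int_1^M x^2 \cdot \frac{2}{x^3}\,dx = 2 \ln M$ grows without bound as $M \to \infty$ (a minimal sketch, standard library only):

```python
import math

def truncated_second_moment(M):
    # Closed form of the truncated integral: int_1^M x^2 * (2/x^3) dx = 2 ln M.
    return 2 * math.log(M)

vals = [truncated_second_moment(M) for M in (10, 1e3, 1e6, 1e12)]
assert all(b > a for a, b in zip(vals, vals[1:]))    # strictly increasing in M
assert truncated_second_moment(1e12) > 50            # eventually exceeds any fixed bound
```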
Problem 9 Let $X$ and $Y$ be independent random variables with means $\mu_X, \mu_Y$ and variances $\sigma_X^2, \sigma_Y^2$. Find an expression for the correlation of $XY$ and $Y$ in terms of these means and variances.

Solution

$corr(XY, Y) = \frac{cov(XY, Y)}{\sigma_{XY}\,\sigma_Y} = \frac{E(XY \cdot Y) - E(XY)E(Y)}{\sigma_{XY}\,\sigma_Y} \overset{(X, Y \text{ independent})}{=} \frac{E(X)E(Y^2) - E(X)(E(Y))^2}{\sigma_{XY}\,\sigma_Y} = \frac{\mu_X \sigma_Y^2}{\sigma_{XY}\,\sigma_Y}.$

To get the correlation of $XY$ and $Y$, we need to calculate $E(Y^2)$, $E(X^2)$ and $\sigma_{XY}$:

$E(Y^2) = Var\,Y + (EY)^2 = \sigma_Y^2 + \mu_Y^2; \qquad E(X^2) = Var\,X + (EX)^2 = \sigma_X^2 + \mu_X^2;$

$\sigma_{XY} = \sqrt{Var(XY)} = \sqrt{E\big((XY)^2\big) - (E(XY))^2}$
$= \sqrt{E(X^2 Y^2) - (EX)^2 (EY)^2}$
$\overset{(X, Y \text{ independent})}{=} \sqrt{E(X^2)E(Y^2) - (EX)^2 (EY)^2}$
$= \sqrt{(\sigma_X^2 + \mu_X^2)(\sigma_Y^2 + \mu_Y^2) - \mu_X^2 \mu_Y^2}$
$= \sqrt{\sigma_X^2 \sigma_Y^2 + \sigma_X^2 \mu_Y^2 + \sigma_Y^2 \mu_X^2}.$

Therefore,

$corr(XY, Y) = \frac{\mu_X \sigma_Y^2}{\sigma_Y \sqrt{\sigma_X^2 \sigma_Y^2 + \sigma_X^2 \mu_Y^2 + \mu_X^2 \sigma_Y^2}} = \frac{\mu_X \sigma_Y}{\sqrt{\sigma_X^2 \sigma_Y^2 + \sigma_X^2 \mu_Y^2 + \mu_X^2 \sigma_Y^2}}.$
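A Monte Carlo check of the closed-form correlation (a sketch with illustrative normal parameters and a fixed seed, standard library only):

```python
import math
import random

random.seed(7)
mu_x, s_x, mu_y, s_y = 1.0, 0.5, 2.0, 1.0     # independent X ~ N(1, 0.5^2), Y ~ N(2, 1)
n = 200_000
xs = [random.gauss(mu_x, s_x) for _ in range(n)]
ys = [random.gauss(mu_y, s_y) for _ in range(n)]
xy = [a * b for a, b in zip(xs, ys)]

def corr(u, v):
    # Sample correlation coefficient.
    mu, mv = sum(u) / n, sum(v) / n
    cov = sum((a - mu) * (b - mv) for a, b in zip(u, v)) / n
    su = math.sqrt(sum((a - mu) ** 2 for a in u) / n)
    sv = math.sqrt(sum((b - mv) ** 2 for b in v) / n)
    return cov / (su * sv)

# Closed form: mu_X * s_Y / sqrt(s_X^2 s_Y^2 + s_X^2 mu_Y^2 + mu_X^2 s_Y^2) = 2/3 here.
closed_form = (mu_x * s_y) / math.sqrt(s_x**2 * s_y**2 + s_x**2 * mu_y**2 + mu_x**2 * s_y**2)
assert abs(corr(xy, ys) - closed_form) < 0.02
```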
Problem 10 A median of a distribution is a value $m$ such that $P(X \le m) \ge \frac{1}{2}$ and $P(X \ge m) \ge \frac{1}{2}$. If $X$ is continuous, $m$ satisfies

$\int_{-\infty}^{m} f(x)\,dx = \int_{m}^{\infty} f(x)\,dx = \frac{1}{2}.$

Show that $\min_a E|X - a| = E|X - m|$.

Solution

$E|X - a| = \int_{-\infty}^{\infty} |x - a|\, f(x)\,dx = \int_{-\infty}^{a} (a - x) f(x)\,dx + \int_{a}^{\infty} (x - a) f(x)\,dx.$

Applying the FOC we get

$\frac{d}{da} E|X - a| = \int_{-\infty}^{a} f(x)\,dx - \int_{a}^{\infty} f(x)\,dx = 0 \;\Rightarrow\; \int_{-\infty}^{a} f(x)\,dx = \frac{1}{2} \;\Rightarrow\; a = m.$

This can also be confirmed by the SOC:

$\frac{d^2}{da^2} E|X - a| = f(a) + f(a) = 2f(a) \ge 0.$
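As a numerical illustration of the result (a sketch, standard library only), take $X \sim \text{Exponential}(1)$, whose median is $\ln 2$: the value $E|X - a|$, approximated by trapezoidal integration, is smallest at $a = \ln 2$ among candidate values of $a$.

```python
import math

f = lambda x: math.exp(-x)                   # Exp(1) pdf on [0, infinity)

def expected_abs_dev(a, upper=40.0, n=20_000):
    # Trapezoidal approximation of E|X - a| = int_0^inf |x - a| e^{-x} dx.
    h = upper / n
    g = lambda x: abs(x - a) * f(x)
    return h * (0.5 * (g(0.0) + g(upper)) + sum(g(i * h) for i in range(1, n)))

m = math.log(2)                              # median of Exp(1)
for a in [0.2, 0.5, 0.9, 1.3]:
    assert expected_abs_dev(m) <= expected_abs_dev(a) + 1e-6
```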
Problem 11 Show that if $(X, Y) \sim$ Bivariate normal$(\mu_X, \sigma_X^2, \mu_Y, \sigma_Y^2, \rho)$, then the following are true:

(1) The marginal distribution of $X$ is $N(\mu_X, \sigma_X^2)$, and the marginal distribution of $Y$ is $N(\mu_Y, \sigma_Y^2)$;
(2) The conditional distribution of $Y$ given $X = x$ is $N\left(\mu_Y + \rho \frac{\sigma_Y}{\sigma_X}(x - \mu_X),\; \sigma_Y^2 (1 - \rho^2)\right)$.

Solution The joint pdf for the bivariate normal is:

$f_{XY}(x, y) = \frac{1}{2\pi \sigma_X \sigma_Y \sqrt{1 - \rho^2}} \exp\left\{ -\frac{1}{2(1 - \rho^2)} \left[ \left(\frac{x - \mu_X}{\sigma_X}\right)^2 - 2\rho \left(\frac{x - \mu_X}{\sigma_X}\right)\left(\frac{y - \mu_Y}{\sigma_Y}\right) + \left(\frac{y - \mu_Y}{\sigma_Y}\right)^2 \right] \right\}.$

(1) $f_X(x) = \int_{-\infty}^{\infty} f_{XY}(x, y)\,dy$. Set $\omega = \frac{x - \mu_X}{\sigma_X}$ and $z = \frac{y - \mu_Y}{\sigma_Y}$, so $dy = \sigma_Y\,dz$:

$f_X(x) = \frac{1}{2\pi \sigma_X \sqrt{1 - \rho^2}} \int_{-\infty}^{\infty} \exp\left\{ -\frac{1}{2(1 - \rho^2)} \left[ \omega^2 - 2\rho\omega z + z^2 \right] \right\} dz.$

Completing the square, $\omega^2 - 2\rho\omega z + z^2 = \left[ z^2 - 2\rho\omega z + (\rho\omega)^2 \right] + \omega^2 (1 - \rho^2)$, hence

$f_X(x) = \frac{1}{\sigma_X \sqrt{2\pi}} \exp\left( -\frac{\omega^2}{2} \right) \underbrace{\int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}\sqrt{1 - \rho^2}} \exp\left\{ -\frac{(z - \rho\omega)^2}{2(1 - \rho^2)} \right\} dz}_{=\,1 \text{ (pdf of a } N(\rho\omega,\, 1 - \rho^2) \text{ r.v.)}}$

$= \frac{1}{\sigma_X \sqrt{2\pi}} \exp\left[ -\frac{1}{2} \left(\frac{x - \mu_X}{\sigma_X}\right)^2 \right],$

which is the pdf of $N(\mu_X, \sigma_X^2)$. The marginal of $Y$ follows by symmetry.

(2)

$f_{Y \mid X}(y \mid x) = \frac{f_{XY}(x, y)}{f_X(x)} = \frac{ \frac{1}{2\pi \sigma_X \sigma_Y \sqrt{1 - \rho^2}} \exp\left\{ -\frac{1}{2(1 - \rho^2)} \left[ \left(\frac{x - \mu_X}{\sigma_X}\right)^2 - 2\rho \left(\frac{x - \mu_X}{\sigma_X}\right)\left(\frac{y - \mu_Y}{\sigma_Y}\right) + \left(\frac{y - \mu_Y}{\sigma_Y}\right)^2 \right] \right\} }{ \frac{1}{\sigma_X \sqrt{2\pi}} \exp\left[ -\frac{1}{2} \left(\frac{x - \mu_X}{\sigma_X}\right)^2 \right] }$

$= \frac{1}{\sigma_Y \sqrt{2\pi} \sqrt{1 - \rho^2}} \exp\left\{ -\frac{1}{2(1 - \rho^2)} \left[ \left(\frac{y - \mu_Y}{\sigma_Y}\right)^2 - 2\rho \left(\frac{x - \mu_X}{\sigma_X}\right)\left(\frac{y - \mu_Y}{\sigma_Y}\right) + \rho^2 \left(\frac{x - \mu_X}{\sigma_X}\right)^2 \right] \right\}$

$= \frac{1}{\sigma_Y \sqrt{2\pi} \sqrt{1 - \rho^2}} \exp\left\{ -\frac{1}{2(1 - \rho^2)} \left[ \frac{y - \mu_Y}{\sigma_Y} - \rho\, \frac{x - \mu_X}{\sigma_X} \right]^2 \right\}$

$= \frac{1}{\sigma_Y \sqrt{2\pi} \sqrt{1 - \rho^2}} \exp\left\{ -\frac{1}{2 \sigma_Y^2 (1 - \rho^2)} \left[ (y - \mu_Y) - \rho\, \frac{\sigma_Y}{\sigma_X} (x - \mu_X) \right]^2 \right\},$

which is the pdf of $N\left(\mu_Y + \rho \frac{\sigma_Y}{\sigma_X}(x - \mu_X),\; \sigma_Y^2 (1 - \rho^2)\right)$.
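Part (1) can be checked numerically: integrating the joint pdf over $y$ should reproduce the $N(\mu_X, \sigma_X^2)$ density at each $x$ (a sketch with illustrative parameters, standard library only):

```python
import math

mu_x, s_x, mu_y, s_y, rho = 0.0, 1.0, 1.0, 2.0, 0.5   # illustrative parameters

def joint(x, y):
    # Bivariate normal joint pdf.
    w = (x - mu_x) / s_x
    z = (y - mu_y) / s_y
    q = (w * w - 2 * rho * w * z + z * z) / (1 - rho ** 2)
    return math.exp(-q / 2) / (2 * math.pi * s_x * s_y * math.sqrt(1 - rho ** 2))

def marginal_numeric(x, a=-20.0, b=22.0, n=42_000):
    # Trapezoidal integration of the joint pdf over y.
    h = (b - a) / n
    return h * (0.5 * (joint(x, a) + joint(x, b))
                + sum(joint(x, a + i * h) for i in range(1, n)))

normal_pdf = lambda x, m, s: math.exp(-((x - m) / s) ** 2 / 2) / (s * math.sqrt(2 * math.pi))

for x in [-1.0, 0.0, 0.7, 2.0]:
    assert abs(marginal_numeric(x) - normal_pdf(x, mu_x, s_x)) < 1e-6
```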
Problem 12 Take the regression model $y = X\beta + \varepsilon$, where $y$ is $n \times 1$, $X$ is $n \times K$, $\beta$ is $K \times 1$, and assume that it fulfills the main assumptions of the linear regression model. Furthermore, assume that

$y \mid X \sim N(X\beta, \sigma^2 I_n).$

(a). Write the log likelihood function.
(b). Find the MLE estimators for $\beta$ and $\sigma^2$.
(c). Compute the Cramer-Rao lower bound and the Fisher Information matrix.
(d). Find the asymptotic distribution of the estimators.

Solution (a). The likelihood function is given by

$L(y \mid X; \beta, \sigma^2) = \left(2\pi\sigma^2\right)^{-n/2} \exp\left[ -\frac{1}{2\sigma^2} (y - X\beta)'(y - X\beta) \right].$

The log likelihood function is then

$\ln L = -\frac{n}{2} \ln 2\pi - \frac{n}{2} \ln \sigma^2 - \frac{1}{2\sigma^2} \underbrace{(y - X\beta)'(y - X\beta)}_{y'y\, -\, 2\beta'X'y\, +\, \beta'X'X\beta}.$

(b) The score vector is defined by the FOC. Let

$\frac{\partial \ln L}{\partial \beta} = -\frac{1}{2\sigma^2} \left( -2X'y + 2X'X\beta \right) = 0 \;\Rightarrow\; \hat{\beta}_{MLE} = (X'X)^{-1} X'y.$

Replace this result in the FOC for $\sigma^2$:

$\frac{\partial \ln L}{\partial \sigma^2} = -\frac{n}{2\sigma^2} + \frac{1}{2\sigma^4} (y - X\hat{\beta}_{MLE})'\underbrace{(y - X\hat{\beta}_{MLE})}_{e} = 0 \;\Rightarrow\; \hat{\sigma}^2_{MLE} = \frac{1}{n} e'e.$

Note that the MLE estimator for $\beta$ is the OLS estimator; however, $\hat{\sigma}^2_{MLE}$ is different from the OLS estimator (check its small sample properties; recall $\hat{\sigma}^2_{OLS} = \frac{1}{n - K} e'e$). The SOC are:

$\frac{\partial^2 \ln L}{\partial \beta\, \partial \beta'} = -\frac{1}{\sigma^2} X'X; \qquad \frac{\partial^2 \ln L}{\partial (\sigma^2)^2} = \frac{n}{2\sigma^4} - \frac{1}{\sigma^6}\, \varepsilon'\varepsilon; \qquad \frac{\partial^2 \ln L}{\partial \beta\, \partial \sigma^2} = -\frac{1}{\sigma^4} X'\varepsilon.$

(c). Thus, the Fisher Information Matrix is given by

$I(\beta, \sigma^2) = -E \begin{bmatrix} -\frac{1}{\sigma^2} X'X & -\frac{1}{\sigma^4} X'\varepsilon \\ -\frac{1}{\sigma^4} \varepsilon'X & \frac{n}{2\sigma^4} - \frac{1}{\sigma^6}\, \varepsilon'\varepsilon \end{bmatrix} = \begin{bmatrix} \frac{1}{\sigma^2} X'X & 0 \\ 0 & \frac{n}{2\sigma^4} \end{bmatrix}.$

The Cramér-Rao lower bound is then

$I(\beta, \sigma^2)^{-1} = \begin{bmatrix} \sigma^2 (X'X)^{-1} & 0 \\ 0 & \frac{2\sigma^4}{n} \end{bmatrix}.$

(d). The asymptotic distribution is given by

$\begin{pmatrix} \hat{\beta}_{MLE} - \beta \\ \hat{\sigma}^2_{MLE} - \sigma^2 \end{pmatrix} \overset{d}{\to} N\left( 0,\; \begin{bmatrix} \sigma^2 (X'X)^{-1} & 0 \\ 0 & \frac{2\sigma^4}{n} \end{bmatrix} \right).$
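A tiny worked example of part (b) (a sketch with made-up data, standard library only, $K = 2$: intercept and one regressor): the MLE/OLS $\hat{\beta}$ must zero the score $X'e$, and $\hat{\sigma}^2_{MLE}$ divides $e'e$ by $n$, not $n - K$.

```python
# Illustrative data for a simple regression y = a + b*x + error.
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 3.9, 6.2, 8.1, 9.8]
n = len(xs)

# Closed-form OLS (= MLE) estimates for intercept and slope.
mx, my = sum(xs) / n, sum(ys) / n
slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
intercept = my - slope * mx

e = [y - (intercept + slope * x) for x, y in zip(xs, ys)]   # residuals
sigma2_mle = sum(r * r for r in e) / n                      # e'e / n, not e'e / (n - K)

# Score / FOC at beta-hat: X'e = 0 for both columns of X (the ones column and xs).
assert abs(sum(e)) < 1e-9
assert abs(sum(x * r for x, r in zip(xs, e))) < 1e-9
assert sigma2_mle >= 0
```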
Problem 13 Let $X_1, \dots, X_n$ represent a random sample from a normal distribution with mean $\mu$ and variance 1.
(a). Show that $\hat{\mu}_{MLE} = \bar{X}_n = \frac{1}{n}\sum_{i=1}^{n} X_i$ is the MLE estimator and show that the variance of $\sqrt{n}(\bar{X}_n - \mu)$ is equal to $I(\mu)^{-1} = 1$.
(b). Calculate the probability of the event $\hat{\mu}_{MLE} - 1.96\sqrt{I(\mu)^{-1}}/\sqrt{n} < \mu < \hat{\mu}_{MLE} + 1.96\sqrt{I(\mu)^{-1}}/\sqrt{n}$.

Solution Let $X_1, \dots, X_n$ be sampled from $N(\mu, 1)$, with $\theta = \mu$.

a) The log likelihood is

$\ln L(\mu \mid X) = -\frac{n}{2} \ln 2\pi - \frac{1}{2} \sum_{i=1}^{n} (X_i - \mu)^2 \;\Rightarrow\; \hat{\mu} = \frac{1}{n} \sum_{i=1}^{n} X_i.$

We know that $\sqrt{n}(\hat{\mu} - \mu) \overset{d}{\to} N(0, I(\mu)^{-1})$, and

$var\left(\sqrt{n}(\hat{\mu} - \mu)\right) = n \cdot var(\hat{\mu}) = n \cdot var\left(\frac{1}{n} \sum_{i=1}^{n} X_i\right) = \frac{n}{n^2} \sum_{i=1}^{n} var(X_i) = \frac{n \cdot n \cdot 1}{n^2} \cdot \frac{1}{n} \cdot n = 1.$

b)

$P\left(\hat{\mu} - \frac{1.96}{\sqrt{n}} < \mu < \hat{\mu} + \frac{1.96}{\sqrt{n}}\right) = P\left(-1.96 < \sqrt{n}(\hat{\mu} - \mu) < 1.96\right) = 2\Phi(1.96) - 1 = 0.95.$
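The coverage probability in part (b) can be evaluated from the standard normal cdf, which is available through the error function: $\Phi(z) = \frac{1}{2}\left(1 + \operatorname{erf}(z/\sqrt{2})\right)$ (a minimal sketch, standard library only):

```python
import math

# Standard normal cdf via the error function.
Phi = lambda z: 0.5 * (1.0 + math.erf(z / math.sqrt(2)))

coverage = 2 * Phi(1.96) - 1    # P(-1.96 < Z < 1.96)
assert abs(coverage - 0.95) < 1e-3
```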
Problem 14 Let $X_1, \dots, X_n$ represent a random sample from $N(\theta, \theta^2)$. Derive the ML estimator $\hat{\theta}$ and prove its consistency.

Solution

$\ell_n = const - n \ln|\theta| - \frac{1}{2\theta^2} \sum_{i=1}^{n} (x_i - \theta)^2 = const - n \ln|\theta| - \frac{1}{2\theta^2} \sum_{i=1}^{n} x_i^2 + \frac{1}{\theta} \sum_{i=1}^{n} x_i - \frac{n}{2}.$

$FOC: \quad \frac{\partial \ell_n}{\partial \theta} = -\frac{n}{\theta} + \frac{1}{\theta^3} \sum_{i=1}^{n} x_i^2 - \frac{1}{\theta^2} \sum_{i=1}^{n} x_i = 0.$

Multiplying by $-\theta^3/n$ and writing $\bar{x} = \frac{1}{n}\sum_{i=1}^{n} x_i$, $\overline{x^2} = \frac{1}{n}\sum_{i=1}^{n} x_i^2$:

$\theta^2 + \bar{x}\,\theta - \overline{x^2} = 0.$

The equation has two solutions:

$\theta_1 = \frac{1}{2}\left( -\bar{x} + \sqrt{\bar{x}^2 + 4\,\overline{x^2}} \right) > 0 \qquad \text{and} \qquad \theta_2 = \frac{1}{2}\left( -\bar{x} - \sqrt{\bar{x}^2 + 4\,\overline{x^2}} \right) < 0.$

Note that $\ell_n$ is a symmetric function of $\theta$ except for the term $\frac{1}{\theta} \sum_{i=1}^{n} x_i$. This term determines the solution: if $\bar{x} > 0$ then the global maximum of $\ell_n$ will be at $\theta_1$, otherwise at $\theta_2$. So,

$\hat{\theta}_{ML} = \frac{1}{2}\left( -\bar{x} + sgn(\bar{x}) \sqrt{\bar{x}^2 + 4\,\overline{x^2}} \right).$

This estimator is consistent because, if $\theta \neq 0$, by the law of large numbers $\bar{x} \overset{p}{\to} Ex = \theta$, $\overline{x^2} \overset{p}{\to} Ex^2 = \theta^2 + \theta^2 = 2\theta^2$, and $sgn(\bar{x}) \overset{p}{\to} sgn(\theta)$, so

$\hat{\theta}_{ML} \overset{p}{\to} \frac{1}{2}\left( -\theta + sgn(\theta) \sqrt{\theta^2 + 8\theta^2} \right) = \frac{1}{2}\left( -\theta + 3\,sgn(\theta)\,|\theta| \right) = \frac{1}{2}\left( -\theta + 3\theta \right) = \theta.$
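The estimator from the FOC quadratic $\theta^2 + \bar{x}\theta - \overline{x^2} = 0$ can be checked on a small fixed sample (a sketch with illustrative data, standard library only):

```python
import math

sample = [1.0, 2.0, 0.5, 1.5]               # illustrative sample with positive mean
n = len(sample)
xbar = sum(sample) / n                      # sample mean (1.25)
x2bar = sum(x * x for x in sample) / n      # second sample moment (1.875)

# Pick the root of theta^2 + xbar*theta - x2bar = 0 with the sign of xbar.
sgn = 1.0 if xbar > 0 else -1.0
theta_hat = 0.5 * (-xbar + sgn * math.sqrt(xbar ** 2 + 4 * x2bar))

assert abs(theta_hat ** 2 + xbar * theta_hat - x2bar) < 1e-12   # solves the FOC quadratic
assert theta_hat > 0                                            # shares the sign of xbar
```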
Problem 15 $Y$ is a random variable that denotes the number of dots obtained when a fair six-sided die is rolled. Let:

$X = \begin{cases} Y, & \text{if } Y \text{ is even} \\ 0, & \text{otherwise} \end{cases}$

(i) Find the joint distribution of $(X, Y)$.
(ii) Find the best predictor of $Y \mid X$.
(iii) Find the Best Linear Predictor of $Y$ conditional on $X$.
(iv) Calculate the Mean Squared Prediction Errors for cases (ii) and (iii).

Solution (i) The joint distribution is

$(X, Y) = \begin{cases} (0, 1) & \text{with prob } 1/6 \\ (2, 2) & \text{with prob } 1/6 \\ (0, 3) & \text{with prob } 1/6 \\ (4, 4) & \text{with prob } 1/6 \\ (0, 5) & \text{with prob } 1/6 \\ (6, 6) & \text{with prob } 1/6 \end{cases}$

(ii) The Best Predictor is just the conditional expectation:

$E[Y \mid X] = \begin{cases} 3 & X = 0 \\ 2 & X = 2 \\ 4 & X = 4 \\ 6 & X = 6 \end{cases}$

(iii) The Best Linear Predictor is the OLS estimator: $BLP[Y \mid X] = \alpha + \beta X$, with

$\beta = \frac{Cov(X, Y)}{Var(X)} = \frac{E[XY] - E[X]E[Y]}{Var(X)}, \qquad \alpha = E[Y] - \beta E[X].$

Here

$E[X] = 2; \quad E[Y] = \frac{7}{2}; \quad E[XY] = 0 \cdot \tfrac{1}{6} + 4 \cdot \tfrac{1}{6} + 0 \cdot \tfrac{1}{6} + 16 \cdot \tfrac{1}{6} + 0 \cdot \tfrac{1}{6} + 36 \cdot \tfrac{1}{6} = \frac{28}{3};$

$Var(X) = E[X^2] - (E[X])^2 = \frac{28}{3} - 4 = \frac{16}{3}.$

$\Rightarrow \beta = \frac{E[XY] - E[X]E[Y]}{Var(X)} = \frac{\frac{28}{3} - 2 \cdot \frac{7}{2}}{\frac{16}{3}} = \frac{7}{16}; \qquad \alpha = \frac{7}{2} - \frac{7}{16} \cdot 2 = \frac{21}{8}.$

$BLP[Y \mid X] = \frac{21}{8} + \frac{7}{16} X.$

(iv) Mean Squared Prediction Error for case (ii): the errors $Y - E[Y \mid X]$ are

$Errors_{ii} = \begin{cases} -2 & \text{with prob } 1/6 \\ 0 & \text{with prob } 2/3 \\ 2 & \text{with prob } 1/6 \end{cases}$

$\Rightarrow MSE_{ii} = E[Errors_{ii}^2] = 4 \cdot \frac{1}{6} + 0 \cdot \frac{2}{3} + 4 \cdot \frac{1}{6} = \frac{4}{3}.$

Mean Squared Prediction Error for case (iii): the errors $Y - BLP[Y \mid X]$ are

$Errors_{iii} = \begin{cases} -\frac{13}{8} & \text{with prob } 1/6 \\ -\frac{12}{8} & \text{with prob } 1/6 \\ \frac{3}{8} & \text{with prob } 1/6 \\ -\frac{3}{8} & \text{with prob } 1/6 \\ \frac{19}{8} & \text{with prob } 1/6 \\ \frac{6}{8} & \text{with prob } 1/6 \end{cases}$

$\Rightarrow MSE_{iii} = E[Errors_{iii}^2] = \left[ \left(\tfrac{13}{8}\right)^2 + \left(\tfrac{12}{8}\right)^2 + \left(\tfrac{3}{8}\right)^2 + \left(\tfrac{3}{8}\right)^2 + \left(\tfrac{19}{8}\right)^2 + \left(\tfrac{6}{8}\right)^2 \right] \cdot \frac{1}{6} = \frac{91}{48} \approx 1.9.$
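All of the quantities above can be recomputed exactly by enumerating the six equally likely die outcomes (a sketch using exact rational arithmetic, standard library only):

```python
from fractions import Fraction as F

# The six equally likely (X, Y) pairs: X = Y if Y is even, else 0.
outcomes = [(y if y % 2 == 0 else 0, y) for y in range(1, 7)]
p = F(1, 6)

E = lambda f: sum(p * f(x, y) for x, y in outcomes)   # expectation of f(X, Y)
EX, EY = E(lambda x, y: x), E(lambda x, y: y)

# Best Linear Predictor coefficients.
beta = (E(lambda x, y: x * y) - EX * EY) / (E(lambda x, y: x * x) - EX ** 2)
alpha = EY - beta * EX
assert (alpha, beta) == (F(21, 8), F(7, 16))

def cond_mean(x0):
    # Best predictor E[Y | X = x0]: average Y over outcomes sharing that X.
    ys = [y for x, y in outcomes if x == x0]
    return F(sum(ys), len(ys))

mse_bp = E(lambda x, y: (y - cond_mean(x)) ** 2)
mse_blp = E(lambda x, y: (y - (alpha + beta * x)) ** 2)
assert mse_bp == F(4, 3)
assert mse_blp == F(91, 48)       # ~1.9
assert mse_bp <= mse_blp          # the BP is at least as good as the BLP
```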