Rjv2119
Homework 1
1.1 a)
We construct a function that we want to minimize in terms of c, of the form:
f(c) = E[(X − c)²]
= E[X² − 2cX + c²] = E[X²] − 2cE[X] + c²
We differentiate with respect to c and set the derivative equal to 0:
df/dc = −2E[X] + 2c = 0
From which we obtain that the value of c that minimizes the function is:
c = E[X]
We can also check the second derivative to confirm it is indeed a minimum:
d²f/dc² = 2 > 0
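This minimization can also be checked numerically. A minimal sketch in Python (names and sample values are ours), replacing the expectation with an average over a fixed sample, so that the minimizer becomes the sample mean:

```python
# Empirical version of f(c) = E[(X - c)^2], with E replaced by a sample average.
x = [2.0, 3.5, 1.0, 4.0, 2.5]

def mse(c):
    return sum((xi - c) ** 2 for xi in x) / len(x)

mean = sum(x) / len(x)  # plays the role of E[X]
# f is a parabola in c, so any other c must do strictly worse.
assert mse(mean) < mse(mean + 0.1)
assert mse(mean) < mse(mean - 0.1)
```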
1.1 b)
We construct a function that we want to minimize in terms of g(Y), of the form:
f(g(Y)) = E[(X − g(Y))² | Y]
= E[X² − 2Xg(Y) + g(Y)² | Y] = E[X² | Y] − 2g(Y)E[X | Y] + g(Y)²
We differentiate with respect to g(Y) and set the derivative equal to 0:
∂f/∂g(Y) = −2E[X | Y] + 2g(Y) = 0
From which we obtain that the value of g(Y) that minimizes the function is:
g(Y) = E[X | Y]
We can also check the second derivative to confirm it is indeed a minimum:
∂²f/∂g(Y)² = 2 > 0
1.1 c)
We now want to minimize, in terms of the function g(·), the unconditional risk:
f(g) = E[(X − g(Y))²] = E[ E[(X − g(Y))² | Y] ] = Σ_y E[(X − g(Y))² | Y = y] p(y)
Using what was demonstrated in the previous part, each conditional term E[(X − g(Y))² | Y = y] is minimized when:
g(y) = E[X | Y = y]
Knowing that p(y) is always nonnegative (with range [0, 1]), minimizing each conditional term pointwise also minimizes the weighted sum, so g(Y) = E[X | Y] minimizes:
E[(X − g(Y))²]
1.2 a)
We construct a function that we want to minimize in terms of g(X_1, X_2, …, X_n), of the form:
f(g(X_1, …, X_n)) = E[(X_{n+1} − g(X_1, …, X_n))² | X_1, …, X_n]
= E[X_{n+1}² | X_1, …, X_n] − 2g(X_1, …, X_n)E[X_{n+1} | X_1, …, X_n] + g(X_1, …, X_n)²
We differentiate with respect to g(X_1, …, X_n) and set the derivative equal to 0:
∂f/∂g = −2E[X_{n+1} | X_1, …, X_n] + 2g(X_1, …, X_n) = 0
From which we obtain that the value of g(X_1, …, X_n) that minimizes the function is:
g(X_1, …, X_n) = E[X_{n+1} | X_1, …, X_n]
We can also check the second derivative to confirm it is indeed a minimum:
∂²f/∂g² = 2 > 0
1.2 b)
We construct a function that we want to minimize in terms of g(X_1, X_2, …, X_n), of the form:
f(g) = E[(X_{n+1} − g(X_1, …, X_n))²]
= E[ E[(X_{n+1} − g(X_1, …, X_n))² | X_1, …, X_n] ]
= ∫⋯∫ E[(X_{n+1} − g(x_1, …, x_n))² | X_1 = x_1, …, X_n = x_n] f(x_1, …, x_n) dx_1 ⋯ dx_n
By part a), the inner conditional expectation is minimized pointwise by g(x_1, …, x_n) = E[X_{n+1} | X_1 = x_1, …, X_n = x_n], and since the joint density f(x_1, …, x_n) is nonnegative, this choice also minimizes f(g). We now restrict attention to linear estimators of the form μ̂ = Σ_{i=1}^{n} a_i X_i.
We want the estimator μ̂ = Σ_{i=1}^{n} a_i X_i to be unbiased, and therefore the following condition has to be met:
E[μ̂] = μ
E[μ̂] = E[ Σ_{i=1}^{n} a_i X_i ] = Σ_{i=1}^{n} a_i E[X_i] = μ Σ_{i=1}^{n} a_i = μ
⟹ Σ_{i=1}^{n} a_i = 1
Now we also want this estimator to have minimum variance, because by the bias–variance decomposition the MSE equals the squared bias plus the variance. Therefore, if we construct an estimator that is unbiased and has minimum variance, it is the best linear estimator. Using independence of the X_i:
Var(μ̂) = Var( Σ_{i=1}^{n} a_i X_i ) = Σ_{i=1}^{n} a_i² Var(X_i) = σ² Σ_{i=1}^{n} a_i²
So we know the estimator will have minimum variance when Σ_{i=1}^{n} a_i² is minimal. We therefore construct this minimization problem:
min Σ_{i=1}^{n} a_i²
s.t. Σ_{i=1}^{n} a_i = 1
L(a, λ) = Σ_{i=1}^{n} a_i² + λ( Σ_{i=1}^{n} a_i − 1 )
We differentiate this Lagrangian with respect to each a_i and λ, obtaining 2a_i + λ = 0 and Σ_{i=1}^{n} a_i = 1. All the a_i are therefore equal, and:
a_i = 1/n
μ̂ = (1/n) Σ_{i=1}^{n} X_i
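The Lagrange solution a_i = 1/n can be cross-checked numerically: among weight vectors that sum to 1, equal weights minimize Σ a_i², and hence the variance σ² Σ a_i². A small Python sketch (all names are ours):

```python
import random

random.seed(0)
n = 5
equal = [1.0 / n] * n  # the Lagrange solution a_i = 1/n

def sum_sq(a):
    return sum(ai * ai for ai in a)

# Random alternative weight vectors, normalized to sum to 1: each should give
# a sum of squares (hence a variance sigma^2 * sum a_i^2) at least 1/n.
for _ in range(1000):
    raw = [random.random() + 1e-9 for _ in range(n)]
    a = [r / sum(raw) for r in raw]
    assert sum_sq(a) >= sum_sq(equal) - 1e-12
```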
1.2 e)
In a previous part we showed that
g(X_1, …, X_n) = E[X_{n+1} | X_1, …, X_n] = E[X_{n+1}] = μ
is the value that minimizes the MSE, i.e., the expected squared difference between X_{n+1} and the predictor (the last equality uses the iid assumption). In a previous part we also obtained the estimator for μ:
μ̂ = (1/n) Σ_{i=1}^{n} X_i
which is the best linear unbiased estimator for μ under iid conditions. We can therefore predict X_{n+1} by X̄ = (1/n) Σ_{i=1}^{n} X_i. Since X_{n+1} is independent of X_1, …, X_n, the resulting MSE is
E[(X_{n+1} − X̄)²] = Var(X_{n+1}) + Var(X̄) = σ² + σ²/n = σ²(1 + 1/n)
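As a numerical sanity check (a seeded Monte Carlo sketch in Python, with σ = 1 and n = 10, names ours), the empirical mean squared prediction error should be close to σ²(1 + 1/n) = 1.1:

```python
import random

random.seed(1)
n, trials = 10, 200_000
total = 0.0
for _ in range(trials):
    xs = [random.gauss(0.0, 1.0) for _ in range(n + 1)]  # iid N(0,1), sigma = 1
    xbar = sum(xs[:n]) / n           # estimator from the first n observations
    total += (xs[n] - xbar) ** 2     # squared error predicting X_{n+1} by Xbar
mse = total / trials
assert abs(mse - 1.1) < 0.03         # theory: sigma^2 * (1 + 1/n) = 1.1
```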
1.3)
From the strict stationarity definition we select n = 1 and shift h, and obtain that X_1 and X_{1+h} have the same distribution. We can therefore see that for any h:
E[X_1] = E[X_{1+h}]
Therefore the mean is independent of time.
We now select n = 2 and, for any t, shift by t − 1, obtaining that (X_t, X_{t+h}) and (X_1, X_{1+h}) have the same distribution. We can therefore see that:
Cov(X_t, X_{t+h}) = Cov(X_1, X_{1+h})
And therefore the covariance is independent of t.
Proving these two conditions (together with the finiteness of the second moments) proves that the process is weakly stationary as well.
1.4 a)
It is easy to obtain the expectation of this process:
E[X_t] = a + bE[Z_t] + cE[Z_{t−2}] = a
For the covariance we do:
γ(h) = Cov(a + bZ_t + cZ_{t−2}, a + bZ_{t+h} + cZ_{t+h−2})
= b²Cov(Z_t, Z_{t+h}) + bcCov(Z_t, Z_{t+h−2}) + bcCov(Z_{t−2}, Z_{t+h}) + c²Cov(Z_{t−2}, Z_{t+h−2})
γ(h) = σ²(b² + c²),   h = 0
     = σ²bc,          |h| = 2
     = 0,             otherwise
We can see that both the expected value and the covariance do not depend on t and therefore the process is weakly stationary.
1.4 b)
It is easy to obtain the expectation of this process:
E[X_t] = E[Z_t] cos(ct) + E[Z_{t−1}] sin(ct) = 0
For the covariance (writing γ(h) for Cov(X_t, X_{t−h})):
γ(h) = σ²,                       h = 0
     = σ² sin(ct) cos(c(t−1)),   h = 1
     = σ² cos(ct) sin(c(t+1)),   h = −1
     = 0,                        otherwise
The expected value does not depend on t, but the covariances at h = ±1 do depend on t (unless c is an integer multiple of π, which makes the sine factors vanish for integer t), so the process is not weakly stationary in general.
1.4 e)
It is easy to obtain the expectation of this process:
E[X_t] = E[Z_0] cos(ct) = 0
For the covariance we do:
γ(t, t−h) = Cov(Z_0 cos(ct), Z_0 cos(c(t−h))) = σ² cos(ct) cos(c(t−h))
We can see that if c = kπ for an integer k, both the expected value and the covariance do not depend on t (the covariance reduces to σ²(−1)^{kh}), and therefore the process is weakly stationary. If not, the covariance depends on t and the process is not weakly stationary.
1.4 f)
It is easy to obtain the expectation of this process:
E[X_t] = E[Z_t]E[Z_{t−1}] = 0
For the covariance we do:
γ(h) = Cov(Z_t Z_{t−1}, Z_{t−h} Z_{t−h−1})
γ(h) = σ⁴,   h = 0
     = 0,    otherwise
(for h = ±1 the cross term E[Z_t Z_{t−1}² Z_{t−2}] = 0, and for |h| ≥ 2 all factors are independent). We can see that both the expected value and the covariance do not depend on t and therefore the process is weakly stationary.
1.5 a)
For this process we have:
γ(h) = σ²(1 + θ²),   h = 0
     = σ²θ,          |h| = 2
     = 0,            otherwise
ρ(h) = γ(h)/γ(0) = 1,             h = 0
                 = θ/(1 + θ²),    |h| = 2
                 = 0,             otherwise
So we substitute θ = 0.8 and obtain:
ρ(h) = 1,       h = 0
     = 0.488,   |h| = 2
     = 0,       otherwise
1.5 b)
For this process, with X̄ = (1/4) Σ_{t=1}^{4} X_t, we have:
Var(X̄) = Var( (1/4) Σ_{t=1}^{4} X_t ) = (1/16) ( Σ_{t=1}^{4} Var(X_t) + Σ_{s≠t} Cov(X_s, X_t) ) = (1/16)(4γ(0) + 4γ(2))
= (1 + θ²)/4 + θ/4
So we substitute θ = 0.8 and obtain:
Var(X̄) = 0.61
1.5 c)
So we substitute θ = −0.8 and obtain:
Var(X̄) = 0.21
We can see that the negative covariance causes the variance of the mean to decrease.
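Both values follow directly from plugging the autocovariances γ(0) = σ²(1 + θ²) and γ(2) = σ²θ into Var(X̄) = (4γ(0) + 4γ(2))/16; a minimal Python check (σ² = 1, function name ours):

```python
# Var of the mean of X_1..X_4 from the autocovariances derived above (sigma^2 = 1).
def var_mean(theta):
    gamma0 = 1 + theta ** 2   # gamma(0)
    gamma2 = theta            # gamma(2)
    return (4 * gamma0 + 4 * gamma2) / 16

assert abs(var_mean(0.8) - 0.61) < 1e-9
assert abs(var_mean(-0.8) - 0.21) < 1e-9
```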
1.6 a)
For this process we have:
γ(h) = σ²/(1 − φ²),    h = 0
     = φ^{|h|} γ(0),   h ≠ 0
With X̄ = (1/4) Σ_{t=1}^{4} X_t:
Var(X̄) = Var( (1/4) Σ_{t=1}^{4} X_t ) = (1/16) ( Σ_{t=1}^{4} Var(X_t) + Σ_{s≠t} Cov(X_s, X_t) )
= (1/16)(4γ(0) + 6γ(1) + 4γ(2) + 2γ(3))
So we substitute φ = 0.9 and σ² = 1 and obtain:
Var(X̄) = 4.6375
1.6 b)
So we substitute φ = −0.9 and σ² = 1 and obtain:
Var(X̄) ≈ 0.126
We can see that the negative covariance causes the variance of the mean to decrease.
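The same arithmetic can be verified in Python with the AR(1) autocovariances γ(h) = σ²φ^{|h|}/(1 − φ²) (σ² = 1, names ours):

```python
# Var of the mean of X_1..X_4 for an AR(1) process, sigma^2 = 1.
def var_mean(phi):
    g = [phi ** h / (1 - phi ** 2) for h in range(4)]  # gamma(0)..gamma(3)
    return (4 * g[0] + 6 * g[1] + 4 * g[2] + 2 * g[3]) / 16

assert abs(var_mean(0.9) - 4.6375) < 1e-9
assert abs(var_mean(-0.9) - 0.126) < 1e-3
```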
1.7)
E[X_t + Y_t] = E[X_t] + E[Y_t] = μ_X + μ_Y
We can see that the expected value does not depend on time.
γ_{X+Y}(h) = Cov(X_{t+h} + Y_{t+h}, X_t + Y_t) = Cov(X_{t+h}, X_t) + Cov(Y_{t+h}, Y_t) = γ_X(h) + γ_Y(h)
(the cross-covariances vanish by independence). The covariance of the sum is just the sum of the two covariance functions and depends only on h, so {X_t + Y_t} is weakly stationary.
1.8 a)
Using the MGF, we know that if Z is N(0,1) then its moments are:
E[Z] = 0, E[Z²] = 1, E[Z³] = 0, E[Z⁴] = 3
Using this we obtain, for both branches of the definition (X_t = Z_t for t even, X_t = (Z_{t−1}² − 1)/√2 for t odd):
E[X_t] = 0, whether t is even or odd, since E[(Z_{t−1}² − 1)/√2] = (1 − 1)/√2 = 0
Var(X_t) = 1 in both cases, since Var((Z² − 1)/√2) = (E[Z⁴] − E[Z²]²)/2 = (3 − 1)/2 = 1
For the covariance at lag 1 (the only lag at which the two branches share a Z):
Cov(X_t, X_{t+1}) = (1/√2)(E[Z³] − E[Z]) = 0
and the covariance at all lags |h| ≥ 2 is 0 by independence. And therefore we have proven that this process is white noise WN(0,1). Now to prove that it is not iid we are going to use the result in the next part.
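A seeded Monte Carlo sketch in Python supports the WN(0,1) claim (assuming the branch convention X_t = Z_t for t even and X_t = (Z_{t−1}² − 1)/√2 for t odd; tolerances are loose to absorb simulation error):

```python
import math
import random

random.seed(2)
N = 400_000
z = [random.gauss(0.0, 1.0) for _ in range(N + 1)]
# X_t = Z_t for t even, (Z_{t-1}^2 - 1)/sqrt(2) for t odd.
x = [z[t] if t % 2 == 0 else (z[t - 1] ** 2 - 1) / math.sqrt(2)
     for t in range(1, N + 1)]
mean = sum(x) / N
var = sum(v * v for v in x) / N - mean ** 2
lag1 = sum(x[t] * x[t + 1] for t in range(N - 1)) / (N - 1)  # ~ Cov at lag 1
assert abs(mean) < 0.02 and abs(var - 1.0) < 0.05 and abs(lag1) < 0.05
```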
1.8 b)
If n is odd, then X_{n+1} = Z_{n+1} is independent of X_1, …, X_n, so
E[X_{n+1} | X_1, …, X_n] = E[Z_{n+1} | X_1, …, X_n] = E[Z_{n+1}] = 0
If n is even, then X_{n+1} = (Z_n² − 1)/√2 and X_n = Z_n, so
E[X_{n+1} | X_1, …, X_n] = E[(1/√2)(Z_n² − 1) | X_n = x_n] = (1/√2)(x_n² − 1) ≠ 0 in general.
Since the conditional mean of X_{n+1} depends on X_n, the variables are not independent and the process is not iid.
2
1.9 a)
We express the sample mean and, for h > 0, the sample autocovariance function as:
x̄ = (1/n) Σ_{t=1}^{n} x_t
γ̂(h) = (1/n) Σ_{t=1}^{n−h} (x_t − x̄)(x_{t+h} − x̄)
γ̂(0) = (1/n) Σ_{t=1}^{n} (x_t − x̄)²
so that the sample autocorrelation function is the ratio
ρ̂(h) = γ̂(h)/γ̂(0) = Σ_{t=1}^{n−h} (x_t − x̄)(x_{t+h} − x̄) / Σ_{t=1}^{n} (x_t − x̄)²
1.9 b)
To make the results simpler we select ω = π and n even. We can therefore see that:
x̄ = (1/n) Σ_{t=1}^{n} c cos(πt) = 0
γ̂(h) = (1/n) Σ_{t=1}^{n−h} c² cos(πt) cos(π(t+h))
Since cos(πt) cos(π(t+h)) = (−1)^{2t+h} = (−1)^h, for h odd:
γ̂(h) = (c²/n) Σ_{t=1}^{n−h} (−1) = −c²(n − h)/n
and for h even:
γ̂(h) = (c²/n) Σ_{t=1}^{n−h} (+1) = c²(n − h)/n
Now if h = 0 we have:
γ̂(0) = (c²/n) Σ_{t=1}^{n} (+1) = c²
Therefore:
ρ̂(h) = (n − h)/n,    h even
     = −(n − h)/n,    h odd
lim_{n→∞} ρ̂(h) = 1,    h even
               = −1,   h odd
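For finite n the exact value is ρ̂(h) = (−1)^h (1 − h/n), which a short Python sketch confirms (x_t = c cos(πt); all names are ours):

```python
import math

c, n = 2.0, 100                      # n even, as above
x = [c * math.cos(math.pi * t) for t in range(1, n + 1)]   # = c * (-1)^t
xbar = sum(x) / n                    # equals 0 for even n

def rho_hat(h):
    num = sum((x[t] - xbar) * (x[t + h] - xbar) for t in range(n - h))
    den = sum((v - xbar) ** 2 for v in x)
    return num / den

for h in range(1, 6):
    expected = (1 - h / n) * (1 if h % 2 == 0 else -1)
    assert abs(rho_hat(h) - expected) < 1e-9
```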
1.10)
We define the difference operator:
∇m_t = m_t − m_{t−1}
If we define m_t = c_0 + c_1 t + c_2 t² + … + c_p t^p, then:
∇m_t = c_0(t⁰ − (t−1)⁰) + c_1(t¹ − (t−1)¹) + … + c_p(t^p − (t−1)^p) = Σ_{j=0}^{p} c_j (t^j − (t−1)^j)
By the binomial theorem:
(t−1)^j = Σ_{k=0}^{j} C(j,k) t^k (−1)^{j−k} = t^j + Σ_{k=0}^{j−1} C(j,k) t^k (−1)^{j−k}
so that:
∇m_t = − Σ_{j=1}^{p} c_j Σ_{k=0}^{j−1} C(j,k) t^k (−1)^{j−k}
We can clearly see that the inner summation (the binomial expansion minus its leading term) only reaches k = j − 1, so for every j the highest power of t is t^{j−1}, and the highest power overall is t^{p−1}. Hence ∇m_t is a polynomial of degree p − 1, and this result applies recursively: ∇^p m_t is a constant and therefore:
∇^{p+1} m_t = 0
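The conclusion ∇^{p+1} m_t = 0 is easy to verify for a concrete polynomial; a minimal Python sketch with p = 3 (coefficients chosen arbitrarily):

```python
# Apply the difference operator nabla repeatedly to a degree-3 polynomial.
def diff(seq):
    return [seq[i] - seq[i - 1] for i in range(1, len(seq))]

m = [2 + 3 * t - t ** 2 + 0.5 * t ** 3 for t in range(10)]  # p = 3
for _ in range(4):                    # nabla applied p + 1 = 4 times
    m = diff(m)
assert all(abs(v) < 1e-9 for v in m)  # identically zero
```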
1.11 a)
We can clearly see that, given the conditions of the filter:
Σ_{j=−q}^{q} a_j = 1
Σ_{j=−q}^{q} j a_j = 0
And therefore, for m_t = c_0 + c_1 t:
Σ_j a_j m_{t−j} = Σ_j a_j (c_0 + c_1(t − j)) = c_0 Σ_j a_j + c_1 t Σ_j a_j − c_1 Σ_j j a_j = c_0 + c_1 t = m_t
so the filter lets the linear trend pass without distortion.
1.11 b)
We calculate, for A_t = Σ_{j=−q}^{q} Z_{t−j}/(2q+1):
E[A_t] = Σ_{j=−q}^{q} E[Z_{t−j}]/(2q+1) = 0
Var(A_t) = Var( Σ_{j=−q}^{q} Z_{t−j}/(2q+1) ) = σ² Σ_{j=−q}^{q} (1/(2q+1))² = σ² (2q+1)/(2q+1)² = σ²/(2q+1)
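The σ²/(2q + 1) variance reduction can be checked by simulation; a seeded Python sketch (q = 2, σ = 1, names ours, loose tolerance for Monte Carlo error):

```python
import random

random.seed(3)
q, sigma, N = 2, 1.0, 200_000
a = [1.0 / (2 * q + 1)] * (2 * q + 1)           # equal filter weights
z = [random.gauss(0.0, sigma) for _ in range(N + 2 * q)]
# A_t = sum_j a_j Z_{t-j}: a moving average over a window of 2q+1 points.
smoothed = [sum(a[j] * z[t + j] for j in range(2 * q + 1)) for t in range(N)]
var = sum(s * s for s in smoothed) / N          # E[A_t] = 0, so this is Var(A_t)
assert abs(var - sigma ** 2 / (2 * q + 1)) < 0.01   # theory: 1/5 = 0.2
```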
1.12 a)
We can express, for m_t = Σ_{i=0}^{k} c_i t^i:
Σ_j a_j m_{t−j} = Σ_j a_j Σ_{i=0}^{k} c_i (t−j)^i = Σ_{i=0}^{k} c_i Σ_j a_j (t−j)^i
Expanding (t−j)^i with the binomial theorem:
Σ_j a_j (t−j)^i = Σ_{r=0}^{i} C(i,r) t^{i−r} (−1)^r Σ_j j^r a_j
And from this expression we can see that the output is going to be equal to m_t if and only if:
Σ_j a_j = 1
Σ_j j^r a_j = 0 for every r = 1, …, k
1.12 b)
We use what was obtained in a): using R we check that
Σ_j a_j = 1 and Σ_j j^r a_j = 0 for r = 1, 2, 3.

j  <- -7:7
aj <- (1/320)*c(-3,-6,-5,3,21,46,67,74,67,46,21,3,-5,-6,-3)
sum(aj)          # [1] 1
t(j) %*% aj      # [1,] 0
t(j^2) %*% aj    # [1,] 0
t(j^3) %*% aj    # [1,] 0
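Since the filter satisfies Σ a_j = 1 and Σ j^r a_j = 0 for r = 1, 2, 3, it should reproduce any cubic trend exactly at interior points; a quick Python sketch of this consequence (cubic coefficients chosen arbitrarily):

```python
# 15-point filter weights, as in the R check above.
a = [v / 320 for v in (-3, -6, -5, 3, 21, 46, 67, 74, 67, 46, 21, 3, -5, -6, -3)]
m = [1 - 2 * t + 3 * t ** 2 - 0.5 * t ** 3 for t in range(40)]  # arbitrary cubic

for t in range(7, 40 - 7):
    smoothed = sum(a[j + 7] * m[t - j] for j in range(-7, 8))
    assert abs(smoothed - m[t]) < 1e-6  # the cubic passes through unchanged
```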
1.14)
We use what was obtained in 1.12 a): using R we check that
Σ_j a_j = 1 and Σ_j j^r a_j = 0 for r = 1, 2, 3.

j  <- -2:2
aj <- (1/9)*c(-1,4,3,4,-1)
sum(aj)          # [1] 1
t(j) %*% aj      # [1,] 0
t(j^2) %*% aj    # [1,] 0
t(j^3) %*% aj    # [1,] 0
1.15 a)
Using the properties of the differencing operators and of the seasonality (s_t = s_{t−12}, so ∇_{12} removes the seasonal component and ∇ removes the linear trend), we can express:
Z_t = ∇∇_{12} X_t = Y_t − Y_{t−1} − Y_{t−12} + Y_{t−13}
E[Z_t] = 0
Cov(Z_{t+h}, Z_t) = Cov(Y_{t+h} − Y_{t+h−1} − Y_{t+h−12} + Y_{t+h−13}, Y_t − Y_{t−1} − Y_{t−12} + Y_{t−13})
= 4γ(h) − 2γ(h+1) − 2γ(h−1) + γ(h+11) + γ(h−11) − 2γ(h+12) − 2γ(h−12) + γ(h+13) + γ(h−13)
We can see that both the expected value and the covariance do not depend on t, therefore Z_t is weakly stationary.
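That ∇∇_{12} annihilates a linear trend plus a period-12 seasonal component (so only the noise Y_t survives in Z_t) can be checked directly; a minimal Python sketch with arbitrarily chosen trend and seasonal values:

```python
import math

# x_t = a + b*t + s_t with s_t of period 12 (no noise term here).
x = [5.0 + 0.3 * t + 2.0 * math.sin(2 * math.pi * t / 12) for t in range(60)]
d12 = [x[t] - x[t - 12] for t in range(12, len(x))]    # nabla_12: kills s_t
z = [d12[i] - d12[i - 1] for i in range(1, len(d12))]  # nabla: kills the trend
assert all(abs(v) < 1e-9 for v in z)
```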
1.15 b)
Using the properties of the differencing operators and of the seasonality, we can express:
Z_t = ∇_{12}² X_t = Y_t − 2Y_{t−12} + Y_{t−24}
E[Z_t] = 0
Cov(Z_{t+h}, Z_t) = Cov(Y_{t+h} − 2Y_{t+h−12} + Y_{t+h−24}, Y_t − 2Y_{t−12} + Y_{t−24})
= 6γ(h) − 4γ(h+12) − 4γ(h−12) + γ(h+24) + γ(h−24)
We can see that both the expected value and the covariance do not depend on t, therefore Z_t is weakly stationary.
1.18)
We show the resulting graphs of what is requested in the problem:
[Figure: plot of the original series, values roughly 8000–10200 over t = 0, …, 70.]
[Figure: plot of the transformed series, values roughly −600 to 600 over t = 10, …, 70.]
============================================
ITSM::(Tests of randomness on residuals)
============================================
Ljung-Box statistic = 202.40, Chi-Square (20 d.f.), p-value = 0.00000
[Figure: plot of the residuals, values roughly −2000 to 2000 over t = 10, …, 90.]