
7. Classical linear model assumptions for time series

Consider the stochastic process {(x_{t1}, x_{t2}, x_{t3}, ..., x_{tk}, y_t): t = 1, 2, ..., n} that follows the linear model:

y_t = β_0 + β_1 x_{t1} + β_2 x_{t2} + β_3 x_{t3} + ... + β_k x_{tk} + u_t

where {u_t: t = 1, 2, ..., n} is the sequence of error terms and n is the number of observations (time periods).

What are the minimum Gauss–Markov assumptions needed for the OLS estimates β̂_j, for j = 1, 2, ..., k, to be the best linear unbiased estimators (BLUE) conditional on the explanatory variables for all time periods (X)? Check all that apply.


Answer:

Options are missing.

The options for the above question are:

TS.1: Linear in parameters.

TS.2: No perfect collinearity.

TS.3: Zero conditional mean.

TS.4: Homoskedasticity.

TS.5: No serial correlation.

TS.6: Normality.

Hence, the correct answers are TS.1 through TS.5.

Step-by-step explanation:

Assumptions TS.1 through TS.5 are the minimum set of assumptions needed for the OLS estimates to be the best linear unbiased estimators (BLUE) conditional on the explanatory variables for all time periods; together they are the time-series Gauss–Markov assumptions.
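For reference, the three error-term conditions (TS.3–TS.5) can be written compactly in standard textbook notation (this restatement is not part of the original question), where X denotes the explanatory variables for all time periods:

```latex
% TS.3 (zero conditional mean), TS.4 (homoskedasticity), TS.5 (no serial correlation),
% all conditional on X = the explanatory variables for every time period.
\begin{align*}
\text{TS.3:}\quad & E(u_t \mid X) = 0, && t = 1, 2, \dots, n \\
\text{TS.4:}\quad & \operatorname{Var}(u_t \mid X) = \sigma^2, && t = 1, 2, \dots, n \\
\text{TS.5:}\quad & \operatorname{Corr}(u_t, u_s \mid X) = 0, && \text{for all } t \neq s
\end{align*}
```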

The normality assumption (TS.6) is not needed for the estimators to have the BLUE property; it is only required for exact finite-sample inference (t and F statistics).
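As an informal illustration (a minimal simulation sketch, not part of the original answer; the data-generating process, sample size, and coefficient values below are assumed purely for demonstration), OLS stays unbiased under TS.1–TS.5 even when the errors are clearly non-normal:

```python
import numpy as np

rng = np.random.default_rng(0)
n, reps = 200, 5000          # time periods per sample, number of simulated samples
beta0, beta1 = 1.0, 0.5      # true (assumed) parameters

slope_estimates = np.empty(reps)
for r in range(reps):
    x = rng.normal(size=n)                        # explanatory variable
    u = rng.exponential(scale=1.0, size=n) - 1.0  # skewed (non-normal) errors with mean 0,
                                                  # constant variance, independent across t
    y = beta0 + beta1 * x + u
    X = np.column_stack([np.ones(n), x])          # design matrix with intercept
    b = np.linalg.lstsq(X, y, rcond=None)[0]      # OLS estimates
    slope_estimates[r] = b[1]

# The average estimated slope should be close to the true beta1 = 0.5 even though
# the errors are skewed: normality (TS.6) is not needed for unbiasedness/BLUE.
print(slope_estimates.mean())
```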