7. Classical linear model assumptions for time series

Consider the stochastic process {(x_t1, x_t2, x_t3, …, x_tk, y_t): t = 1, 2, …, n} that follows the linear model:

y_t = β0 + β1·x_t1 + β2·x_t2 + β3·x_t3 + … + βk·x_tk + u_t

where {u_t: t = 1, 2, …, n} is the sequence of error terms and n is the number of observations (time periods).

What are the minimum Gauss–Markov assumptions needed for the OLS estimators β̂_j, for j = 1, 2, …, k, to be the best linear unbiased estimators (BLUE) conditional on the explanatory variables for all time periods (X)? Check all that apply.
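As background for the question, the unbiasedness part of the BLUE claim can be illustrated numerically. The sketch below (a hypothetical Monte Carlo using NumPy; the parameter values, sample size, and replication count are all illustrative choices, not from the exercise) generates errors that satisfy the Gauss–Markov conditions — zero conditional mean, constant variance, and no serial correlation — while holding X fixed, and shows that the OLS estimates average out to the true coefficients:

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 100, 2                               # n time periods, k regressors (illustrative)
beta = np.array([1.0, 0.5, -0.3])           # true (β0, β1, β2), chosen for illustration

# Fixed regressor matrix X (conditioning on X across all replications),
# with a constant column for the intercept and no perfect collinearity.
X = np.column_stack([np.ones(n), rng.normal(size=(n, k))])

reps = 5000
estimates = np.empty((reps, k + 1))
for r in range(reps):
    # i.i.d. errors: zero mean, homoskedastic, no serial correlation
    u = rng.normal(size=n)
    y = X @ beta + u
    # OLS estimate via least squares
    estimates[r] = np.linalg.lstsq(X, y, rcond=None)[0]

# Averaging over replications: the mean of the β̂_j is close to the true β_j,
# consistent with OLS being unbiased under the Gauss–Markov assumptions.
print(estimates.mean(axis=0))
```

Replacing `u` with serially correlated or heteroskedastic errors would leave the estimates unbiased but OLS would no longer be *best* (minimum variance) in the linear-unbiased class, which is why those two assumptions are needed for the full BLUE conclusion.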