Econ 120B, Fall03, Homework #2, Part II

1. In a two-variable regression model, suppose you unnecessarily included a constant term. In other words, the true model is Yt = βXt + ut, whereas you estimated the model Yt = α + βXt + ut. Derive the expected value of the OLS estimator of β given in Section 3.2. Is the estimator biased or not? If it is, state the condition(s) under which it might become unbiased. (A setup sketch for this derivation appears after the question list.)

2. "In testing hypotheses on several linear combinations of regression coefficients, the difference in the degrees of freedom between the restricted and unrestricted models is the same as the number of restrictions." Do you agree with this statement? If yes, prove it with an example; if not, give a counterexample. (A degrees-of-freedom count is sketched after the question list.)

Indicate whether each of the following statements is justified and explain your reasons (a small simulation illustrating questions 3-6 is sketched after the list):

3. "Because multicollinearity lowers t-statistics, all the variables with insignificant regression coefficients should be dropped from the model in one swoop because they are redundant."

4. "Multicollinearity raises the standard errors of regression coefficients, and hence t- and F-tests are invalid."

5. "If there is multicollinearity among independent variables, then a variable that appears significant may not indeed be so."

6. "High multicollinearity affects standard errors of estimated coefficients, and therefore estimates are not efficient."

7. "Adding an irrelevant variable (that is, one that has a truly zero coefficient) to a model has the same effect as high multicollinearity when it comes to the properties of unbiasedness, consistency, and efficiency of the OLS estimators of parameters." Carefully explain whether this statement is valid. If it is only partially valid, indicate which parts are.
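A setup sketch for question 1, assuming the deviation-from-means form of the slope estimator from Section 3.2 (a starting point under the usual assumptions, not the full solution):

\[
\hat{\beta} = \frac{\sum_t (X_t - \bar{X})(Y_t - \bar{Y})}{\sum_t (X_t - \bar{X})^2}
\]

Substituting the true model $Y_t = \beta X_t + u_t$, so that $Y_t - \bar{Y} = \beta (X_t - \bar{X}) + (u_t - \bar{u})$, and using $\sum_t (X_t - \bar{X}) = 0$ gives

\[
\hat{\beta} = \beta + \frac{\sum_t (X_t - \bar{X})\, u_t}{\sum_t (X_t - \bar{X})^2}
\]

Taking the expectation of the second term under the standard assumption $E(u_t \mid X) = 0$ settles the bias question.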
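For question 2, a generic degrees-of-freedom count may help organize the argument (a sketch assuming q independent linear restrictions on a model with k estimated coefficients and n observations):

\[
F = \frac{(\mathrm{SSR}_R - \mathrm{SSR}_U)/q}{\mathrm{SSR}_U/(n - k)}
\]

The unrestricted model has n − k degrees of freedom; each independent restriction eliminates one free coefficient, so the restricted model has n − (k − q), and the difference is q. Whether this counting supports or contradicts the quoted statement is what the question asks you to argue.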
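For questions 3-6, a small simulation can make the mechanics concrete. The sketch below (Python with NumPy only; the coefficient values and the 0.98 mixing weight are made up for illustration) fits OLS by the normal equations with two nearly collinear regressors:

import numpy as np

rng = np.random.default_rng(0)
n = 100
true_beta = np.array([1.0, 2.0, 3.0])        # intercept, slope on x1, slope on x2 (illustrative values)

x1 = rng.normal(size=n)
x2 = 0.98 * x1 + 0.02 * rng.normal(size=n)   # x2 is almost a copy of x1 -> severe multicollinearity
X = np.column_stack([np.ones(n), x1, x2])
y = X @ true_beta + rng.normal(size=n)

# OLS via the normal equations: beta_hat = (X'X)^(-1) X'y
XtX_inv = np.linalg.inv(X.T @ X)
beta_hat = XtX_inv @ (X.T @ y)

# Conventional OLS standard errors: square root of the diagonal of s^2 (X'X)^(-1)
resid = y - X @ beta_hat
s2 = resid @ resid / (n - X.shape[1])
se = np.sqrt(s2 * np.diag(XtX_inv))

print("estimates: ", beta_hat.round(2))
print("std errors:", se.round(2))
print("t-stats:   ", (beta_hat / se).round(2))

Rerunning the same script with x2 drawn independently of x1 (for example, x2 = rng.normal(size=n)) should show much smaller standard errors on the slopes, while the estimates stay centered on true_beta in both cases; that contrast is the one questions 4 and 6 ask you to evaluate.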