
Econ 120B, Fall03 Homework Solutions: Regression Analysis and Multicollinearity, Assignments of Electrical and Electronics Engineering

Solutions to Part II of Homework #2 for Econ 120B, Fall03. Includes derivations and explanations for questions on two-variable regression models, constant terms, degrees of freedom, multicollinearity, and irrelevant variables.

Typology: Assignments

2009/2010

Uploaded on 03/28/2010

koofers-user-cps

Econ 120B, Fall03, Homework #2, Part II

1. In a two-variable regression model, suppose you unnecessarily included a constant term. In other words, the true model is Yt = βXt + ut, whereas you estimated the model Yt = α + βXt + ut. Derive the expected value of the OLS estimator of β given in Section 3.2. Is the estimator biased or not? If yes, state the condition(s) under which it might become unbiased.

2. "In testing hypotheses on several linear combinations of regression coefficients, the difference in the degrees of freedom between the restricted and unrestricted models is the same as the number of restrictions." Do you agree with this statement? If yes, prove it with an example, and, if not, give a counterexample.

Indicate whether each of the following statements is justified and explain your reasons:

3. "Because multicollinearity lowers t-statistics, all the variables with insignificant regression coefficients should be dropped from the model in one swoop because they are redundant."

4. "Multicollinearity raises the standard errors of regression coefficients and hence t- and F-tests are invalid."

5. "If there is multicollinearity among independent variables, then a variable that appears significant may not indeed be so."

6. "High multicollinearity affects standard errors of estimated coefficients and therefore estimates are not efficient."

7. "Adding an irrelevant variable (that is, one that has a truly zero coefficient) to a model has the same effect as high multicollinearity when it comes to the properties of unbiasedness, consistency, and efficiency of the OLS estimators of parameters." Carefully explain whether this statement is valid. If it is partially valid, indicate which parts are.
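For question 1, one way the derivation might proceed — a sketch under my own assumptions (nonstochastic Xt and E[ut] = 0, which the question does not state explicitly) — substitutes the true no-intercept model into the with-intercept slope formula:

```latex
% Sketch: expected value of the slope estimator when the true model has no intercept.
% Assumptions (mine, not the assignment's): X_t nonstochastic, E[u_t] = 0.
\begin{align*}
\hat{\beta}
  &= \frac{\sum_t (X_t-\bar{X})(Y_t-\bar{Y})}{\sum_t (X_t-\bar{X})^2} \\
  &= \frac{\sum_t (X_t-\bar{X})\bigl(\beta X_t + u_t - \beta\bar{X} - \bar{u}\bigr)}
          {\sum_t (X_t-\bar{X})^2}
     && \text{substituting the true model } Y_t = \beta X_t + u_t \\
  &= \beta + \frac{\sum_t (X_t-\bar{X})\,u_t}{\sum_t (X_t-\bar{X})^2}
     && \text{since } \textstyle\sum_t (X_t-\bar{X})\,\bar{u} = 0 \\
E[\hat{\beta}]
  &= \beta + \frac{\sum_t (X_t-\bar{X})\,E[u_t]}{\sum_t (X_t-\bar{X})^2} = \beta .
\end{align*}
```

Under these assumptions the unnecessary constant does not bias the slope estimator, which parallels the usual irrelevant-variable result.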
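For question 2, a concrete example of the degrees-of-freedom claim (my own illustration, not taken from the assignment) can be built by substituting the restrictions into the model:

```latex
% Unrestricted model with three regressors (df_U = n - 4):
\[
Y_t = \beta_0 + \beta_1 X_{1t} + \beta_2 X_{2t} + \beta_3 X_{3t} + u_t
\]
% Impose q = 2 linear restrictions: \beta_1 + \beta_2 = 1 and \beta_3 = 0.
% Substituting \beta_2 = 1 - \beta_1 and \beta_3 = 0 yields the restricted model
% (df_R = n - 2):
\[
Y_t - X_{2t} = \beta_0 + \beta_1\,(X_{1t} - X_{2t}) + u_t
\]
% The difference in degrees of freedom equals the number of restrictions:
\[
df_R - df_U = (n-2) - (n-4) = 2 = q .
\]
```

Each independent linear restriction eliminates one free parameter, so the df difference equals q in general.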
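The multicollinearity statements in questions 3–6 can be checked numerically. The Monte Carlo sketch below (my own illustration, not part of the original solutions; all names are hypothetical) draws two regressors that are nearly collinear and shows that OLS stays unbiased while the sampling spread of the coefficient estimates balloons:

```python
import random
import statistics

random.seed(42)

def ols2(y, x1, x2):
    """OLS for y = b1*x1 + b2*x2 + u (no intercept) via the 2x2 normal equations."""
    s11 = sum(a * a for a in x1)
    s22 = sum(a * a for a in x2)
    s12 = sum(a * b for a, b in zip(x1, x2))
    s1y = sum(a * b for a, b in zip(x1, y))
    s2y = sum(a * b for a, b in zip(x2, y))
    det = s11 * s22 - s12 * s12  # shrinks toward 0 as x1, x2 become collinear
    return (s22 * s1y - s12 * s2y) / det, (s11 * s2y - s12 * s1y) / det

def simulate(collinear_noise, reps=500, n=50):
    """Draw repeated samples from y = x1 + x2 + u and record the estimate of b1."""
    b1_draws = []
    for _ in range(reps):
        x1 = [random.gauss(0, 1) for _ in range(n)]
        # x2 tracks x1 closely when collinear_noise is small -> severe multicollinearity
        x2 = [a + random.gauss(0, collinear_noise) for a in x1]
        y = [a + b + random.gauss(0, 1) for a, b in zip(x1, x2)]
        b1_draws.append(ols2(y, x1, x2)[0])
    return statistics.mean(b1_draws), statistics.stdev(b1_draws)

mean_hi, sd_hi = simulate(collinear_noise=0.1)  # near-collinear regressors
mean_lo, sd_lo = simulate(collinear_noise=1.0)  # mildly correlated regressors
print(f"high collinearity: mean b1 = {mean_hi:.2f}, sd = {sd_hi:.2f}")
print(f"low  collinearity: mean b1 = {mean_lo:.2f}, sd = {sd_lo:.2f}")
```

Both means come out near the true value of 1, but the high-collinearity spread is several times larger — consistent with the view that multicollinearity inflates standard errors (so t-statistics fall) without invalidating the tests or biasing the estimates.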