PS02

Due date: April 23 @ 11:59p

Instructions

Complete the questions below and show all work. You may either type or handwrite your answers; however, you must submit your problem set to Canvas as an html or pdf, meaning any handwritten solutions must be scanned and uploaded. The onus is on you to deliver an organized, clear, and legible submission of the correct file type that loads properly on Canvas.1 Double-check your submissions!

Integrity: If you are suspected of cheating, you will receive a zero—for the assignment and possibly for the course. Cheating includes copying work from your classmates, from the internet, and from previous problem sets. You are encouraged to work with your peers, but everyone must submit their own answers. Remember, the problem sets are designed to help you study for the midterm. By cheating, you do yourself a disservice.

Questions

001. Simple Linear Regression without an Intercept

In a simple linear regression model, the intercept term represents the expected value of the dependent variable when the independent variable is zero. In some cases, however, it makes sense to assume that the dependent variable equals zero when the independent variable is zero; in such situations, a simple linear regression without an intercept may be more appropriate. Suppose we have the following simple linear regression model without an intercept:

\[ Y_i = \beta X_i + u_i \]

where the corresponding residuals are written as:

\[ \hat{u}_i = Y_i - \hat{\beta} X_i \]

In this question, we will derive the OLS estimate of \(\beta\).2

a. Describe in words what the objective of the OLS estimator is and how the first order condition reaches that objective.

b. Set up and solve the first order condition (i.e., find \(\frac{\partial \text{RSS}}{\partial \hat{\beta}}=0\)).

c. Solve for the simple OLS estimator (i.e., solve for \(\hat{\beta}\) from the first order condition above).
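Once you have derived the estimator, you can sanity-check it numerically: with no intercept column, OLS reduces to a one-parameter fit through the origin. A minimal sketch with made-up data (the values of `x` and `y` below are illustrative only, not part of the problem):

```python
import numpy as np

# Made-up data for illustration only
x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([2.1, 3.9, 6.2, 7.8])

# No-intercept OLS: regress y on the single column x
beta_hat = np.linalg.lstsq(x.reshape(-1, 1), y, rcond=None)[0]
print(beta_hat[0])  # slope of the line through the origin
```

Compare the printed value with the number your derived formula produces on the same data; they should agree to machine precision.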



002. OLS Estimation

Suppose we have the following data on a dependent variable (Y) and an independent variable (X):

Observation Number    Y    X
                 1   10   12
                 2    2   14
                 3    6   16
We wish to estimate the simple linear regression model:

\[ Y_i = \beta_1 + \beta_2X_i + u_i \]

a. Calculate the OLS estimates \(\hat{\beta}_1\) and \(\hat{\beta}_2\). Show your work.

b. Calculate the \(R^2\) for this regression. Show your work.
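After computing the estimates and \(R^2\) by hand, you can verify your arithmetic numerically. This sketch uses made-up data rather than the table above, so it does not give away the answer:

```python
import numpy as np

# Made-up data for illustration only (not the problem's table)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.0, 4.1, 5.9, 8.2, 9.8])

# Textbook formulas: slope from centered cross-products, intercept from the means
b2 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b1 = y.mean() - b2 * x.mean()

# R^2 = 1 - RSS/TSS
y_hat = b1 + b2 * x
rss = np.sum((y - y_hat) ** 2)
tss = np.sum((y - y.mean()) ** 2)
r2 = 1 - rss / tss

print(b1, b2, r2)
```

Substituting the problem's three observations for `x` and `y` lets you check your hand calculations for both parts.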




003. Interpretation of OLS Estimates

Suppose you are trying to figure out how to make cars more fuel-efficient, so you run a regression of miles per gallon (MPG) on weight to estimate the effect of weight on MPG.

\[ mpg_i = \beta_0 + \beta_1 wt_i + u_i \]

Estimating this model produces the following output:

# A tibble: 2 × 5
  term        estimate std.error statistic  p.value
  <chr>          <dbl>     <dbl>     <dbl>    <dbl>
1 (Intercept)    37.3      1.88      19.9  8.24e-19
2 wt             -5.34     0.559     -9.56 1.29e-10

a. Give the interpretation of the two estimates.

b. The regression above produced an R-squared of \(0.75\). What does this value tell us?
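One way to check your reading of the coefficients is a quick prediction sketch. The fitted line below simply plugs in the estimates from the output above; the weight values passed in are hypothetical and in whatever units the regression's weight variable uses:

```python
# Fitted line from the output above: mpg_hat = 37.3 - 5.34 * wt
def predict_mpg(wt):
    return 37.3 - 5.34 * wt

# Prediction at a hypothetical weight
print(predict_mpg(3.0))

# A one-unit increase in weight changes predicted MPG by the slope estimate
print(predict_mpg(4.0) - predict_mpg(3.0))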

Footnotes

  1. Do not simply change the file extension and submit (e.g., renaming a document from .jpg \(\rightarrow\) .pdf).↩︎

  2. Hint: This derivation follows the derivation of OLS with an intercept—with a lot less algebra. The result will look different yet be functionally equivalent to the standard result.↩︎