Regression without intercept: deriving \hat{\beta}_1 in least squares (no matrices)

In An Introduction to Statistical Learning (James et al.), Section 3.7, Exercise 5 states that the formula for \hat{\beta}_1 in linear regression without an intercept is

\hat{\beta}_1 = \frac{\sum_{i=1}^{n} x_i y_i}{\sum_{i=1}^{n} x_i^2},

where \hat{\beta}_0 = \bar{y} - \hat{\beta}_1 \bar{x} and \hat{\beta}_1 = \frac{S_{xy}}{S_{xx}} are the usual OLS estimates for simple linear regression with an intercept (here S_{xy} = \sum_{i=1}^{n} (x_i - \bar{x})(y_i - \bar{y}) and S_{xx} = \sum_{i=1}^{n} (x_i - \bar{x})^2).
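As a quick numerical sanity check (not part of the book exercise; the data and variable names below are made up for illustration), the closed form can be compared against a generic least-squares solve, e.g. in Python with NumPy:

    import numpy as np

    # Made-up data for illustration only
    rng = np.random.default_rng(0)
    x = rng.normal(size=100)
    y = 2.5 * x + rng.normal(size=100)  # true model has no intercept

    # Closed-form estimate for regression through the origin
    beta1_hat = np.sum(x * y) / np.sum(x ** 2)

    # Generic least-squares solve with a single column for x (no intercept column)
    beta1_lstsq = np.linalg.lstsq(x.reshape(-1, 1), y, rcond=None)[0][0]

    print(beta1_hat, beta1_lstsq)  # the two values agree

Both print the same estimate, since solving least squares for a single column reduces to exactly this formula.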

This is not the actual exercise; I am merely wondering how to derive this formula. How can it be derived without using matrix algebra?

My attempt: with \hat{\beta}_0 = 0, we have \hat{\beta}_1 = \frac{\bar{y}}{\bar{x}} = \frac{S_{xy}}{S_{xx}}.

After some algebra, it can be shown that S_{xy} = \sum_{i=1}^{n} x_i y_i - n\bar{x}\bar{y} and S_{xx} = \sum_{i=1}^{n} x_i^2 - n\bar{x}^2. From here, I’m stuck.
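(For reference, both identities follow from expanding the products and using \sum_{i=1}^{n} x_i = n\bar{x} and \sum_{i=1}^{n} y_i = n\bar{y}:

S_{xy} = \sum_{i=1}^{n} x_i y_i - \bar{x}\sum_{i=1}^{n} y_i - \bar{y}\sum_{i=1}^{n} x_i + n\bar{x}\bar{y} = \sum_{i=1}^{n} x_i y_i - n\bar{x}\bar{y},

and taking y_i = x_i throughout gives the S_{xx} identity.)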

Answer

This is straightforward from the Ordinary Least Squares definition. If there is no intercept, one is minimizing R(\beta) = \sum_{i=1}^{n} (y_i - \beta x_i)^2. This is smooth as a function of \beta, so all minima (or maxima) occur where the derivative is zero. Differentiating with respect to \beta, we get -\sum_{i=1}^{n} 2(y_i - \beta x_i) x_i. Setting this to zero and solving for \beta gives the formula.
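Spelling out that last step: setting the derivative to zero and rearranging,

-\sum_{i=1}^{n} 2(y_i - \beta x_i) x_i = 0 \quad\Longleftrightarrow\quad \beta \sum_{i=1}^{n} x_i^2 = \sum_{i=1}^{n} x_i y_i \quad\Longleftrightarrow\quad \hat{\beta}_1 = \frac{\sum_{i=1}^{n} x_i y_i}{\sum_{i=1}^{n} x_i^2}.

Since the second derivative, 2\sum_{i=1}^{n} x_i^2, is positive whenever some x_i \neq 0, this critical point is the unique minimum.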

Attribution
Source: Link, Question Author: Clarinetist, Answer Author: Glen_b
