In *An Introduction to Statistical Learning* (James et al.), Section 3.7, Exercise 5 states that the formula for $\hat\beta_1$, assuming linear regression without an intercept, is

$$\hat\beta_1 = \frac{\sum_{i=1}^{n} x_i y_i}{\sum_{i=1}^{n} x_i^2},$$

where $\hat\beta_0 = \bar{y} - \hat\beta_1 \bar{x}$ and $\hat\beta_1 = S_{xy}/S_{xx}$ are the usual estimates under OLS for simple linear regression ($S_{xy} = \sum_{i=1}^{n} (x_i - \bar{x})(y_i - \bar{y})$).

This is not the actual exercise; I am merely wondering how to derive the equation. Without using matrix algebra, how do I derive it? My attempt: with $\hat\beta_0 = 0$, we have $\hat\beta_1 = \bar{y}/\bar{x} = S_{xy}/S_{xx}$.

After some algebra, it can be shown that $S_{xy} = \sum_{i=1}^{n} x_i y_i - n\bar{x}\bar{y}$ and $S_{xx} = \sum_{i=1}^{n} x_i^2 - n\bar{x}^2$. From here, I'm stuck.
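As a quick numerical sanity check of those two expansions (a sketch using NumPy; the variable names are mine, not from the question):

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=20)
y = rng.normal(size=20)
n = len(x)

# Centered definitions of S_xy and S_xx
Sxy = np.sum((x - x.mean()) * (y - y.mean()))
Sxx = np.sum((x - x.mean()) ** 2)

# The expanded forms: sum(x_i y_i) - n*xbar*ybar and sum(x_i^2) - n*xbar^2
assert np.isclose(Sxy, np.sum(x * y) - n * x.mean() * y.mean())
assert np.isclose(Sxx, np.sum(x ** 2) - n * x.mean() ** 2)
```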

**Answer**

This is straightforward from the ordinary least squares definition. If there is no intercept, one is minimizing $R(\beta) = \sum_{i=1}^{n} (y_i - \beta x_i)^2$. This is smooth as a function of $\beta$, so all minima (or maxima) occur where the derivative is zero. Differentiating with respect to $\beta$ gives $-2\sum_{i=1}^{n} (y_i - \beta x_i) x_i$. Setting this to zero yields $\sum_{i=1}^{n} x_i y_i = \beta \sum_{i=1}^{n} x_i^2$, and solving for $\beta$ gives the formula.
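The closed-form slope above can be checked numerically against a least-squares solver. A minimal sketch with NumPy (synthetic data and names are my own, for illustration): fitting $y$ on a single-column design matrix, with no intercept column, should reproduce $\sum x_i y_i / \sum x_i^2$.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=50)
y = 2.5 * x + rng.normal(scale=0.3, size=50)

# Closed-form no-intercept slope: sum(x_i * y_i) / sum(x_i^2)
beta_formula = np.sum(x * y) / np.sum(x ** 2)

# Same fit via least squares on a design matrix with a single column (no intercept)
beta_lstsq = np.linalg.lstsq(x[:, None], y, rcond=None)[0][0]

assert np.isclose(beta_formula, beta_lstsq)
```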

**Attribution**
*Source: Link, Question Author: Clarinetist, Answer Author: Glen_b*