What is the difference between least squares and linear regression? Is it the same thing?

**Answer**

Linear regression assumes a linear relationship between the independent and dependent variables, but it does not, by itself, specify how the model is fitted. Least-squares fitting is simply one of the possibilities; a linear model can also be trained by other criteria, such as least absolute deviations or penalized (ridge/lasso) objectives.
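To illustrate the distinction, here is a minimal sketch (with made-up toy data) fitting the *same* linear model two ways: ordinary least squares via the normal equations, and least absolute deviations via subgradient descent. The data, step size, and iteration counts are illustrative assumptions, not from the original answer.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data (hypothetical): y = 2x + 1 + noise
x = rng.uniform(0, 10, size=50)
y = 2.0 * x + 1.0 + rng.normal(scale=0.5, size=50)
X = np.column_stack([np.ones_like(x), x])  # design matrix with intercept column

# Fit 1: least squares, closed form via the normal equations X'X b = X'y
beta_ols = np.linalg.solve(X.T @ X, X.T @ y)

# Fit 2: least absolute deviations, a different criterion for the same
# linear model, minimized here by plain subgradient descent
beta_lad = np.zeros(2)
for _ in range(10000):
    resid = X @ beta_lad - y
    grad = X.T @ np.sign(resid) / len(y)  # subgradient of mean |resid|
    beta_lad -= 0.005 * grad

print(beta_ols)  # both estimates land near the true [1, 2]
print(beta_lad)
```

Both fits produce a linear model; only the loss being minimized differs, which is exactly why "linear regression" and "least squares" are not synonyms.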

Non-linear least squares is common (https://en.wikipedia.org/wiki/Non-linear_least_squares). For example, the popular Levenberg–Marquardt algorithm solves something like:

$$\hat{\beta} = \underset{\beta}{\operatorname{arg\,min}}\; S(\beta) \equiv \underset{\beta}{\operatorname{arg\,min}} \sum_{i=1}^{m} \left[ y_i - f(x_i, \beta) \right]^2$$

It is a least squares optimization but the model is not linear.
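As a sketch of the converse case, here is a minimal hand-rolled Levenberg–Marquardt loop fitting a nonlinear model $f(x, \beta) = \beta_0 e^{\beta_1 x}$ by least squares. The model, toy data, starting point, and damping schedule are all illustrative assumptions, not part of the original answer.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical nonlinear model: f(x, beta) = beta0 * exp(beta1 * x)
def f(x, b):
    return b[0] * np.exp(b[1] * x)

def jacobian(x, b):
    e = np.exp(b[1] * x)
    return np.column_stack([e, b[0] * x * e])

# Toy data generated from beta = [3.0, -0.8] plus noise
x = np.linspace(0, 4, 40)
y = f(x, [3.0, -0.8]) + rng.normal(scale=0.05, size=x.size)

# Minimal Levenberg-Marquardt: damped Gauss-Newton steps on S(beta)
beta = np.array([1.0, -0.1])   # starting guess
lam = 1e-3                     # damping factor
for _ in range(100):
    r = y - f(x, beta)         # residuals
    J = jacobian(x, beta)
    A = J.T @ J
    step = np.linalg.solve(A + lam * np.diag(np.diag(A)), J.T @ r)
    if np.sum((y - f(x, beta + step)) ** 2) < np.sum(r ** 2):
        beta = beta + step     # cost decreased: accept, reduce damping
        lam = max(lam / 10, 1e-12)
    else:
        lam *= 10              # cost increased: reject, increase damping

print(beta)  # recovers values near the true [3.0, -0.8]
```

The objective is exactly the least-squares criterion $S(\beta)$ above, yet nothing here is linear regression: the model is nonlinear in $\beta$, so the fit requires an iterative solver rather than a closed-form solution.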

**They are not the same thing**.

**Attribution**
*Source: Link, Question Author: bbadyalina, Answer Author: Firebug*