# What is the time complexity of Lasso regression?

What is the asymptotic time complexity of Lasso regression as either the number of rows or columns grows?

Recall that lasso is a linear model with an $\ell_1$ penalty on the coefficients.

Finding the parameters can be formulated as an unconstrained optimization problem, where the parameters are given by

$\arg \min_\beta ||y - X\beta||^2 + \alpha||\beta||_1$.
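To make the objective concrete, here is a minimal NumPy sketch that evaluates it for a given coefficient vector (the function name is mine):

```python
import numpy as np

def lasso_objective(beta, X, y, alpha):
    """Value of ||y - X beta||^2 + alpha * ||beta||_1."""
    resid = y - X @ beta
    return resid @ resid + alpha * np.abs(beta).sum()
```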

In the constrained formulation the parameters are given by

$\arg \min_\beta ||y - X\beta||^2 \quad \text{s.t.} \quad ||\beta||_1 \le \alpha$

(for a suitably chosen $\alpha$; the constants in the two formulations are not the same, but the two problems trace out the same solution path).

This is a quadratic programming problem, and QPs with convex objectives are solvable in polynomial time.
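To see why it is a QP, split $\beta$ into its positive and negative parts, $\beta = \beta^+ - \beta^-$ with $\beta^+, \beta^- \ge 0$ (a standard reformulation):

$$\arg \min_{\beta^+, \beta^-} ||y - X(\beta^+ - \beta^-)||^2 \quad \text{s.t.} \quad \sum_j (\beta^+_j + \beta^-_j) \le \alpha, \quad \beta^+ \ge 0, \quad \beta^- \ge 0$$

The objective is quadratic in $(\beta^+, \beta^-)$ and every constraint is linear, so this is a quadratic program in $2p$ variables.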

Almost all convex optimization routines, even for flexible nonlinear models like neural networks, rely on computing the derivative of the objective with respect to the parameters. The penalty term $\alpha||\beta||_1$, however, is not differentiable at zero, so plain gradient-based methods do not apply directly and one relies on different techniques instead: subgradient methods, proximal (soft-thresholding) methods, coordinate descent, and LARS, among others. There is a review paper on the subject, *Least Squares Optimization with L1-Norm Regularization*.

The time complexity of iterative convex optimization is tricky to analyze because it depends on the convergence tolerance. What is easier to pin down is the per-iteration cost: a full sweep of coordinate descent, for example, touches every entry of $X$ once and therefore costs $O(np)$ for $n$ observations and $p$ features. How many sweeps are needed depends on the conditioning of the problem and the tolerance; in practice, iterative methods often converge in fewer epochs as the number of observations grows.
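One of the standard methods, coordinate descent with soft-thresholding, is short enough to sketch and makes the per-sweep cost visible. This is a minimal NumPy illustration of the technique, not a production solver; the function names and the fixed sweep count are my own choices, and $\alpha$ matches the penalized objective above:

```python
import numpy as np

def soft_threshold(rho, t):
    """Soft-thresholding operator, the proximal map of t * |.|."""
    return np.sign(rho) * max(abs(rho) - t, 0.0)

def lasso_cd(X, y, alpha, n_sweeps=100):
    """Coordinate descent for  min_beta ||y - X beta||^2 + alpha ||beta||_1.

    Each full sweep over the p coordinates costs O(n p): every coordinate
    update is a pair of O(n) vector operations.
    """
    n, p = X.shape
    beta = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0)   # precomputed x_j' x_j, O(n p) once
    resid = y - X @ beta            # running residual, updated in O(n)
    for _ in range(n_sweeps):
        for j in range(p):
            # rho = x_j' (resid + x_j * beta_j), the partial-residual correlation
            rho = X[:, j] @ resid + col_sq[j] * beta[j]
            new_bj = soft_threshold(rho, alpha / 2.0) / col_sq[j]
            resid += X[:, j] * (beta[j] - new_bj)   # keep residual in sync
            beta[j] = new_bj
    return beta
```

With $\alpha = 0$ the soft-threshold is the identity and the iteration converges to the ordinary least-squares solution; increasing $\alpha$ shrinks the coefficients toward zero.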