Over the last few decades, the (unconstrained) LASSO $$ \min_{x\in \mathbb{R}^n} \frac12 \lVert A x-b\rVert_2^2 + \lambda \lVert x\rVert_1 $$ has become an indispensable tool in statistical learning, data science, and signal processing, thanks to its ability to efficiently recover sparse approximate solutions to (underdetermined) linear systems.
In this talk, we will present a novel variational analysis of this popular optimization program. First, we will establish smoothness results as well as Lipschitz properties for the optimal value and optimal solution maps of the LASSO as functions of the measurement vector $b$, the sampling matrix $A$, and, most notably, the tuning parameter $\lambda$. Then, we will illustrate how to apply the proposed variational analysis in the context of compressed sensing, validating our theoretical findings with numerical experiments. Finally, we will discuss extensions to related optimization problems such as the square-root LASSO.
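As a minimal numerical illustration of the kind of stability in $\lambda$ discussed above, the sketch below solves the LASSO with a plain ISTA (proximal gradient) loop, not the authors' method, for two nearby tuning parameters and compares the resulting solutions; the problem sizes, random data, and parameter values are illustrative assumptions, not taken from the talk.

```python
import numpy as np

def soft_threshold(z, t):
    # Proximal operator of t * ||.||_1 (componentwise soft-thresholding).
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_ista(A, b, lam, n_iter=5000):
    # Solve min_x 0.5*||Ax - b||_2^2 + lam*||x||_1 via ISTA
    # (proximal gradient descent) with step size 1/L, L = ||A||_2^2.
    L = np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)
        x = soft_threshold(x - grad / L, lam / L)
    return x

# Illustrative underdetermined system with a sparse ground truth.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 50))
x_true = np.zeros(50)
x_true[:3] = [1.0, -2.0, 1.5]
b = A @ x_true

# Solutions for two nearby tuning parameters remain close, consistent
# with Lipschitz-type dependence of the solution map on lambda.
x1 = lasso_ista(A, b, lam=0.10)
x2 = lasso_ista(A, b, lam=0.11)
print(np.linalg.norm(x1 - x2))
```

Here the perturbation of $\lambda$ is small, so the gap between the two solutions is correspondingly small; in practice one would trace out the whole solution path over a grid of $\lambda$ values.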