Generic error bounds for the generalized lasso with sub-exponential data

Martin Genzel, Christian Kipp

Research output: Other contribution › Academic

Abstract

This work performs a non-asymptotic analysis of the generalized Lasso under the assumption of sub-exponential data. Our main results continue recent research on the benchmark case of (sub-)Gaussian sample distributions and thereby explore which conclusions remain valid beyond it. While many statistical features of the generalized Lasso are unaffected (e.g., consistency), the key difference manifests itself in how the complexity of the hypothesis set is measured. It turns out that the estimation error can be controlled by means of two complexity parameters that arise naturally from a generic-chaining-based proof strategy. The output model can be non-realizable, while the only requirement on the input vector is a generic concentration inequality of Bernstein type, which can be verified for a variety of sub-exponential distributions. This abstract approach allows us to reproduce, unify, and extend previously known guarantees for the generalized Lasso. In particular, we present applications to semi-parametric output models and phase retrieval via the lifted Lasso. Moreover, our findings are discussed in the context of sparse recovery and high-dimensional estimation problems.

MSC Codes: 60D05, 62F30, 62F35, 90C25
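To make the estimator under discussion concrete: the generalized Lasso constrains a least-squares fit to a convex hypothesis set. The following is a minimal, hypothetical sketch (not the authors' code) for the classical special case where the hypothesis set is an ℓ1-ball, solved by projected gradient descent; the projection routine and all parameter names are illustrative assumptions.

```python
import numpy as np

def project_l1_ball(v, radius):
    """Euclidean projection of v onto the l1-ball {x : ||x||_1 <= radius},
    via the standard sorting-based algorithm."""
    if np.sum(np.abs(v)) <= radius:
        return v.copy()
    u = np.sort(np.abs(v))[::-1]           # magnitudes, descending
    cssv = np.cumsum(u)
    # largest index rho with u[rho] > (cssv[rho] - radius) / (rho + 1)
    rho = np.nonzero(u * np.arange(1, len(v) + 1) > (cssv - radius))[0][-1]
    theta = (cssv[rho] - radius) / (rho + 1.0)
    return np.sign(v) * np.maximum(np.abs(v) - theta, 0.0)

def generalized_lasso(A, y, radius, n_iter=500):
    """Projected gradient descent for
        min_x ||A x - y||_2^2   subject to   ||x||_1 <= radius,
    i.e., the generalized Lasso with an l1-ball hypothesis set."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2  # 1/L with L the gradient's Lipschitz constant
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - y)
        x = project_l1_ball(x - step * grad, radius)
    return x
```

For instance, with a 200 x 50 Gaussian measurement matrix and a 3-sparse signal, noiseless observations, and the radius set to the signal's ℓ1-norm, the iterates recover the signal to high accuracy.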
Original language: English
Publisher: arXiv
Publication status: Published - 18 May 2020

Keywords

  • Generalized Lasso
  • Generic chaining
  • High-dimensional parameter estimation
  • Statistical learning
  • Sub-exponential data

