[Variance-free Dantzig selector with application to robust estimation]
Calibrating sparse estimation methods such as the Lasso and the Dantzig selector often requires prior knowledge of the error variance. We propose a method that removes this assumption by estimating the regression vector and the error variance jointly. The resulting estimator can be computed efficiently by solving a second-order cone program. Moreover, we provide risk guarantees for this estimator that are almost as strong as those available for the estimator that uses knowledge of the error variance.
Sparse estimation methods based on convex relaxation, such as the Lasso and the Dantzig selector, are powerful tools for estimating high-dimensional linear models. However, properly tuning these methods typically requires knowledge of the noise variance. In this paper, we propose a new approach to the joint estimation of the sparse regression vector and the noise variance in high-dimensional linear regression. The method is closely related to maximum a posteriori estimation and has the attractive feature of being computable by solving a simple second-order cone program (SOCP). We establish sharp nonasymptotic risk bounds for the proposed estimator and show how it can be applied to the problem of robust estimation.
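To make the SOCP formulation concrete, the following is a minimal sketch, written in Python with CVXPY, of a joint (beta, sigma) estimator of the Dantzig-selector type. The particular constraints, the penalty on sigma, and the tuning level lam are illustrative assumptions made for this example; they are not necessarily the exact program studied in the paper.

import numpy as np
import cvxpy as cp

# Synthetic high-dimensional regression data (all sizes are illustrative).
rng = np.random.default_rng(0)
n, p, s = 100, 200, 5
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[:s] = 1.0
sigma_true = 0.5
y = X @ beta_true + sigma_true * rng.standard_normal(n)

# Joint optimization variables: regression vector and noise level.
beta = cp.Variable(p)
sigma = cp.Variable(nonneg=True)
lam = 2.0 * np.sqrt(np.log(p) / n)  # assumed variance-free tuning level

residual = y - X @ beta
constraints = [
    # Dantzig-type correlation constraint, scaled by the unknown sigma.
    cp.norm(X.T @ residual, "inf") / n <= lam * sigma,
    # Second-order cone constraint linking sigma to the residual size.
    cp.norm(residual, 2) <= np.sqrt(n) * sigma,
]
# Penalize both the l1-norm of beta and the noise level.
objective = cp.Minimize(cp.norm1(beta) + np.sqrt(n) * lam * sigma)
prob = cp.Problem(objective, constraints)
prob.solve()

print("estimated sigma:", sigma.value)
print("recovered support:", np.flatnonzero(np.abs(beta.value) > 1e-3))

Both constraints are second-order cone representable, so generic conic solvers (e.g. ECOS or SCS, the CVXPY defaults) handle the problem directly, and the tuning level lam does not involve the unknown noise variance.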
Arnak S. Dalalyan
@article{CRMATH_2012__350_15-16_785_0,
  author    = {Arnak S. Dalalyan},
  title     = {SOCP based variance free {Dantzig} {Selector} with application to robust estimation},
  journal   = {Comptes Rendus. Math\'ematique},
  pages     = {785--788},
  publisher = {Elsevier},
  volume    = {350},
  number    = {15-16},
  year      = {2012},
  doi       = {10.1016/j.crma.2012.09.016},
  language  = {en},
}
Arnak S. Dalalyan. SOCP based variance free Dantzig Selector with application to robust estimation. Comptes Rendus. Mathématique, Volume 350 (2012) no. 15-16, pp. 785-788. doi: 10.1016/j.crma.2012.09.016. https://comptes-rendus.academie-sciences.fr/mathematique/articles/10.1016/j.crma.2012.09.016/