Comptes Rendus
Statistics
SOCP based variance free Dantzig Selector with application to robust estimation
[Sélecteur de Dantzig indépendant de la variance et application à lʼestimation robuste]
Comptes Rendus. Mathématique, Volume 350 (2012) no. 15-16, pp. 785-788.

The calibration of sparse estimation methods such as the Lasso and the Dantzig selector often requires prior knowledge of the noise variance. We propose a method that dispenses with this assumption by estimating the regression vector and the error variance jointly. The resulting estimator can be computed efficiently by solving a second-order cone program. Moreover, we provide risk guarantees for this estimator that are nearly as strong as those available when the error variance is known.

Sparse estimation methods based on ℓ1 relaxation, such as the Lasso and the Dantzig Selector, are powerful tools for estimating high-dimensional linear models. However, tuning these methods properly often requires knowledge of the noise variance. In this paper, we propose a new approach to the joint estimation of the sparse vector and the noise variance in high-dimensional linear regression. The method is closely related to maximum a posteriori estimation and has the attractive feature of being computable by solving a simple second-order cone program (SOCP). We establish sharp nonasymptotic risk bounds for the proposed estimator and show how it can be applied to the problem of robust estimation.
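To make the idea concrete, the sketch below casts a Dantzig-type estimator in which the noise level sigma is optimized jointly with the regression vector as a second-order cone program, solved here with cvxpy. This is only an illustration of the general approach described in the abstract, not the exact program studied in the paper: the tuning constants lam and mu, the cone constraint coupling sigma to the residuals, and the simulated data are assumptions made for the example.

# Illustrative sketch only: a Dantzig-type SOCP in which the noise level
# sigma is estimated jointly with beta; the exact program of the paper may
# differ. Requires numpy and cvxpy.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
n, p, s = 100, 200, 5                         # samples, dimension, sparsity
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[:s] = 2.0
y = X @ beta_true + 0.5 * rng.standard_normal(n)

lam = np.sqrt(2 * n * np.log(p))              # assumed threshold; typical scaling when columns of X have norm about sqrt(n)
mu = np.sqrt(n)                               # assumed weight on the sigma term

beta = cp.Variable(p)
sigma = cp.Variable(nonneg=True)
residual = y - X @ beta
constraints = [
    cp.norm_inf(X.T @ residual) <= lam * sigma,   # Dantzig-type constraint, rescaled by the unknown sigma
    cp.norm(residual, 2) <= np.sqrt(n) * sigma,   # second-order cone constraint tying sigma to the residual scale
]
prob = cp.Problem(cp.Minimize(cp.norm1(beta) + mu * sigma), constraints)
prob.solve()

print("estimated noise level:", sigma.value)
print("recovered support:", np.flatnonzero(np.abs(beta.value) > 1e-3))

Because sigma enters only through linear and second-order cone constraints, the whole problem remains an SOCP; if sigma is instead fixed in advance, the first constraint alone reproduces the classical Dantzig selector of Candès and Tao.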

DOI : 10.1016/j.crma.2012.09.016
Arnak S. Dalalyan 1

1 ENSAE/CREST/GENES, 3, avenue Pierre-Larousse, 92245 Malakoff cedex, France
@article{CRMATH_2012__350_15-16_785_0,
     author = {Arnak S. Dalalyan},
     title = {SOCP based variance free {Dantzig} {Selector} with application to robust estimation},
     journal = {Comptes Rendus. Math\'ematique},
     pages = {785--788},
     publisher = {Elsevier},
     volume = {350},
     number = {15-16},
     year = {2012},
     doi = {10.1016/j.crma.2012.09.016},
     language = {en},
}
Arnak S. Dalalyan. SOCP based variance free Dantzig Selector with application to robust estimation. Comptes Rendus. Mathématique, Volume 350 (2012) no. 15-16, pp. 785-788. doi : 10.1016/j.crma.2012.09.016. https://comptes-rendus.academie-sciences.fr/mathematique/articles/10.1016/j.crma.2012.09.016/

