F. Abramovich, Y. Benjamini, D. L. Donoho, and I. M. Johnstone, Adapting to unknown sparsity by controlling the false discovery rate, The Annals of Statistics, vol.34, issue.2, pp.584-653, 2006.

H. Akaike, Information theory and an extension of the maximum likelihood principle, Second International Symposium on Information Theory (Tsahkadsor, 1971), pp.267-281, 1973.

P. Alquier, Iterative feature selection in least square regression estimation, Annales de l'Institut Henri Poincaré (B) Probability and Statistics, vol.44, issue.1, pp.47-88, 2008.
DOI : 10.1214/07-AIHP106

URL : https://hal.archives-ouvertes.fr/hal-00013780

A. Antoniadis and J. Fan, Regularization of Wavelet Approximations, Journal of the American Statistical Association, vol.96, issue.455, pp.939-967, 2001.
DOI : 10.1198/016214501753208942

F. Bach, High-dimensional non-linear variable selection through hierarchical kernel learning, 2009.
URL : https://hal.archives-ouvertes.fr/hal-00413473

Y. Baraud, S. Huet, and B. Laurent, Adaptive tests of linear hypotheses by model selection, Ann. Statist, vol.31, issue.1, pp.225-251, 2003.

Y. Baraud, S. Huet, and B. Laurent, Testing convex hypotheses on the mean of a Gaussian vector. Application to testing qualitative hypotheses on a regression function, The Annals of Statistics, vol.33, issue.1, pp.214-257, 2005.
DOI : 10.1214/009053604000000896

URL : https://hal.archives-ouvertes.fr/hal-00756077

K. Bertin and G. Lecué, Selection of variables and dimension reduction in high-dimensional non-parametric regression, Electronic Journal of Statistics, vol.2, pp.1224-1241, 2008.
DOI : 10.1214/08-EJS327

P. J. Bickel, Y. Ritov, and A. B. Tsybakov, Hierarchical selection of variables in sparse high-dimensional regression. Borrowing Strength: Theory Powering Applications - A Festschrift for Lawrence D. Brown, IMS Collections, vol.6, pp.56-69, 2010.

P. J. Bickel and E. Levina, Some theory for Fisher's linear discriminant function, 'naive Bayes', and some alternatives when there are many more variables than observations, Bernoulli, vol.10, issue.6, pp.989-1010, 2004.
DOI : 10.3150/bj/1106314847

L. Breiman, Better Subset Regression Using the Nonnegative Garrote, Technometrics, vol.37, issue.4, pp.373-384, 1995.

L. D. Brown and M. G. Low, Asymptotic equivalence of nonparametric regression and white noise, Ann. Statist, vol.24, issue.6, pp.2384-2398, 1996.

L. D. Brown, A. V. Carter, M. G. Low, and C. Zhang, Equivalence theory for density estimation, Poisson processes and Gaussian white noise with drift, Ann. Statist, vol.32, issue.5, pp.2074-2097, 2004.

F. Bunea and A. Barbu, Dimension reduction and variable selection in case control studies via regularized likelihood optimization, Electronic Journal of Statistics, vol.3, pp.1257-1287, 2009.
DOI : 10.1214/09-EJS537

C. Butucea, Goodness-of-fit testing and quadratic functional estimation from indirect observations, The Annals of Statistics, vol.35, issue.5, pp.1907-1930, 2007.
DOI : 10.1214/009053607000000118

URL : https://hal.archives-ouvertes.fr/hal-00204453

T. T. Cai and M. G. Low, Optimal adaptive estimation of a quadratic functional, Ann. Statist, vol.34, issue.5, pp.2298-2325, 2006.

A. Cohen, Wavelet methods in numerical analysis, Mathematics and its Applications, 2003.
DOI : 10.1016/S1570-8659(00)07004-6

O. Collier, Minimax hypothesis testing for curve registration, Electronic Journal of Statistics, vol.6, 2012.
DOI : 10.1214/12-EJS706

URL : https://hal.archives-ouvertes.fr/hal-00619808

L. Comminges and A. S. Dalalyan, Tight conditions for consistency of variable selection in the context of high dimensionality, The Annals of Statistics, vol.40, issue.5, 2012.
DOI : 10.1214/12-AOS1046

URL : https://hal.archives-ouvertes.fr/hal-00602211

L. Comminges, Conditions minimales de consistance pour la sélection de variables en grande dimension, Comptes Rendus Mathematique, vol.349, issue.7-8, pp.469-472, 2011.
DOI : 10.1016/j.crma.2011.02.014

L. Comminges and A. S. Dalalyan, Tight conditions for consistent variable selection in high dimensional nonparametric regression, J.Mach. Learn. Res. -Proceedings Track, vol.19, pp.187-206, 2011.
URL : https://hal.archives-ouvertes.fr/inria-00566721

A. S. Dalalyan and O. Collier, Wilks' phenomenon and penalized likelihood-ratio test for nonparametric curve registration, J. Mach. Learn. Res. -Proceedings Track, vol.22, pp.264-272, 2012.
URL : https://hal.archives-ouvertes.fr/hal-00705796

A. Dalalyan and M. Reiß, Asymptotic statistical equivalence for scalar ergodic diffusions. Probab. Theory Related Fields, pp.248-282, 2006.
URL : https://hal.archives-ouvertes.fr/hal-00016596

A. S. Dalalyan, A. Juditsky, and V. Spokoiny, A new algorithm for estimating the effective dimension-reduction subspace, J. Mach. Learn. Res, vol.9, pp.1648-1678, 2008.

J. Dieudonné, Calcul infinitésimal, 1968.

D. Donoho and J. Jin, Feature selection by higher criticism thresholding achieves the optimal phase diagram, Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences, vol.367, issue.1906, pp.4449-4470, 2009.

B. Efron, T. Hastie, I. Johnstone, and R. Tibshirani, Least angle regression, The Annals of Statistics, vol.32, issue.2, pp.407-499, 2004.

M. Ermakov, Nonparametric signal detection with small type I and type II error probabilities, Statistical Inference for Stochastic Processes, vol.14, issue.1, pp.1-19, 2011.
DOI : 10.1007/s11203-010-9048-5

M. S. Ermakov, Minimax detection of a signal in Gaussian white noise, Teor. Veroyatnost . i Primenen, vol.35, issue.4, pp.704-715, 1990.

J. Fan, Comments on "Wavelets in statistics: A review" by A. Antoniadis, Journal of the Italian Statistical Society, vol.6, issue.2, pp.131-138, 1997.
DOI : 10.1007/BF03178906

J. Fan and Y. Fan, High-dimensional classification using features annealed independence rules, The Annals of Statistics, vol.36, issue.6, pp.2605-2637, 2008.
DOI : 10.1214/07-AOS504

J. Fan and J. Lv, Sure independence screening for ultrahigh dimensional feature space, Journal of the Royal Statistical Society: Series B (Statistical Methodology), vol.70, issue.5, pp.849-911, 2008.
DOI : 10.1111/j.1467-9868.2008.00674.x

J. Fan and R. Song, Sure independence screening in generalized linear models with NP-dimensionality, The Annals of Statistics, vol.38, issue.6, pp.3567-3604, 2010.

J. Fan, Y. Feng, and R. Song, Nonparametric Independence Screening in Sparse Ultra-High-Dimensional Additive Models, Journal of the American Statistical Association, vol.106, issue.494, pp.544-557, 2011.
DOI : 10.1198/jasa.2011.tm09779

J. Fan and R. Li, Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties, Journal of the American Statistical Association, vol.96, issue.456, pp.1348-1360, 2001.
DOI : 10.1198/016214501753382273

J. Fan and J. Lv, Nonconcave Penalized Likelihood With NP-Dimensionality, IEEE Transactions on Information Theory, vol.57, issue.8, pp.5467-5484, 2011.
DOI : 10.1109/TIT.2011.2158486

J. Fan, R. Samworth, and Y. Wu, Ultrahigh dimensional feature selection : beyond the linear model, J. Mach. Learn. Res, vol.10, pp.2013-2038, 2009.

S. Gaïffas and G. Lecué, Optimal rates and adaptation in the single-index model using aggregation, Electronic Journal of Statistics, vol.1, pp.538-573, 2007.
DOI : 10.1214/07-EJS077

G. Gayraud and C. Pouet, Adaptive minimax testing in the discrete regression scheme. Probab. Theory Related Fields, pp.531-558, 2005.

G. Gayraud and Ch. Pouet, Minimax testing composite null hypotheses in the discrete regression scheme, Math. Methods Statist. (Meeting on Mathematical Statistics), vol.10, issue.4, pp.375-394, 2001.

G. Gayraud and Y. Ingster, Detection of sparse variable functions, 2011.

G. K. Golubev, M. Nussbaum, and H. H. Zhou, Asymptotic equivalence of spectral density estimation and Gaussian white noise, Ann. Statist, vol.38, issue.1, pp.181-214, 2010.

P. Hall, Central limit theorem for integrated square error of multivariate nonparametric density estimators, Journal of Multivariate Analysis, vol.14, issue.1, pp.1-16, 1984.
DOI : 10.1016/0047-259X(84)90044-7

M. Hebiri, Sparse conformal predictors, Statistics and Computing, vol.20, issue.2, pp.253-266, 2010.
DOI : 10.1007/s11222-009-9167-2

URL : https://hal.archives-ouvertes.fr/hal-00360771

J. L. Horowitz and V. G. Spokoiny, An Adaptive, Rate-Optimal Test of a Parametric Mean-Regression Model Against a Nonparametric Alternative, Econometrica, vol.69, issue.3, pp.599-631, 2001.
DOI : 10.1111/1468-0262.00207

J. Huang and T. Zhang, The benefit of group sparsity, The Annals of Statistics, vol.38, issue.4, pp.1978-2004, 2010.
DOI : 10.1214/09-AOS778

Y. I. Ingster, Asymptotically minimax hypothesis testing for nonparametric alternatives, I. Math. Methods Statist, vol.2, issue.2, pp.85-114, 1993.

Y. I. Ingster, Asymptotically minimax hypothesis testing for nonparametric alternatives, II. Math. Methods Statist, vol.2, issue.3, pp.171-189, 1993.

Y. I. Ingster, Asymptotically minimax hypothesis testing for nonparametric alternatives, III. Math. Methods Statist, vol.2, issue.4, pp.249-268, 1993.

Y. I. Ingster and T. Sapatinas, Minimax goodness-of-fit testing in multivariate nonparametric regression, Mathematical Methods of Statistics, vol.18, issue.3, pp.241-269, 2009.
DOI : 10.3103/S1066530709030041

Y. I. Ingster and I. A. Suslina, Nonparametric goodness-of-fit testing under Gaussian models, Lecture Notes in Statistics, vol.169, 2003.
DOI : 10.1007/978-0-387-21580-8

Y. I. Ingster, T. Sapatinas, and I. A. Suslina, Minimax signal detection in ill-posed inverse problems, The Annals of Statistics, vol.40, issue.3, 2012.
DOI : 10.1214/12-AOS1011

Y. Ingster and N. Stepanova, Estimation and detection of functions from anisotropic Sobolev classes, Electronic Journal of Statistics, vol.5, pp.484-506, 2011.
DOI : 10.1214/11-EJS615

Y. Ingster and I. Suslina, Estimation and hypothesis testing for functions from tensor products of spaces, Zap. Nauchn. Sem. S.-Peterburg. Otdel. Mat. Inst. Steklov. (POMI), vol.351, issue.12, pp.180-218, 2007.

R. Jenatton, J. Audibert, and F. Bach, Structured variable selection with sparsity-inducing norms, J.Mach. Learn. Res, vol.12, pp.2777-2824, 2011.
URL : https://hal.archives-ouvertes.fr/inria-00377732

H. Kneser, Sur un théorème fondamental de la théorie des jeux, C. R. Acad. Sci. Paris, vol.234, pp.2418-2420, 1952.
DOI : 10.1515/9783110894516.484

V. Koltchinskii and M. Yuan, Sparsity in multiple kernel learning, The Annals of Statistics, vol.38, issue.6, pp.3660-3695, 2010.
DOI : 10.1214/10-AOS825

V. I. Kolyada, On an embedding of Sobolev spaces, Mathematical Notes, vol.54, issue.3, pp.48-71, 1993.
DOI : 10.1007/BF01209556

J. Lafferty and L. Wasserman, Rodeo: Sparse, greedy nonparametric regression, The Annals of Statistics, vol.36, issue.1, pp.28-63, 2008.
DOI : 10.1214/009053607000000811

B. Laurent, J. M. Loubes, and C. Marteau, Testing inverse problems: A direct or an indirect problem?, Journal of Statistical Planning and Inference, vol.141, issue.5, pp.1849-1861, 2011.
DOI : 10.1016/j.jspi.2010.11.035

URL : https://hal.archives-ouvertes.fr/hal-00528909

B. Laurent, J. M. Loubes, and C. Marteau, Non-asymptotic minimax rates of testing in signal detection with heterogeneous variances, Electronic Journal of Statistics, vol.6, pp.91-122, 2012.
DOI : 10.1214/12-EJS667

URL : https://hal.archives-ouvertes.fr/hal-00440825

B. Laurent and P. Massart, Adaptive estimation of a quadratic functional by model selection, The Annals of Statistics, vol.28, issue.5, pp.1302-1338, 2000.

I. E. Frank and J. H. Friedman, A Statistical View of Some Chemometrics Regression Tools, Technometrics, vol.35, issue.2, pp.109-135, 1993.
DOI : 10.1080/00401706.1993.10485033

K. Lounici, M. Pontil, A. B. Tsybakov, and S. Van-de-geer, Oracle inequalities and optimal inference under group sparsity, The Annals of Statistics, vol.39, issue.4, pp.2164-2204, 2011.
DOI : 10.1214/11-AOS896

URL : https://hal.archives-ouvertes.fr/hal-00501509

J. Lv and Y. Fan, A unified approach to model selection and sparse recovery using regularized least squares, The Annals of Statistics, vol.37, issue.6A, pp.3498-3528, 2009.

S. G. Mallat, A wavelet tour of signal processing, Academic Press, 1999.

C. L. Mallows, Some comments on Cp, Technometrics, vol.15, pp.661-675, 1973.

J. Mazo and A. Odlyzko, Lattice points in high-dimensional spheres, Monatshefte für Mathematik, vol.110, issue.1, pp.47-61, 1990.
DOI : 10.1007/BF01571276

N. Meinshausen and P. Bühlmann, High-dimensional graphs and variable selection with the lasso, The Annals of Statistics, vol.34, issue.3, pp.1436-1462, 2006.

N. Meinshausen and P. Bühlmann, Stability selection, Journal of the Royal Statistical Society: Series B (Statistical Methodology), vol.72, issue.4, pp.417-473, 2010.
DOI : 10.1111/j.1467-9868.2010.00740.x

M. Nussbaum, Asymptotic equivalence of density estimation and Gaussian white noise, The Annals of Statistics, vol.24, issue.6, pp.2399-2430, 1996.
DOI : 10.1214/aos/1032181160

G. Obozinski, M. J. Wainwright, and M. I. Jordan, Support union recovery in high-dimensional multivariate regression, The Annals of Statistics, vol.39, issue.1, pp.1-47, 2011.
DOI : 10.1214/09-AOS776

C. Pouet, An asymptotically optimal test for a parametric set of regression functions against a non-parametric alternative, Journal of Statistical Planning and Inference, vol.98, issue.1-2, pp.177-189, 2001.
DOI : 10.1016/S0378-3758(00)00300-1

G. Raskutti, M. J. Wainwright, and B. Yu, Minimax-optimal rates for sparse additive models over kernel classes via convex programming, 2011.

P. Ravikumar, M. J. Wainwright, and J. D. Lafferty, High-dimensional Ising model selection using ℓ1-regularized logistic regression, The Annals of Statistics, vol.38, issue.3, pp.1287-1319, 2010.
DOI : 10.1214/09-AOS691

M. Reiß, Asymptotic equivalence for nonparametric regression with multivariate and random design, The Annals of Statistics, vol.36, issue.4, pp.1957-1982, 2008.
DOI : 10.1214/07-AOS525

A. Samarov, V. Spokoiny, and C. Vial, Component Identification and Estimation in Nonlinear High-Dimensional Regression Models by Structural Adaptation, Journal of the American Statistical Association, vol.100, issue.470, pp.429-445, 2005.
DOI : 10.1198/016214504000001529

URL : https://hal.archives-ouvertes.fr/hal-00377568

G. Schwarz, Estimating the Dimension of a Model, The Annals of Statistics, vol.6, issue.2, pp.461-464, 1978.
DOI : 10.1214/aos/1176344136

J. G. Scott and J. O. Berger, Bayes and empirical-Bayes multiplicity adjustment in the variable-selection problem, Ann. Statist, vol.38, issue.5, pp.2587-2619, 2010.

J. Shao, An asymptotic theory for linear model selection, Statistica Sinica, vol.7, pp.221-242, 1997.

V. G. Spokoiny, Adaptive hypothesis testing using wavelets, The Annals of Statistics, vol.24, issue.6, pp.2477-2498, 1996.
DOI : 10.1214/aos/1032181163

R. Tibshirani, Regression shrinkage and selection via the lasso, J. Roy. Statist. Soc. Ser. B, vol.58, issue.1, pp.267-288, 1996.

A. B. Tsybakov, Introduction to nonparametric estimation, 2009.
DOI : 10.1007/b13794

R. Vershynin, Introduction to the non-asymptotic analysis of random matrices, 2012.
DOI : 10.1017/CBO9780511794308.006

N. Verzelen, Minimax risks for sparse regressions: Ultra-high dimensional phenomenons, Electronic Journal of Statistics, vol.6, pp.38-90, 2012.
DOI : 10.1214/12-EJS666

URL : https://hal.archives-ouvertes.fr/hal-00508339

M. J. Wainwright, Information-Theoretic Limits on Sparsity Recovery in the High-Dimensional and Noisy Setting, IEEE Transactions on Information Theory, vol.55, issue.12, pp.5728-5741, 2009.
DOI : 10.1109/TIT.2009.2032816

M. Wainwright, Sharp Thresholds for High-Dimensional and Noisy Sparsity Recovery Using ℓ1-Constrained Quadratic Programming (Lasso), IEEE Transactions on Information Theory, vol.55, issue.5, pp.2183-2202, 2009.
DOI : 10.1109/TIT.2009.2016018

H. Wang, R. Li, and C. L. Tsai, Tuning parameter selectors for the smoothly clipped absolute deviation method, Biometrika, vol.94, issue.3, pp.553-568, 2007.
DOI : 10.1093/biomet/asm053

L. Wasserman and K. Roeder, High-dimensional variable selection, The Annals of Statistics, vol.37, issue.5A, pp.2178-2201, 2009.
DOI : 10.1214/08-AOS646

T. T. Wu and K. Lange, Coordinate descent algorithms for lasso penalized regression, The Annals of Applied Statistics, vol.2, issue.1, pp.224-244, 2008.

Y. Yang, Can the strengths of AIC and BIC be shared? A conflict between model identification and regression estimation, Biometrika, vol.92, issue.4, pp.937-950, 2005.
DOI : 10.1093/biomet/92.4.937

M. Yuan and Y. Lin, Model selection and estimation in regression with grouped variables, Journal of the Royal Statistical Society: Series B (Statistical Methodology), vol.68, issue.1, pp.49-67, 2006.
DOI : 10.1111/j.1467-9868.2005.00532.x

C. Zhang, Nearly unbiased variable selection under minimax concave penalty, The Annals of Statistics, vol.38, issue.2, pp.894-942, 2010.
DOI : 10.1214/09-AOS729

T. Zhang, On the consistency of feature selection using greedy least squares regression, Journal of Machine Learning Research, vol.10, pp.555-568, 2009.

P. Zhao and B. Yu, On model selection consistency of Lasso, J. Mach. Learn. Res, vol.7, pp.2541-2563, 2006.

P. Zhao, G. Rocha, and B. Yu, The composite absolute penalties family for grouped and hierarchical variable selection, The Annals of Statistics, vol.37, issue.6A, pp.3468-3497, 2009.
DOI : 10.1214/07-AOS584

H. Zou, The Adaptive Lasso and Its Oracle Properties, Journal of the American Statistical Association, vol.101, issue.476, pp.1418-1429, 2006.
DOI : 10.1198/016214506000000735

H. Zou and R. Li, One-step sparse estimates in nonconcave penalized likelihood models, The Annals of Statistics, vol.36, issue.4, pp.1509-1533, 2008.
DOI : 10.1214/009053607000000802

H. Zou, T. Hastie, and R. Tibshirani, On the "degrees of freedom" of the lasso, The Annals of Statistics, vol.35, issue.5, pp.2173-2192, 2007.
DOI : 10.1214/009053607000000127