D. M. Allen, The relationship between variable selection and data augmentation and a method for prediction, Technometrics, vol.16, pp.125-127, 1974.

S. Arlot, V-fold cross-validation improved: V-fold penalization, 2008.
URL : https://hal.archives-ouvertes.fr/hal-00239182

S. Arlot and A. Celisse, A survey of cross-validation procedures for model selection, Statistics Surveys, vol.4, pp.40-79, 2010.
DOI : 10.1214/09-SS054

URL : https://hal.archives-ouvertes.fr/hal-00407906

P. L. Bartlett and S. Mendelson, Rademacher and Gaussian complexities: risk bounds and structural results, Journal of Machine Learning Research, vol.3, pp.463-482, 2003.

B. E. Boser, I. M. Guyon, and V. N. Vapnik, A training algorithm for optimal margin classifiers, Proceedings of the Fifth Annual Workshop on Computational Learning Theory, pp.144-152, 1992.

L. Breiman, J. Friedman, R. Olshen, and C. Stone, Classification and regression trees, 1984.

L. Breiman and P. Spector, Submodel Selection and Evaluation in Regression. The X-Random Case, International Statistical Review / Revue Internationale de Statistique, vol.60, issue.3, pp.291-319, 1992.
DOI : 10.2307/1403680

P. Burman, A comparative study of ordinary cross-validation, v-fold cross-validation and the repeated learning-testing methods, Biometrika, vol.76, issue.3, pp.503-514, 1989.
DOI : 10.1093/biomet/76.3.503

URL : https://hal.archives-ouvertes.fr/hal-00530378

W. R. Burrows, CART Regression Models for Predicting UV Radiation at the Ground in the Presence of Cloud and Other Environmental Factors, Journal of Applied Meteorology, vol.36, issue.5, pp.531-544, 1997.
DOI : 10.1175/1520-0450(1997)036<0531:CRMFPU>2.0.CO;2

M. Chang and C. Lin, Leave-One-Out Bounds for Support Vector Regression Model Selection, Neural Computation, vol.17, issue.5, pp.1188-1222, 2005.

URL : http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.305.487

V. Cherkassky and Y. Ma, Practical selection of SVM parameters and noise estimation for SVM regression, Neural Networks, vol.17, issue.1, pp.113-126, 2004.
DOI : 10.1016/S0893-6080(03)00169-2

P. A. Chou, T. Lookabaugh, and R. M. Gray, Optimal pruning with applications to tree-structured source coding and modeling, IEEE Transactions on Information Theory, vol.35, issue.2, pp.299-315, 1989.
DOI : 10.1109/18.32124

H. Drucker, C. J. C. Burges, L. Kaufman, A. Smola, and V. Vapnik, Support vector regression machines, Advances in Neural Information Processing Systems 9, pp.155-161, 1997.

B. Efron, Bootstrap Methods: Another Look at the Jackknife, The Annals of Statistics, vol.7, issue.1, pp.1-26, 1979.
DOI : 10.1214/aos/1176344552

S. Geisser, The Predictive Sample Reuse Method with Applications, Journal of the American Statistical Association, vol.70, issue.350, pp.320-328, 1975.

S. Gey and E. Nédélec, Model selection for CART regression trees, IEEE Transactions on Information Theory, vol.51, issue.2, pp.658-670, 2005.
DOI : 10.1109/TIT.2004.840903

URL : https://hal.archives-ouvertes.fr/hal-00326549

I. Guyon, B. Boser, and V. Vapnik, Automatic Capacity Tuning of Very Large VC-dimension Classifiers, Advances in Neural Information Processing Systems, pp.147-155, 1993.

T. Hastie, S. Rosset, R. Tibshirani, and J. Zhu, The entire regularization path for the support vector machine, Journal of Machine Learning Research, vol.5, pp.1391-1415, 2004.

T. Hastie, R. Tibshirani, and J. Friedman, The Elements of Statistical Learning: Data Mining, Inference, and Prediction, 2001.

W. He, Z. Wang, and H. Jiang, Model optimizing and feature selecting for support vector regression in time series forecasting, Neurocomputing, vol.72, issue.1-3, pp.600-611, 2008.
DOI : 10.1016/j.neucom.2007.11.010

V. Koltchinskii, Local Rademacher complexities and oracle inequalities in risk minimization, The Annals of Statistics, vol.34, issue.6, pp.2593-2656, 2006.
DOI : 10.1214/009053606000001019

J. T. Kwok, Linear Dependency between ε and the Input Noise in ε-Support Vector Regression, ICANN, pp.405-410, 2001.
DOI : 10.1007/3-540-44668-0_57

P. Liang and N. Srebro, On the interaction between norm and dimensionality: Multiple regimes in learning, International Conference on Machine Learning (ICML), 2010.

C. L. Mallows, Some comments on C_p, Technometrics, vol.15, pp.661-675, 1973.
DOI : 10.2307/1271437

M. Momma and K. P. Bennett, A Pattern Search Method for Model Selection of Support Vector Regression, Proceedings of the SIAM International Conference on Data Mining. SIAM, 2002.
DOI : 10.1137/1.9781611972726.16

R. Neal, Assessing Relevance Determination Methods Using DELVE, Neural Networks and Machine Learning, pp.97-129, 1998.

C. Ong, S. Shao, and J. Yang, An improved algorithm for the solution of the regularization path of support vector machine, IEEE Transactions on Neural Networks, vol.21, pp.451-462, 2010.

F. Pedregosa, G. Varoquaux, A. Gramfort, V. Michel, B. Thirion et al., Scikit-learn: Machine Learning in Python, Journal of Machine Learning Research, vol.12, pp.2825-2830, 2011.
URL : https://hal.archives-ouvertes.fr/hal-00650905

J. D. Rodriguez, A. Perez, and J. A. Lozano, Sensitivity analysis of k-fold cross-validation in prediction error estimation, IEEE Transactions on Pattern Analysis and Machine Intelligence, vol.32, issue.3, pp.569-575, 2010.

B. Schölkopf and A. J. Smola, Learning with Kernels, 2002.

J. Shao, Linear model selection by cross-validation, J. Amer. Statist. Assoc., vol.88, issue.422, pp.486-494, 1993.

A. Smola, N. Murata, B. Schölkopf, and K.-R. Müller, Asymptotically Optimal Choice of ε-Loss for Support Vector Machines, Proceedings of the 8th International Conference on Artificial Neural Networks, Perspectives in Neural Computing, 1998.

A. J. Smola and B. Schölkopf, A tutorial on support vector regression, Statistics and Computing, vol.14, issue.3, pp.199-222, 2004.
DOI : 10.1023/B:STCO.0000035301.49549.88

V. N. Vapnik and A. Ya. Chervonenkis, Ordered risk minimization, Automation and Remote Control, pp.1226-1235, 1974.

V. N. Vapnik, The nature of statistical learning theory, 1995.

K. D. Wernecke, K. Possinger, G. Kalb, and J. Stein, Validating Classification Trees, Biometrical Journal, vol.40, issue.8, pp.993-1005, 1998.
DOI : 10.1002/(SICI)1521-4036(199812)40:8<993::AID-BIMJ993>3.0.CO;2-T