R. G. Andrzejak, K. Lehnertz, F. Mormann, C. Rieke, P. David et al., Indications of nonlinear deterministic and finite-dimensional structures in time series of brain electrical activity: Dependence on recording region and brain state, Physical Review E, vol.64, issue.6, p.061907, 2001.

J. Baik, G. Ben Arous, and S. Péché, Phase transition of the largest eigenvalue for nonnull complex sample covariance matrices, The Annals of Probability, vol.33, issue.5, pp.1643-1697, 2005.

X. Cheng and A. Singer, The spectrum of random inner-product kernel matrices, Random Matrices: Theory and Applications, vol.2, p.1350010, 2013.

R. Couillet and F. Benaych-Georges, Kernel spectral clustering of large dimensional data, Electronic Journal of Statistics, vol.10, issue.1, pp.1393-1454, 2016.
URL : https://hal.archives-ouvertes.fr/hal-01215343

N. El Karoui, The spectrum of kernel random matrices, The Annals of Statistics, vol.38, issue.1, pp.1-50, 2010.

R. A. Horn and C. R. Johnson, Matrix Analysis, Cambridge University Press, 2012.

G. Huang, H. Zhou, X. Ding, and R. Zhang, Extreme learning machine for regression and multiclass classification, IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics), vol.42, issue.2, pp.513-529, 2012.

N. Keriven, A. Bourrier, R. Gribonval, and P. Pérez, Sketching for large-scale learning of mixture models, IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), pp.6190-6194, 2016.
URL : https://hal.archives-ouvertes.fr/hal-01208027

A. Krizhevsky, I. Sutskever, and G. E. Hinton, ImageNet classification with deep convolutional neural networks, Advances in Neural Information Processing Systems, pp.1097-1105, 2012.

Y. LeCun, C. Cortes, and C. J. Burges, The MNIST database of handwritten digits, 1998.

T. P. Lillicrap, D. Cownden, D. B. Tweed, and C. J. Akerman, Random synaptic feedback weights support error backpropagation for deep learning, Nature Communications, vol.7, 2016.

C. Louart, Z. Liao, and R. Couillet, A random matrix approach to neural networks, The Annals of Applied Probability, vol.28, issue.2, pp.1190-1248, 2018.
URL : https://hal.archives-ouvertes.fr/hal-01957656

A. L. Maas, A. Y. Hannun, and A. Y. Ng, Rectifier nonlinearities improve neural network acoustic models, Proc. ICML, vol.30, 2013.

V. A. Marčenko and L. A. Pastur, Distribution of eigenvalues for some sets of random matrices, Mathematics of the USSR-Sbornik, vol.1, issue.4, p.457, 1967.

A. Y. Ng, M. I. Jordan, and Y. Weiss, On spectral clustering: Analysis and an algorithm, Advances in Neural Information Processing Systems, pp.849-856, 2002.

J. Pennington and P. Worah, Nonlinear random matrix theory for deep learning, Advances in Neural Information Processing Systems, pp.2634-2643, 2017.

A. Rahimi and B. Recht, Random features for large-scale kernel machines, Advances in Neural Information Processing Systems, pp.1177-1184, 2008.

S. Scardapane and D. Wang, Randomness in neural networks: an overview, Wiley Interdisciplinary Reviews: Data Mining and Knowledge Discovery, vol.7, 2017.

J. Schmidhuber, Deep learning in neural networks: An overview, Neural Networks, vol.61, pp.85-117, 2015.

B. Schölkopf and A. J. Smola, Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond, MIT Press, 2002.

J. W. Silverstein and Z. Bai, On the empirical distribution of eigenvalues of a class of large dimensional random matrices, Journal of Multivariate Analysis, vol.54, issue.2, pp.175-192, 1995.

T. Ali, H. Couillet, and R. , Spectral community detection in heterogeneous large networks, 2016.
URL : https://hal.archives-ouvertes.fr/hal-01957623

A. Vedaldi and A. Zisserman, Efficient additive kernels via explicit feature maps, IEEE Transactions on Pattern Analysis and Machine Intelligence, vol.34, pp.480-492, 2012.

C. K. Williams, Computing with infinite networks, Advances in Neural Information Processing Systems, pp.295-301, 1997.