DIMACS Workshop on Modeling Randomness in Neural Network Training: Mathematical, Statistical, and Numerical Guarantees

Keynote 2: Practice, Theory, and Theorems for Random Matrix Theory in Modern Machine Learning

June 06, 2024, 9:00 AM - 10:00 AM

Location:

DIMACS Center

Rutgers University

CoRE Building

96 Frelinghuysen Road

Piscataway, NJ 08854

Michael Mahoney, University of California, Berkeley

Random Matrix Theory (RMT) has been applied to a wide range of areas over the years, and in recent years machine learning (ML) has been added to that list. In many cases, this leads to new types of theory, either predictive theory or mathematical theorems. Many aspects of modern ML are quite different from more traditional applications of RMT, and this is leading to new uses of and perspectives on RMT. Here, we’ll describe this, including aspects of ML problem parameterization as well as empirical results on matrices arising in state-of-the-art ML models. Based on this, we’ll describe an RMT-based phenomenological theory that can be used, e.g., to predict trends in the quality of state-of-the-art neural networks without access to training or testing data. This is starting to lead to new RMT theorems of independent interest, some of which we will also describe.
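The abstract does not specify the diagnostic used, but one common RMT-based approach of the kind described is to examine the empirical spectral density (ESD) of a trained layer's weight correlation matrix and fit a tail exponent, which requires only the weights, not training or testing data. The sketch below is an illustrative assumption of this style of analysis (the function name `esd_alpha` and the Hill-estimator choice are ours, not taken from the talk):

```python
import numpy as np

def esd_alpha(W, k_frac=0.1):
    """Hill estimator of the ESD tail exponent of W^T W / n.

    Heavy-tailed ESDs (small alpha) have been proposed as a
    data-free indicator of layer quality; this is a hedged
    sketch of that idea, not the speaker's exact method.
    """
    n = W.shape[0]
    # Eigenvalues of the (symmetric, PSD) correlation matrix
    evals = np.sort(np.linalg.eigvalsh(W.T @ W / n))[::-1]
    k = max(2, int(k_frac * len(evals)))  # size of the tail sample
    top = evals[:k]
    # Hill estimator over the k largest eigenvalues
    return 1.0 + k / np.sum(np.log(top / top[-1]))

# An i.i.d. Gaussian layer: ESD follows a Marchenko-Pastur bulk
# with a light tail, so the fitted exponent is relatively large.
rng = np.random.default_rng(0)
W = rng.standard_normal((512, 256))
alpha = esd_alpha(W)
print(alpha)
```

For matrices from well-trained networks, such tail exponents are reported to be markedly smaller than for random initializations, which is what makes spectral statistics usable as a training-free quality trend predictor.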
