Stochastic Oracles and Where to Find Them
June 07, 2024, 10:50 AM - 11:35 AM
Location:
DIMACS Center
Rutgers University
CoRE Building
96 Frelinghuysen Road
Piscataway, NJ 08854
Katya Scheinberg, Cornell University
The majority of continuous optimization methods developed in the last decade, especially for ML training, assume that approximate first-order information is available to the method in some form. The assumptions on the quality and reliability of this information vary substantially from method to method. We will survey different ways of obtaining this information, including simple stochastic gradients via sampling, robust gradient estimation in adversarial settings, traditional and randomized finite-difference methods, and more. We will also consider second-order and other related oracles. Finally, we will attempt to propose a somewhat unified definition of stochastic oracles under which the approaches in the literature can be compared.
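To make the notion of a stochastic first-order oracle concrete, here is a minimal sketch (not from the talk; the function names and the least-squares objective are illustrative assumptions) of two of the estimators mentioned above: a mini-batch sampling oracle and a randomized forward finite-difference oracle.

```python
import numpy as np

def minibatch_gradient_oracle(grad_i, n, x, batch_size, rng):
    """Stochastic gradient via sampling: average the per-example
    gradients grad_i(i, x) over a random mini-batch, giving an
    unbiased estimate of the full gradient (1/n) * sum_i grad_i(i, x)."""
    idx = rng.choice(n, size=batch_size, replace=False)
    return np.mean([grad_i(i, x) for i in idx], axis=0)

def randomized_fd_oracle(f, x, h, rng):
    """Randomized finite differences (zeroth-order oracle): estimate
    the directional derivative along a random Gaussian direction u via
    a forward difference, and return (f(x + h*u) - f(x)) / h * u,
    whose expectation over u is approximately the gradient of f at x."""
    u = rng.standard_normal(x.shape)
    return (f(x + h * u) - f(x)) / h * u
```

Both routines fit one oracle interface — given a point `x`, return a random vector whose expectation (approximately) equals the true gradient — which is the kind of common abstraction the abstract alludes to.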