Katya Scheinberg joined the School of Operations Research and Information Engineering faculty in July 2019, after serving as the Harvey E. Wagner Endowed Chair Professor in the Department of Industrial and Systems Engineering at Lehigh University, where she was also co-director of the Institute for Data, Intelligent Systems, and Computation.
Professor Scheinberg was born in Moscow, Russia. She earned her undergraduate degree in operations research from Lomonosov Moscow State University in 1992 and her Ph.D. in operations research from Columbia University in 1997. She then spent more than a decade as a research staff member at the IBM T.J. Watson Research Center, where she worked on a range of applied and theoretical problems in optimization.
Professor Scheinberg's research centers on developing practical algorithms, and their theoretical analysis, for problems in continuous optimization, including convex optimization, derivative-free optimization, machine learning, and quadratic programming. In 2009 she published the book Introduction to Derivative Free Optimization, co-authored with Andrew R. Conn and Luis N. Vicente. Her recent research focuses on the analysis of probabilistic methods and stochastic optimization, with a variety of applications in machine learning and reinforcement learning.
Professor Scheinberg has taught courses on linear and nonlinear optimization and on optimization models and applications. In 2009 she developed one of the first graduate-level courses on optimization methods for machine learning, which she teaches and updates regularly as the field continues to develop.
Selected Publications

- "Novel and Efficient Approximations for Zero-One Loss of Linear Classifiers", with Hiva Ghanbari and Minhan Li, 2019.
- "A Stochastic Line Search Method with Convergence Rate Analysis", with Courtney Paquette, 2018.
- "Convergence Rate Analysis of a Stochastic Trust Region Method via Submartingales", with Jose Blanchet, Coralia Cartis, and Matt Menickelly, INFORMS Journal on Optimization, 2019.
- "Global convergence rate analysis of unconstrained optimization methods based on probabilistic models", with C. Cartis, Mathematical Programming, 2018.
- "SARAH: A Novel Method for Machine Learning Problems Using Stochastic Recursive Gradient", with L. Nguyen, Jie Liu, and M. Takac, ICML 2017.
- "Practical Inexact Proximal Quasi-Newton Method with Global Complexity Analysis", with Xiaocheng Tang, Mathematical Programming, 2016, 160(1-2), pp. 495–529.
- "Least-squares approach to risk parity in portfolio selection", with X. Bai and R. Tutuncu, Quantitative Finance, 2016, 16(3), pp. 357–376.
- "Introduction to Derivative Free Optimization", with A. R. Conn and L. N. Vicente, SIAM Series on Mathematical Programming, 2009.
Selected Awards and Honors
- Lagrange Prize in Continuous Optimization, the MOS-SIAM prize for the best publication in continuous optimization over the preceding six years.
- IBM Research Division Award for contributions to COIN-OR, 2007.
Education

Ph.D. (Operations Research), Columbia University, 1997
M.S. (Operations Research), Columbia University, 1994
B.S./M.S. (Computational Mathematics and Cybernetics), Moscow State University, 1992