
Chasing Convex Bodies and Functions with Black-Box Advice

Friday, 10/21/2022 1:00pm to 2:00pm
Lederle Graduate Research Center, Room A215; Zoom
Data Science Deep Dive

Abstract: Modern AI/ML algorithms have the potential to improve performance over traditional algorithms for online optimization, which are designed to give worst-case guarantees. However, while AI/ML algorithms might perform well when the training data accurately reflects the deployment environment, they lack worst-case performance guarantees, e.g., under distribution shift. This hinders their use in real-world settings where safety/performance guarantees are crucial. In this talk, I will discuss recent work designing algorithms for online optimization with "black-box" advice that bridge the gap between the good average-case performance of AI/ML and the worst-case guarantees of traditional algorithms. We focus on the problem of chasing convex bodies and functions, discussing several algorithms and fundamental limits on algorithm performance in this setting.
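To illustrate the kind of advice-augmented setup the abstract describes (this is a minimal sketch, not the speaker's algorithm): in 1-D convex function chasing, a decision-maker pays a hitting cost for each convex function plus a movement cost, and can hedge between a black-box advice sequence and a simple robust baseline. The quadratic costs, the greedy baseline, and the interpolation weight `lam` below are all illustrative assumptions.

```python
# Illustrative sketch of advice-augmented convex function chasing (1-D).
# Hitting cost f_t(x) = (x - c_t)^2, movement cost |x_t - x_{t-1}|.
# These cost choices, the greedy baseline, and lam are hypothetical.

def total_cost(xs, centers, x0=0.0):
    """Sum of hitting costs (x - c)^2 plus movement costs |x - prev|."""
    cost, prev = 0.0, x0
    for x, c in zip(xs, centers):
        cost += (x - c) ** 2 + abs(x - prev)
        prev = x
    return cost

def chase_with_advice(centers, advice, lam=0.5):
    """At each step, interpolate between a greedy robust baseline
    (jump to the current minimizer c_t) and the black-box advice
    point.  lam = 1 trusts the advice fully; lam = 0 ignores it."""
    xs = []
    for c, a in zip(centers, advice):
        robust = c                      # greedy: move to the minimizer
        xs.append(lam * a + (1 - lam) * robust)
    return xs
```

With good advice, larger `lam` tracks the advice's low cost; with adversarial advice, smaller `lam` falls back toward the robust baseline's worst-case behavior. The talk concerns algorithms that achieve both guarantees simultaneously, which this fixed-weight interpolation does not.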

Bio: Nico Christianson is a third-year PhD student in Computing and Mathematical Sciences at Caltech, where he is advised by Adam Wierman and Steven Low. Before Caltech, he received his AB in Applied Math from Harvard in 2020. He is broadly interested in online algorithms, optimization, and learning, with an emphasis on designing machine learning-augmented algorithms for complex sequential decision-making problems. Nico's work is supported by an NSF Graduate Research Fellowship.

Join the Seminar

The Data Science Deep Dive is free and open to the public. If you are interested in giving a talk, please email Mohammad Hajiesmaili or Adam Lechowicz. Note that in addition to being a public lecture series, the Data Science Deep Dive is also a seminar (CompSci 692K, Algorithms with Predictions Seminar) that can be taken for credit.