Several algorithmic results in the area can, at a high level, be interpreted as instances of a meta-algorithm that "combines" a number of different algorithms at runtime. The individual algorithms usually trust the predictions to different degrees, ranging from blind trust to completely ignoring them. How these algorithms are combined generally depends on the particular problem at hand and on how well each individual algorithm has performed on the input encountered so far. In this talk, we illustrate this approach on Metrical Task Systems (MTS), a rich class containing several fundamental problems in online optimization as special cases, including caching, k-server, convex body chasing, and convex function chasing. We furthermore give an overview of several variations of this approach from the literature and highlight the differences between them.
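To give a flavor of the combining idea described above, here is a minimal sketch of one classical way to do it: a multiplicative-weights combiner that, in each round, randomly follows one of the candidate algorithms with probability proportional to a weight that decays exponentially in that algorithm's cumulative cost. This is an illustrative example only, not the specific MTS construction from the talk; the function name `combine_algorithms` and the learning-rate parameter `eta` are our own choices.

```python
import math
import random

def combine_algorithms(costs_per_round, eta=0.5):
    """Multiplicative-weights combination of several candidate algorithms.

    costs_per_round: a list of rounds, where each round is a list giving
    the per-round cost (scaled to [0, 1]) incurred by each candidate.
    Returns the sequence of candidate indices the combiner follows;
    candidates with low cumulative cost are chosen with higher probability.
    """
    n = len(costs_per_round[0])
    weights = [1.0] * n  # start by trusting all candidates equally
    choices = []
    for round_costs in costs_per_round:
        # Sample a candidate with probability proportional to its weight.
        r = random.random() * sum(weights)
        acc = 0.0
        for i, w in enumerate(weights):
            acc += w
            if r <= acc:
                choices.append(i)
                break
        # Downweight each candidate exponentially in the cost it just paid.
        for i, c in enumerate(round_costs):
            weights[i] *= math.exp(-eta * c)
    return choices
```

Note that for MTS specifically, a combiner must also pay the movement cost of switching between the states occupied by different candidate algorithms; the sketch above ignores switching costs entirely, which is one of the points where the variations discussed in the talk differ.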
Bio: Antonios Antoniadis is an assistant professor in the Discrete Mathematics and Mathematical Programming (DMMP) group at the University of Twente in the Netherlands. He received his Ph.D. in Computer Science from the Humboldt University of Berlin, Germany, in 2012. After his Ph.D. and before joining the University of Twente in April 2021, he held postdoc positions at the University of Pittsburgh, the Max Planck Institute for Informatics, and Bonn University, as well as an interim professorship at the University of Cologne.
Antonios' research interests lie in designing online and approximation algorithms, with a particular focus on problems related to energy efficiency, scheduling theory, and, more recently, learning-augmented algorithms.
The Data Science Deep Dive is free and open to the public. If you are interested in giving a talk, please email Mohammad Hajiesmaili or Adam Lechowicz. Note that, in addition to being a public lecture series, the Data Science Deep Dive is also a seminar (CompSci 692K, Algorithms with Predictions Seminar) that can be taken for credit.