Computer Science Thesis Oral

Wednesday, July 28, 2021 - 1:00pm to 3:00pm


Virtual Presentation (ET) - Remote Access via Zoom



Automated algorithm and mechanism configuration

Algorithms often have tunable parameters that control performance metrics such as solution quality and runtime. Typically, the optimal setting depends intimately on the application at hand. Hand-tuning parameters is often tedious, time-consuming, and error-prone, so a burgeoning line of research has studied automated algorithm configuration via machine learning, where a training set of typical problem instances is used to find high-performing parameter settings.
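The basic workflow described above can be sketched in a few lines: evaluate each candidate parameter setting on a training set of instances and keep the one with the best average performance. This is only an illustrative toy, not the thesis's method; the cost function and instance distribution below are invented for the example.

```python
def configure(cost, candidate_params, training_set):
    """Return the candidate parameter with the lowest average cost
    over the training instances (empirical risk minimization)."""
    best_param, best_avg = None, float("inf")
    for param in candidate_params:
        avg = sum(cost(inst, param) for inst in training_set) / len(training_set)
        if avg < best_avg:
            best_param, best_avg = param, avg
    return best_param

# Hypothetical example: instances are numbers, and an algorithm's "cost"
# on an instance is the squared distance between the parameter and the
# instance, so the best parameter tracks the instance distribution.
instances = [2.5, 3.0, 3.5, 2.8, 3.2]
cost = lambda inst, p: (p - inst) ** 2
best = configure(cost, [1, 2, 3, 4, 5], instances)
print(best)  # 3, the candidate closest to the instances' mean
```

In practice the candidate space is continuous or combinatorial and the cost function (e.g., tree-search runtime) is expensive and erratic, which is precisely what makes the statistical and algorithmic questions above nontrivial.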

In this thesis, we help develop the theory and practice of automated algorithm configuration. We investigate both statistical and algorithmic questions. For example, how large should the training set be to ensure that an algorithm's average performance over the training set is indicative of its future performance? How can we algorithmically find provably high-performing configurations? While answering these questions, we analyze parameterized algorithms from diverse domains, including:

  1. Integer programming. Highly parameterized tree search algorithms are the most widely used tools for solving integer programs. We provide experiments demonstrating that tuning parameters using our approach can have a significant impact on algorithmic performance. We also analyze widely applicable approximation algorithms for integer quadratic programming.
  2. Mechanism design. A mechanism is a special type of algorithm designed to help a set of agents come to a collective decision. For example, a mechanism might dictate how a set of items should be split among the agents. Mechanisms often have tunable parameters that impact, for example, their revenue. No one parameter setting is universally optimal. We analyze sales mechanisms, where the goal is to maximize revenue, and voting mechanisms, where the goal is to maximize social welfare.
  3. Computational biology. Algorithms from computational biology often come with many parameters. Understanding which parameter settings are optimal in which scenarios is an active area of research. We analyze parameterized algorithms for several fundamental problems, including sequence alignment and RNA folding.

The key challenge from both an algorithmic and statistical perspective is that an algorithm's performance is typically an erratic function of its parameters. This is because a small tweak to the parameters can cause a cascade of changes in the algorithm's performance. We develop tools for analyzing and optimizing these volatile performance functions.

Thesis Committee:
Maria-Florina Balcan (Co-chair)
Tuomas Sandholm (Co-chair)
Ameet Talwalkar
Eric Horvitz (Microsoft Research)
Kevin Leyton-Brown (University of British Columbia)

Zoom Participation. See announcement.
