Research

My recent research has two main thrusts: graph algorithms and algorithms under uncertainty. A few representative recent projects in these areas are listed below. For a more comprehensive listing, please see my publications.

Graph Algorithms

In graph algorithms, my research focuses on the study of cuts, flows, and connectivity in graphs. Fundamentally, graphs are mathematical abstractions of connected entities such as individuals, computers, and autonomous systems. Therefore, a central goal in graph algorithms research is to understand the robustness of these connections and their tolerance to failures. In this context, my work has touched on a variety of fundamental questions, including minimum cuts, cut sparsification, network reliability, and maximum flows.

While graph connectivity has been a focal point of algorithms research for many decades, our understanding of these problems varies substantially depending on the following:

  • Directed vs Undirected graphs: For many connectivity problems, most of the progress has been for undirected graphs, and our understanding of the more general directed graphs lags significantly behind.

  • Vertex vs Edge connectivity: In most cases, we have much better algorithms for edge connectivity than for vertex connectivity, although the latter is just as important for studying the inherent robustness of network connections.

  • Graphs vs Hypergraphs: Hypergraphs are a generalization of graphs that capture richer connectivity structures, and are useful in many applications. Nevertheless, our understanding of connectivity problems in hypergraphs is generally poor compared to that for graphs.

Much of my recent work in graph algorithms has focused on trying to bridge these gaps, or at least push the boundaries in the less-understood parts of the graph connectivity landscape.

In terms of tools, my main recent focus has been on preconditioning. This is a general idea where we seek to transform or view arbitrary instances of a problem in a way that allows us to use intuition from very structured instances. In graph algorithms, the structured instances are often random graphs, and the goal of preconditioning is to view arbitrary graphs in a way that lets us apply properties of random graphs to them. A typical example is expander decomposition, where an arbitrary graph is decomposed into expanders (which essentially look like random graphs for the purpose of cut algorithms), while the inter-cluster graph connecting these expanders is very sparse and easy to handle algorithmically.
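
To make the expander decomposition idea concrete, here is a deliberately naive sketch in Python. It brute-forces over vertex subsets to find a low-conductance cut and recurses, so it is only meant to illustrate the definitions on tiny graphs; the function names, the threshold phi, and the use of networkx are illustrative choices here, not the near-linear-time machinery developed in the papers below.

    # Toy expander decomposition: repeatedly split along any cut of
    # conductance below phi; pieces with no such cut are phi-expanders,
    # and the removed crossing edges form the sparse inter-cluster part.
    # Brute force (exponential time), purely for illustration.
    from itertools import combinations
    import networkx as nx

    def conductance(G, S):
        """Edges leaving S divided by min(vol(S), vol(V - S))."""
        S = set(S)
        cut = sum(1 for u, v in G.edges() if (u in S) != (v in S))
        vol_S = sum(d for v, d in G.degree() if v in S)
        vol_T = sum(d for v, d in G.degree() if v not in S)
        denom = min(vol_S, vol_T)
        return 0.0 if denom == 0 else cut / denom

    def expander_decomposition(G, phi):
        """Return (clusters, inter_cluster_edges)."""
        nodes = list(G.nodes())
        if len(nodes) <= 1:
            return [set(nodes)], []
        for r in range(1, len(nodes)):
            for S in combinations(nodes, r):
                if conductance(G, S) < phi:
                    A, B = set(S), set(nodes) - set(S)
                    crossing = [(u, v) for u, v in G.edges() if (u in A) != (v in A)]
                    clA, eA = expander_decomposition(G.subgraph(A).copy(), phi)
                    clB, eB = expander_decomposition(G.subgraph(B).copy(), phi)
                    return clA + clB, crossing + eA + eB
        return [set(nodes)], []  # no sparse cut found: G is a phi-expander

For example, on two dense clusters joined by a single edge (nx.barbell_graph(5, 0) with phi = 0.3), this sketch returns the two cliques as clusters and the single bridge as the only inter-cluster edge.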

Here are some of my recent papers in this line of research (the complete list is here):

  • Amir Abboud, Robert Krauthgamer, Jason Li, Debmalya Panigrahi, Thatchaphol Saranurak, Ohad Trabelsi.
    Breaking the Cubic Barrier for All-Pairs Max-Flow: Gomory-Hu Tree in Nearly Quadratic Time.
    FOCS 2022.

  • Ruoxu Cen, Jason Li, Debmalya Panigrahi.
    Edge-Connectivity Augmentation in Near-Linear Time. [pdf]
    STOC 2022.

  • Ruoxu Cen, Jason Li, Debmalya Panigrahi.
    Augmenting Edge Connectivity via Isolating Cuts. [pdf]
    SODA 2022.

  • Jason Li, Debmalya Panigrahi, and Thatchaphol Saranurak.
    A Nearly Optimal All-Pairs Minimum Cuts Algorithm in Simple Graphs. [pdf]
    FOCS 2021.

  • Ruoxu Cen, Jason Li, Danupon Nanongkai, Debmalya Panigrahi, Kent Quanrud, and Thatchaphol Saranurak.
    Minimum Cuts in Directed Graphs via Partial Sparsification. [pdf]
    FOCS 2021.

  • Ruoxu Cen, Yu Cheng, Debmalya Panigrahi, and Kevin Sun.
    Sparsification of Directed Graphs via Cut Balance. [pdf]
    ICALP 2021.

  • Jason Li and Debmalya Panigrahi.
    Approximate Gomory-Hu Tree is Faster than n-1 Max-flows. [pdf]
    STOC 2021.

  • Jason Li, Danupon Nanongkai, Debmalya Panigrahi, Thatchaphol Saranurak, and Sorrachai Yingchareonthawornchai.
    Vertex Connectivity in Poly-logarithmic Max-flows. [pdf]
    STOC 2021.

  • Jason Li, Debmalya Panigrahi.
    Deterministic Min-cut in Poly-logarithmic Max-flows. [pdf]
    FOCS 2020.

  • Kyle Fox, Debmalya Panigrahi, Fred Zhang.
    Minimum Cut and Minimum k-Cut in Hypergraphs via Branching Contractions. [pdf]
    SODA 2019.

Algorithms under Uncertainty

While the traditional model of algorithm design assumes that the entire input is known to the algorithm upfront, this is often not the case in practice, where algorithms must solve optimization problems in dynamic environments. The classic framework for modeling this uncertainty is that of online algorithms, where one assumes no knowledge of the future and optimizes for a worst-case input. Other popular frameworks include stochastic optimization, robust algorithms, and dynamic algorithms.
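
As a concrete illustration of worst-case competitive analysis, consider the textbook rent-or-buy (ski rental) problem: renting costs 1 per day, buying costs b, and the length of the season is unknown in advance. The break-even rule sketched below is 2-competitive against any season length; this is a standard classroom example (the function names are just for this sketch), not the contribution of any particular paper listed later.

    # Classic rent-or-buy: rent for b - 1 days, then buy on day b.
    # Whatever the (adversarial) season length, the cost of this rule is
    # at most twice the offline optimum min(season_length, b).
    def break_even_rule(b, season_length):
        cost = 0
        for day in range(1, season_length + 1):
            if day < b:
                cost += 1      # keep renting while total rent stays below b
            else:
                cost += b      # buy once the rent paid would reach b
                break
        return cost

    def offline_optimum(b, season_length):
        return min(season_length, b)   # with hindsight: rent throughout, or buy on day 1

    for season in (3, 9, 10, 11, 100):
        assert break_even_rule(10, season) <= 2 * offline_optimum(10, season)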

Algorithms and Machine Learning

I am particularly interested in the role of machine learning in mitigating the effects of uncertainty on algorithmic performance. In recent years, tremendous advances in ML have given us the ability to predict an uncertain future. But these predictions come with caveats: they are almost never precisely correct, and can occasionally be entirely wrong. This motivates the following research themes:

  • Algorithms with ML predictions: Can we design algorithms that take advantage of ML predictions while remaining robust to the possibility that the predictions are incorrect? (A small sketch illustrating this trade-off appears after this list.)

  • ML predictions for Algorithms: Conversely, can we redesign learning algorithms to make good predictions for optimization problems, i.e., to make predictions that improve algorithmic performance rather than aim to minimize standard notions of statistical learning error?
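
As a small illustration of the first theme, the sketch below augments the rent-or-buy rule from earlier with a prediction y of the season length and a trust parameter lam. This is a standard scheme from the learning-augmented algorithms literature, shown here only to illustrate the consistency-robustness trade-off, not as the method of the papers listed in this section.

    # Learning-augmented rent-or-buy: y predicts the season length,
    # lam in (0, 1] controls how much we hedge against a wrong prediction.
    # With lam = 1 this is exactly the break-even rule (prediction ignored);
    # as lam -> 0 it follows the prediction aggressively, paying close to
    # the optimum when the prediction is accurate.
    import math

    def rent_or_buy_with_prediction(b, season_length, y, lam):
        buy_day = math.ceil(lam * b) if y >= b else math.ceil(b / lam)
        cost = 0
        for day in range(1, season_length + 1):
            if day < buy_day:
                cost += 1      # rent
            else:
                cost += b      # buy
                break
        return cost

Accurate predictions with small lam yield a cost close to the offline optimum (consistency), while even adversarially wrong predictions keep the cost within roughly a (1 + 1/lam) factor of optimum (robustness).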

Much of my recent work in algorithms under uncertainty has focused on trying to address these questions. For example, see these recent papers (the complete list is here):

  • Anupam Gupta, Debmalya Panigrahi, Bernardo Subercaseaux, Kevin Sun.
    Augmenting Online Algorithms with ε-Accurate Predictions.
    NeurIPS 2022.

  • MohammadTaghi Hajiaghayi, MohammadReza Khani, Debmalya Panigrahi, Max Springer.
    Online Algorithms for the Santa Claus Problem.
    NeurIPS 2022.

  • Keerti Anand, Rong Ge, Amit Kumar, Debmalya Panigrahi.
    Online Algorithms with Multiple Predictions.
    ICML 2022.

  • Yossi Azar, Debmalya Panigrahi, Noam Touitou.
    Online Graph Algorithms with Predictions. [pdf]
    SODA 2022.

  • Keerti Anand, Rong Ge, Amit Kumar, Debmalya Panigrahi.
    A Regression Approach to Learning-Augmented Online Algorithms. [pdf]
    NeurIPS 2021.

  • Keerti Anand, Rong Ge, Debmalya Panigrahi.
    Customizing ML Predictions for Online Algorithms. [pdf]
    ICML 2020.

  • Zhihao Jiang, Debmalya Panigrahi, Kevin Sun.
    Online Algorithms for Weighted Paging with Predictions. [pdf]
    ICALP 2020.

  • Sreenivas Gollapudi, Debmalya Panigrahi.
    Online Algorithms for Rent-Or-Buy with Expert Advice. [pdf]
    ICML 2019.

Online and Robust Algorithms

I am also broadly interested in designing algorithms that are robust to uncertainty in the input, particularly online algorithms (where the input requirements arrive over time), robust algorithms (where the input is provided upfront but is imprecise or inaccurate), and dynamic algorithms (where the input requirements can change arbitrarily over time). I have explored this area along the following research themes:

  • Online Algorithms with Delays and Deadlines: Traditionally, online algorithms must satisfy each requirement immediately on arrival. In practice, however, many online and dynamic systems only need to satisfy each requirement within a stipulated time in the future, or are subject to a penalty that grows the longer a request is left unserved. Can we design good online algorithms in this setting? (A toy illustration of the delay model appears after this list.)

  • Online and Dynamic Algorithms with Recourse: Online algorithms often suffer from pessimistic lower bounds because all algorithmic decisions are irrevocable. If, instead, one allows a bounded recourse budget for reversing previous decisions, can we approach offline performance guarantees? This is closely related to dynamic algorithms, where the recourse budget is measured by the time spent updating the solution in response to changes in the input.

  • Robust Algorithms for Input Ranges: Can we design algorithms for problems where the input parameters are not precise but specified as a range of possible values? For instance, in routing on a network, the actual delay on an edge may not be known in advance, but a range of possible delays can be derived from historical performance.
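
As a toy illustration of the delay model from the first theme above, the sketch below uses a simple balancing rule in discrete time: pending requests each accrue one unit of delay cost per step, and a batched service of cost C is triggered once the accrued delay matches C. Balancing delay against service cost is a classical device in the literature on online problems with delay; this code is a simplification for illustration, not the algorithm from any paper below.

    # Requests arrive over time and may wait; each pending request pays one
    # unit of delay cost per time step, and one service action of cost C
    # clears everything pending. Serve once accrued delay reaches C, so the
    # delay cost and the service cost stay balanced.
    def serve_with_delay(arrival_times, C, horizon):
        pending, accrued, total_cost, services = [], 0.0, 0.0, []
        for t in range(horizon + 1):
            pending += [t] * arrival_times.count(t)   # requests arriving at time t
            if pending and accrued >= C:              # delay has "paid for" a service
                services.append(t)
                total_cost += C
                pending, accrued = [], 0.0
            accrued += len(pending)                   # everyone pending waits one more step
            total_cost += len(pending)
        return services, total_cost

For instance, serve_with_delay([0, 1, 2, 7], C=3, horizon=10) batches the first three requests into one service at time 2, and serves the request arriving at time 7 only at time 10, once its accumulated waiting cost has reached C.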

Some of my recent work along these lines appears below (the complete list is here):

  • Anupam Gupta, Amit Kumar, Debmalya Panigrahi.
    A Hitting Set Relaxation for k-Server and an Extension to Time Windows. [pdf]
    FOCS 2021.

  • Arun Ganesh, Bruce Maggs, and Debmalya Panigrahi.
    Universal Algorithms for Clustering Problems. [pdf]
    ICALP 2021.

  • Arun Ganesh, Bruce Maggs, Debmalya Panigrahi.
    Robust Algorithms for TSP and Steiner Tree. [pdf]
    ICALP 2020.

  • Anupam Gupta, Amit Kumar, Debmalya Panigrahi.
    Caching with Time Windows. [pdf]
    STOC 2020.

  • Amir Abboud, Raghavendra Addanki, Fabrizio Grandoni, Debmalya Panigrahi, Barna Saha.
    Dynamic Set Cover: Improved Algorithms and Lower Bounds. [pdf]
    STOC 2019.

  • Anupam Gupta, Ravishankar Krishnaswamy, Amit Kumar, Debmalya Panigrahi.
    Elastic Caching. [pdf]
    SODA 2019.

  • Yossi Azar, Arun Ganesh, Rong Ge, Debmalya Panigrahi.
    Online Service with Delay. [pdf]
    STOC 2017.

  • Anupam Gupta, Ravishankar Krishnaswamy, Amit Kumar, Debmalya Panigrahi.
    Online and Dynamic Algorithms for Set Cover. [pdf]
    STOC 2017.