The FAST Algorithm for Submodular Maximization

Jul 12, 2020



In this paper we describe a new parallel algorithm called Fast Adaptive Sequencing Technique (FAST) for maximizing a monotone submodular function under a cardinality constraint k. This algorithm achieves the optimal 1 − 1/e approximation guarantee and is orders of magnitude faster than the state-of-the-art on a variety of experiments over real-world data sets.

In the past two years, following the work by Balkanski and Singer, there has been a great deal of work on algorithms whose theoretical parallel runtime is exponentially faster than that of algorithms used for submodular maximization over the past 40 years. Although these algorithms are fast in terms of asymptotic worst-case guarantees, they are computationally infeasible to use in practice. The reason is that the number of rounds and queries they require depends on very large constants as well as high-degree polynomials in the precision and confidence parameters, making these algorithms impractical even on small data sets.

The design principles behind the FAST algorithm we present here are a significant departure from those of the theoretically fast algorithms studied in the past two years. Rather than optimize for theoretical guarantees alone, the design of FAST introduces several new techniques that achieve remarkable practical and theoretical parallel runtimes. More specifically, the approximation guarantee obtained by FAST is arbitrarily close to 1 − 1/e, its theoretical parallel runtime (adaptivity) is O(log(n) log^2(log k)), and its total number of queries is O(n log log(k)). By running experiments on large data sets, we show that FAST is orders of magnitude faster than any algorithm for submodular maximization we are aware of, including hyper-optimized parallel versions of state-of-the-art serial algorithms.
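To make the problem setting concrete, the sketch below shows the classic serial greedy algorithm of Nemhauser, Wolsey, and Fisher, the 40-year-old baseline the abstract alludes to. It achieves the same 1 − 1/e guarantee for monotone submodular maximization under a cardinality constraint k, but needs O(nk) queries and k strictly sequential rounds, which is exactly the bottleneck FAST is designed to avoid. This is an illustrative sketch, not the FAST algorithm itself; the coverage function and element names are invented for the example.

```python
def greedy(f, ground_set, k):
    """Serial greedy: select k elements maximizing a monotone submodular f.

    f maps a set of elements to a float. Each of the k rounds scans all
    remaining elements for the largest marginal gain f(S + e) - f(S),
    so the total query cost is O(n * k) and the rounds cannot be
    parallelized -- the motivation for low-adaptivity algorithms like FAST.
    """
    S = set()
    for _ in range(k):
        best = max(
            (e for e in ground_set if e not in S),
            key=lambda e: f(S | {e}) - f(S),
        )
        S.add(best)
    return S


# Example: a coverage function, a standard monotone submodular function.
# The dictionary of covered items below is made up for illustration.
sets = {
    "a": {1, 2, 3},
    "b": {3, 4},
    "c": {5},
    "d": {1, 5},
}

def coverage(S):
    """f(S) = number of distinct items covered by the sets chosen in S."""
    return len(set().union(*(sets[e] for e in S))) if S else 0

solution = greedy(coverage, sets.keys(), 2)
```

On this toy instance, greedy first picks "a" (marginal gain 3) and then one of the gain-1 elements, covering 4 of the 5 items with 2 sets.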



About ICML 2020

The International Conference on Machine Learning (ICML) is the premier gathering of professionals dedicated to the advancement of the branch of artificial intelligence known as machine learning. ICML is globally renowned for presenting and publishing cutting-edge research on all aspects of machine learning used in closely related areas like artificial intelligence, statistics and data science, as well as important application areas such as machine vision, computational biology, speech recognition, and robotics. ICML is one of the fastest growing artificial intelligence conferences in the world. Participants at ICML span a wide range of backgrounds, from academic and industrial researchers, to entrepreneurs and engineers, to graduate students and postdocs.


