Quadratic Memory is Necessary for Optimal Query Complexity in Convex Optimization: Center-of-Mass is Pareto-Optimal

Wednesday, April 5, 2023 - 4:00pm to 4:30pm

Event Calendar Category

LIDS & Stats Tea

Speaker Name

Moïse Blanchard

Building and Room Number

LIDS Lounge

We give query complexity lower bounds for convex optimization and the related feasibility problem. We show that quadratic memory is necessary to achieve the optimal oracle complexity for first-order convex optimization. In particular, this shows that center-of-mass cutting-plane algorithms in dimension d, which use O(d^2) memory and O(d) queries, are Pareto-optimal for both convex optimization and the feasibility problem, up to logarithmic factors. Precisely, we prove that to minimize 1-Lipschitz convex functions over the unit ball to accuracy 1/d^4, any deterministic first-order algorithm using at most d^{2−δ} bits of memory must make Ω(d^{1+δ/3}) queries, for any δ ∈ [0,1]. For the feasibility problem, in which the algorithm only has access to a separation oracle, we show a stronger trade-off: with at most d^{2−δ} bits of memory, the number of queries required is Ω(d^{1+δ}). This resolves an open problem posed by Woodworth and Srebro at COLT 2019.
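As a quick summary, the two trade-offs stated in the abstract can be written as follows (the symbols M, for memory in bits, and Q, for the number of oracle queries, are introduced here for convenience and are not part of the original statement):

\[
M \le d^{\,2-\delta} \;\Longrightarrow\;
\begin{cases}
Q = \Omega\big(d^{\,1+\delta/3}\big) & \text{convex optimization (first-order oracle)},\\[2pt]
Q = \Omega\big(d^{\,1+\delta}\big) & \text{feasibility (separation oracle)},
\end{cases}
\qquad \delta \in [0,1].
\]

At δ = 0 these bounds are matched, up to logarithmic factors, by center-of-mass cutting-plane methods, which use O(d^2) memory and O(d) queries, hence the Pareto-optimality claim.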

Moïse is a fourth-year PhD student at the ORC and LIDS, working with Prof. Patrick Jaillet. Prior to joining MIT, he received his bachelor's and master's degrees as valedictorian of École Polytechnique in France. His research focuses on learnability in machine learning and statistical learning. In particular, his work characterizing the minimal assumptions on data needed to achieve consistency in online supervised learning received the runner-up best student paper award at COLT 2022.