Paper
Fast Frank--Wolfe Algorithms with Adaptive Bregman Step-Size for Weakly Convex Functions
Published Apr 6, 2025 · Shota Takahashi, S. Pokutta, Akiko Takeda
Citations: 1 · Influential Citations: 0
Abstract
We propose a Frank--Wolfe (FW) algorithm with an adaptive Bregman step-size strategy for smooth adaptable (also called relatively smooth) (weakly) convex functions. This means that the gradient of the objective function need not be Lipschitz continuous; we only require the smooth adaptable property, so our assumptions are less restrictive than those of existing FW algorithms. We establish convergence guarantees in various settings, with rates ranging from sublinear to linear depending on the assumptions on the convex or nonconvex objective function. Assuming that the objective function is weakly convex and satisfies a local quadratic growth condition, we prove both local sublinear and local linear convergence with respect to the primal gap. We also propose a variant of the away-step FW algorithm using Bregman distances over polytopes. For this variant, we establish globally faster (up to linear) convergence for convex optimization under the H\"{o}lder error bound condition, and local linear convergence for nonconvex optimization under the local quadratic growth condition. Numerical experiments demonstrate that our proposed FW algorithms outperform existing methods.
Our adaptive Bregman step-size strategy for weakly convex functions outperforms existing methods, offering faster convergence rates under less restrictive assumptions.
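To make the setting concrete, the following is a minimal sketch of a generic Frank--Wolfe iteration over the probability simplex with the classical adaptive "short step" rule gamma = min(1, gap / (L * ||d||^2)). This is an illustration of the FW template only, not the paper's Bregman step-size: the smoothness constant `L`, the quadratic objective, and the simplex feasible set are all assumptions chosen for this sketch.

```python
import numpy as np

def frank_wolfe_simplex(grad, x0, L, max_iter=200, tol=1e-8):
    """Generic Frank-Wolfe over the probability simplex.

    Uses the classical adaptive short step gamma = min(1, gap / (L * ||d||^2)),
    not the paper's adaptive Bregman step-size; L is an assumed (Euclidean)
    smoothness constant of the objective.
    """
    x = x0.copy()
    gap = np.inf
    for _ in range(max_iter):
        g = grad(x)
        # Linear minimization oracle over the simplex: the minimizing vertex e_i
        i = int(np.argmin(g))
        v = np.zeros_like(x)
        v[i] = 1.0
        d = v - x
        gap = -g @ d  # Frank-Wolfe (primal) gap; certifies suboptimality
        if gap < tol:
            break
        gamma = min(1.0, gap / (L * (d @ d) + 1e-18))
        x = x + gamma * d
    return x, gap

# Illustrative objective f(x) = 0.5 * ||x - c||^2 with c in the simplex,
# so the constrained minimizer is c itself (L = 1 for this quadratic).
c = np.array([0.2, 0.5, 0.3])
x_star, gap = frank_wolfe_simplex(lambda x: x - c,
                                  np.array([1.0, 0.0, 0.0]), L=1.0)
```

The iterate stays feasible by construction (a convex combination of simplex vertices), and the FW gap provides a computable stopping criterion; the paper's contribution replaces the Euclidean quantity `L * ||d||^2` in the step-size with a Bregman-distance analogue suited to smooth adaptable functions.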