Cautious regret minimization: Online optimization with long-term budget constraints

Liakopoulos, Nikolaos; Destounis, Apostolos; Paschos, Georgios; Spyropoulos, Thrasyvoulos; Mertikopoulos, Panayotis
ICML 2019, 36th International Conference on Machine Learning, 9-15 June 2019, Long Beach, California, USA

We study a class of online convex optimization problems with long-term budget constraints that arise naturally as reliability guarantees or total consumption constraints. In this general setting, prior work by Mannor et al. (2009) has shown that achieving no regret is impossible if the functions defining the agent's budget are chosen by an adversary. To overcome this obstacle, we refine the agent's regret metric by introducing the notion of a "K-benchmark", i.e., a comparator which meets the problem's allotted budget over any window of length K. The impossibility analysis of Mannor et al. (2009) is recovered when K = T; however, for K = o(T), we show that it is possible to minimize regret while still meeting the problem's long-term budget constraints. We achieve this via an online learning algorithm based on cautious online Lagrangian descent (COLD) for which we derive explicit bounds, in terms of both the incurred regret and the residual budget violations.
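To illustrate the primal-dual mechanism behind online Lagrangian descent, below is a minimal sketch of a generic primal-dual online descent scheme for constrained online convex optimization. It is not the paper's COLD algorithm (which adds the "cautious" budget handling and K-benchmark analysis); the loss/budget functions, step sizes, and domain here are illustrative assumptions.

```python
import numpy as np

def online_primal_dual(T=1000, eta=0.05, mu=0.05, seed=0):
    """Generic primal-dual online descent on the interval [0, 1].

    At each round t the environment reveals a linear loss
    f_t(x) = c_t * x and a budget function g_t(x) = x - b_t; the
    long-term constraint asks that the average of g_t(x_t) stay
    non-positive. (Illustrative setup, not the paper's exact model.)
    """
    rng = np.random.default_rng(seed)
    x, lam = 0.5, 0.0            # primal iterate and dual (Lagrange) multiplier
    losses, violations = [], []
    for _ in range(T):
        c = rng.uniform(-1.0, 1.0)   # per-round loss slope (random stand-in)
        b = rng.uniform(0.2, 0.8)    # per-round budget level
        losses.append(c * x)
        g = x - b                    # budget violation incurred this round
        violations.append(g)
        # Primal step: descend the Lagrangian f_t(x) + lam * g_t(x),
        # then project back onto the feasible interval [0, 1].
        x = float(np.clip(x - eta * (c + lam), 0.0, 1.0))
        # Dual step: raise the multiplier when the budget is exceeded,
        # keeping it non-negative.
        lam = max(0.0, lam + mu * g)
    return float(np.mean(losses)), float(np.mean(violations))

avg_loss, avg_violation = online_primal_dual()
```

The multiplier `lam` acts as a self-tuning penalty: sustained budget violations inflate it, which in turn biases the primal step toward feasible plays, trading off instantaneous loss against residual violation.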

Communication Systems Department
© 2019 IEEE. Personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating new collective works for resale or redistribution to servers or lists, or to reuse any copyrighted component of this work in other works must be obtained from the IEEE.