TY - CHAP
A1 - Diakonikolas, Jelena
A1 - Carderera, Alejandro
A1 - Pokutta, Sebastian
T1 - Locally Accelerated Conditional Gradients
T2 - Proceedings of AISTATS
Y1 - 2020
N1 - URL of the Code: https://colab.research.google.com/drive/1ejjfCan7xnEhWWJXCIzb03CwQRG9iW_O
N1 - URL of the PDF: http://proceedings.mlr.press/v108/diakonikolas20a/diakonikolas20a.pdf
N1 - URL of the Slides: https://app.box.com/s/gphkhapso7d1vrfnzqykkb3vx0agxh8w
N1 - URL of the Abstract: http://www.pokutta.com/blog/research/2019/07/04/LaCG-abstract.html
N1 - URL of the Video: https://slideslive.com/38930107/locally-accelerated-conditional-gradients?ref=account-folder-52123-folders
ER -

TY - JOUR
A1 - Carderera, Alejandro
A1 - Pokutta, Sebastian
T1 - Second-order Conditional Gradient Sliding
N2 - Constrained second-order convex optimization algorithms are the method of choice when a high-accuracy solution to a problem is needed, due to their local quadratic convergence. These algorithms require the solution of a constrained quadratic subproblem at every iteration. We present the \emph{Second-Order Conditional Gradient Sliding} (SOCGS) algorithm, which uses a projection-free algorithm to solve the constrained quadratic subproblems inexactly. When the feasible region is a polytope, the algorithm converges quadratically in primal gap after a finite number of linearly convergent iterations. Once in the quadratic regime, the SOCGS algorithm requires O(log(log(1/ε))) first-order and Hessian oracle calls and O(log(1/ε) log(log(1/ε))) linear minimization oracle calls to achieve an ε-optimal solution. This algorithm is useful when the feasible region can only be accessed efficiently through a linear optimization oracle, and computing first-order information of the function, although possible, is costly.
Y1 - 2020
ER -
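
The SOCGS abstract above describes solving each Newton-type quadratic subproblem with a projection-free method, so that the feasible region is accessed only through a linear minimization oracle (LMO). The Python sketch below is a minimal illustration of that idea, not the authors' implementation: it uses vanilla Frank-Wolfe with the standard 2/(t+2) step size and a fixed iteration budget as a stand-in for the paper's inexactness criterion, and the names frank_wolfe_quadratic and simplex_lmo are hypothetical.

import numpy as np

def frank_wolfe_quadratic(g, H, x_k, lmo, steps=100):
    # Approximately minimize the quadratic model
    #   q(x) = g^T (x - x_k) + 0.5 (x - x_k)^T H (x - x_k)
    # over the feasible region, accessed only through `lmo`.
    x = x_k.copy()
    for t in range(steps):
        grad_model = g + H @ (x - x_k)     # gradient of q at x
        v = lmo(grad_model)                # one LMO call per iteration
        gamma = 2.0 / (t + 2.0)            # standard Frank-Wolfe step size
        x = (1.0 - gamma) * x + gamma * v  # convex combination stays feasible
    return x

def simplex_lmo(c):
    # LMO for the probability simplex: the minimizing vertex is the
    # standard basis vector at the smallest coordinate of c.
    v = np.zeros_like(c)
    v[np.argmin(c)] = 1.0
    return v

# Toy usage: one inexact Newton step for f(x) = 0.5 x^T H x - b^T x
# over the probability simplex.
n = 5
rng = np.random.default_rng(0)
A = rng.standard_normal((n, n))
H = A.T @ A + np.eye(n)       # positive definite Hessian of f
b = rng.standard_normal(n)
x_k = np.full(n, 1.0 / n)     # feasible starting point (simplex center)
g = H @ x_k - b               # gradient of f at x_k
x_next = frank_wolfe_quadratic(g, H, x_k, simplex_lmo)

Frank-Wolfe is a natural inner solver here because each of its iterations costs exactly one LMO call and its iterates remain feasible as convex combinations of vertices, which matches the abstract's setting where the feasible region is only efficiently accessible through linear optimization.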