Topics in Optimization, Spring 2018

Course Outline

Time and location

Time: Tuesday, Friday 12:00 - 1:50pm

Location: DARRIN 236


Instructor

Yangyang Xu

Office: Amos Eaton 310

Office hours: Tuesday and Friday, 3:00pm - 4:00pm, or by appointment


Programming assignments

Reading materials

  • A fast iterative shrinkage-thresholding algorithm for linear inverse problems, Beck and Teboulle, 2009.

  • Gradient methods for minimizing composite functions, Nesterov, 2012.

  • Convergence rates of inexact proximal-gradient methods for convex optimization, Schmidt, Le Roux, and Bach, 2011.

  • Proximal Newton-type methods for convex optimization, Lee, Sun, and Saunders, 2012.

  • Robust stochastic approximation approach to stochastic programming, Nemirovski, Juditsky, Lan, and Shapiro, 2009.

  • Stochastic first and zeroth-order methods for nonconvex stochastic programming, Ghadimi and Lan, 2013.

  • Accelerating stochastic gradient descent using predictive variance reduction, Johnson and Zhang, 2013.

  • SAGA: A fast incremental gradient method with support for non-strongly convex composite objectives, Defazio, Bach, and Lacoste-Julien, 2014.

  • Minimizing finite sums with the stochastic average gradient, Schmidt, Le Roux, and Bach, 2017.
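Several of the readings above build on the iterative shrinkage-thresholding idea from the first paper (Beck and Teboulle). As a rough illustration of that method, the sketch below applies FISTA to a lasso problem, minimizing (1/2)||Ax - b||^2 + lam*||x||_1; the function names, step count, and problem setup are illustrative choices, not taken from any of the listed papers.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t*||.||_1: shrink each entry toward zero by t.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def fista(A, b, lam, num_iters=200):
    # Accelerated proximal-gradient (FISTA) sketch for the lasso problem.
    L = np.linalg.norm(A, 2) ** 2       # Lipschitz constant of the smooth gradient
    x = np.zeros(A.shape[1])
    y = x.copy()                        # extrapolated point
    t = 1.0                             # momentum parameter
    for _ in range(num_iters):
        grad = A.T @ (A @ y - b)        # gradient of the smooth part at y
        x_new = soft_threshold(y - grad / L, lam / L)   # proximal-gradient step
        t_new = (1 + np.sqrt(1 + 4 * t * t)) / 2        # momentum update
        y = x_new + ((t - 1) / t_new) * (x_new - x)     # extrapolation
        x, t = x_new, t_new
    return x
```

For instance, with A the identity the lasso solution is exactly the soft-thresholded data, so `fista(np.eye(3), np.array([2.0, -0.5, 1.0]), 1.0)` converges to `[1.0, 0.0, 0.0]`.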