Thursday, May 25, 2017

A Semismooth Newton Method for Fast, Generic Convex Programming - implementation -

Eric just mentioned it on his Twitter feed:



Due to their generality, conic optimization problems, which can represent most convex optimization problems encountered in practice, have been the focus of much recent work, and additionally form the basis of many convex optimization modeling frameworks. In this paper, we introduce Newton-ADMM, a method for fast conic optimization. The basic idea is to view the residuals of consecutive iterates generated by SCS, a state-of-the-art, iterative conic solver, as a fixed point iteration, and then use a nonsmooth Newton method to find a fixed point. We demonstrate theoretically, by extending the theory of semismooth operators, that Newton-ADMM converges rapidly (i.e., quadratically) to a solution; empirically, Newton-ADMM is significantly faster than SCS, on a number of problems. The method also has essentially no tuning parameters, generates certificates of primal or dual infeasibility, when appropriate, and can be specialized to solve specific convex problems.
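To make the core idea concrete, here is a minimal illustrative sketch (not the authors' Newton-ADMM code, which operates on SCS iterates for general cones): treat the residual of a first-order fixed-point iteration as a nonsmooth equation F(x) = x - T(x) = 0 and solve it with a semismooth Newton method, safeguarded by the plain first-order step. For simplicity, T here is the ISTA map for a small lasso problem rather than an ADMM/SCS iteration; all names and parameters below are illustrative.

```python
# Illustrative sketch: semismooth Newton on the fixed-point residual of a
# first-order method. T is the ISTA map for the lasso problem
#   min_x 0.5*||A x - b||^2 + lam*||x||_1,
# whose fixed points are exactly the lasso solutions.
import numpy as np

def soft(u, thresh):
    """Soft-thresholding (prox of the l1 norm); piecewise linear, nonsmooth."""
    return np.sign(u) * np.maximum(np.abs(u) - thresh, 0.0)

def semismooth_newton_lasso(A, b, lam, x0=None, tol=1e-10, max_iter=50):
    m, n = A.shape
    t = 1.0 / np.linalg.norm(A, 2) ** 2          # ISTA step size 1/||A||^2
    x = np.zeros(n) if x0 is None else x0.copy()
    I = np.eye(n)

    def residual(z):
        # F(z) = z - T(z), the fixed-point residual of the ISTA map.
        return z - soft(z - t * A.T @ (A @ z - b), t * lam)

    for _ in range(max_iter):
        u = x - t * A.T @ (A @ x - b)            # gradient step
        Tx = soft(u, t * lam)                    # prox step: Tx = T(x)
        F = x - Tx                               # fixed-point residual
        if np.linalg.norm(F) < tol:
            break
        # One element of the Clarke generalized Jacobian of soft(.) at u is
        # a diagonal 0/1 mask over the non-thresholded coordinates.
        D = np.diag((np.abs(u) > t * lam).astype(float))
        J = I - D @ (I - t * A.T @ A)            # generalized Jacobian of F
        x_newton = x + np.linalg.solve(J + 1e-12 * I, -F)
        # Safeguard: accept the Newton step only if it reduces the residual;
        # otherwise fall back to the plain first-order (ISTA) step.
        if np.linalg.norm(residual(x_newton)) < np.linalg.norm(F):
            x = x_newton
        else:
            x = Tx
    return x
```

Because F is piecewise affine here, the Newton step is exact once the correct active set is identified, so the method typically reaches machine precision in a handful of iterations where plain ISTA would need hundreds; the same contrast (quadratic local convergence of the Newton iteration versus the slow tail of the first-order method) is what the paper exploits at the level of SCS iterates.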

The Github repository is here: https://github.com/locuslab/newton_admm

Join the CompressiveSensing subreddit or the Google+ Community or the Facebook page and post there!
Liked this entry ? subscribe to Nuit Blanche's feed, there's more where that came from. You can also subscribe to Nuit Blanche by Email, explore the Big Picture in Compressive Sensing or the Matrix Factorization Jungle and join the conversations on compressive sensing, advanced matrix factorization and calibration issues on Linkedin.
