Wednesday, June 18, 2014

The SwAMP Thing! Sparse Estimation with the Swept Approximated Message-Passing Algorithm - implementation -




Florent just let me know of the following paper and attendant implementation:


Dear Igor

As you know, the AMP/GAMP approach is a very good one for sparse estimation, but it often suffers from convergence problems.

With Lenka, Eric and Andre, we have been working on a modified update scheme for AMP/GAMP where we update the estimates "sequentially" rather than "in parallel". The resulting algorithm, though it differs from AMP in just a few re-ordering details, has very good convergence performance. In fact, it seems to converge for all the cases we have tried.


More information can be found on our arxiv preprint here: http://arxiv.org/abs/1406.4311 and our Matlab implementation is on github: https://github.com/eric-tramel/SwAMP-Demo

Comments welcome

Regards

Florent
-----------------------------------------------
Florent KRZAKALA http://krzakala.org 
Laboratoire de Physique Statistique
Ecole Normale Supérieure, Paris


Approximate Message Passing (AMP) has been shown to be a superior method for inference problems, such as the recovery of signals from sets of noisy, lower-dimensionality measurements, both in terms of reconstruction accuracy and in computational efficiency. However, AMP suffers from serious convergence issues in contexts that do not exactly match its assumptions. We propose a new approach to stabilizing AMP in these contexts by applying AMP updates to individual coefficients rather than in parallel. Our results show that this change to the AMP iteration can provide theoretically expected, but hitherto unobtainable, performance for problems on which the standard AMP iteration diverges. Additionally, we find that the computational cost of this swept coefficient update scheme is not unduly burdensome, allowing it to be applied efficiently to signals of large dimensionality.
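To make the parallel-versus-sequential distinction concrete, here is a minimal sketch of a "swept" coefficient update for sparse estimation. It is not the authors' SwAMP implementation (which updates the AMP messages themselves); it illustrates the same re-ordering idea using plain coordinate descent with soft-thresholding on a LASSO objective, where each coefficient is refreshed one at a time so later coordinates see the already-updated values of earlier ones. The matrix sizes, regularization level, and sweep count are arbitrary choices for the demo.

```python
import numpy as np

def soft_threshold(x, t):
    """Soft-thresholding operator, the proximal map of the l1 norm."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def sweep_update(A, y, lam=0.1, n_sweeps=50, seed=None):
    """Coordinate-wise ("swept") sparse estimation sketch.

    Instead of updating all coefficients in parallel, each sweep
    updates x[i] one coordinate at a time, so the residual always
    reflects the freshest estimates -- the re-ordering idea behind
    the swept scheme, shown here on the LASSO objective
    (1/2)||y - Ax||^2 + lam * ||x||_1 rather than on AMP messages.
    """
    m, n = A.shape
    x = np.zeros(n)
    r = y - A @ x                      # current residual
    col_sq = (A ** 2).sum(axis=0)      # per-column squared norms
    order_rng = np.random.default_rng(seed)
    for _ in range(n_sweeps):
        for i in order_rng.permutation(n):   # random sweep order
            r += A[:, i] * x[i]              # remove coordinate i
            xi = soft_threshold(A[:, i] @ r, lam) / col_sq[i]
            r -= A[:, i] * xi                # reinsert updated value
            x[i] = xi
    return x

# Tiny noiseless demo on a hypothetical random sensing matrix.
rng = np.random.default_rng(0)
m, n = 40, 20
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[[2, 7, 11]] = [1.0, -2.0, 1.5]
y = A @ x_true
x_hat = sweep_update(A, y, lam=0.01, n_sweeps=200, seed=1)
```

Because the residual `r` is kept in sync with every single-coefficient change, a full sweep costs the same order of work as one parallel update, which is why the sequential scheme need not be much more expensive.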

