Thursday, January 26, 2012

Random Feedbacks

Phil Schniter sent me the following yesterday, after I linked to the recent arXiv preprint Approximate Message Passing under Finite Alphabet Constraints by Andreas Muller, Dino Sejdinovic and Robert Piechocki:

"..hi igor,
Regarding the paper "Approximate Message Passing under Finite Alphabet Constraints", it's a great idea to exploit such prior info when available. for fairness, your readers may be be interested to hear that finite-alphabet priors have been part of the GAMPmatlab package (http://gampmatlab.wikia.com/wiki/Generalized_Approximate_Message_Passing) since its inception. moreover, such priors have been used with GAMP to do joint decoding and channel estimation in our work http://www2.ece.ohio-state.edu/~schniter/pdf/jstsp11_ofdm.pdf.
cheers,
phil..."
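To see how a finite-alphabet prior enters this kind of algorithm, note that in AMP/GAMP the prior shows up through a scalar denoiser: the posterior mean of a symbol observed in effective Gaussian noise. Here is a minimal sketch of that denoiser, assuming a uniform prior over the alphabet (the function name and parameters are illustrative, not from GAMPmatlab):

```python
import numpy as np

def mmse_denoiser(r, sigma2, alphabet):
    """Posterior mean E[x | r] when x is uniform over a finite alphabet
    and observed as r = x + N(0, sigma2)."""
    r = np.atleast_1d(np.asarray(r, dtype=float))
    a = np.asarray(alphabet, dtype=float)
    # Gaussian log-likelihood of each alphabet symbol, for each entry of r
    logw = -(r[:, None] - a[None, :]) ** 2 / (2.0 * sigma2)
    logw -= logw.max(axis=1, keepdims=True)  # stabilize the exponentials
    w = np.exp(logw)
    w /= w.sum(axis=1, keepdims=True)        # posterior weights per symbol
    return w @ a                             # posterior mean per entry

# A pseudo-observation near +1 is pulled onto the +1 symbol of a BPSK alphabet
print(mmse_denoiser([0.8], sigma2=0.1, alphabet=[-1.0, 1.0]))
```

At low noise the estimate snaps to the nearest alphabet point, which is exactly the extra information a finite-alphabet prior buys over a generic sparsity prior.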
Thanks Phil. You all probably remember when Phil schooled me on alphabet issues (A Small Q&A with Phil Schniter on TurboGAMP).

I came across this very nice example of compressive sensing in the Python-based scikit-learn package: tomography reconstruction with an L1 prior (Lasso). Let us recall that the ASPICS toolbox is another solver written in Python.
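The idea behind that example can be sketched in a few lines: recover a sparse signal from underdetermined linear measurements by L1-regularized regression. This is a toy stand-in, not the scikit-learn tomography example itself; I use a random Gaussian matrix in place of the tomography projection operator, and the problem sizes and `alpha` are arbitrary:

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.RandomState(0)
n_measurements, n_features = 80, 200

# Sparse ground truth: 10 nonzero coefficients out of 200
x_true = np.zeros(n_features)
x_true[rng.choice(n_features, 10, replace=False)] = rng.randn(10)

# Random Gaussian sensing matrix (stand-in for the tomography projector)
A = rng.randn(n_measurements, n_features) / np.sqrt(n_measurements)
y = A @ x_true

# L1 prior: Lasso solves min ||y - A x||^2 / (2 n) + alpha * ||x||_1
lasso = Lasso(alpha=0.001, max_iter=10000)
lasso.fit(A, y)
x_hat = lasso.coef_

print("relative error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```

With far fewer measurements than unknowns (80 vs. 200), the L1 penalty is what makes the recovery of the sparse signal possible at all.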

Talking about Phil and ASPICS reminded me that I had drawn a graph of the recent events in compressive sensing in 2011; both Phil and the folks behind ASPICS (Florent Krzakala, Marc Mézard, François Sausset, Yifan Sun, Lenka Zdeborová) played no small part in these improvements. I don't know if it came out right, but here it is:


Anna Gilbert has new entries on her two recent lectures on compressive sensing.


Liked this entry ? subscribe to Nuit Blanche's feed, there's more where that came from. You can also subscribe to Nuit Blanche by Email, explore the Big Picture in Compressive Sensing or the Matrix Factorization Jungle and join the conversations on compressive sensing, advanced matrix factorization and calibration issues on Linkedin.
