Laurent just sent me the following the other day:
Following the high-quality applications (and practical results in only 3 months) after your post on:
let me share with you the follow-up internship (5 months):
Analysis and prediction of wavelet and filter-bank frames performance for machine learning
and the full PDF description
with a subsequent PhD proposal (Data characterization and classification with invariant multiscale features)
Objectives and context follow (more at the webpage)
We wish to study large datasets of experimental data (e.g. physico-chemical spectral signals, microscopy or geophysical subsurface images) toward clustering, classification and learning. When data satisfy regularity properties, they often admit sparse or compressible representations in a judicious transformed domain: a few transformed coefficients provide an accurate approximation of the data. Such representations, like multiscale or wavelet transforms, are beneficial to subsequent processing, and they form the core of novel data-processing methodologies, such as Scattering networks/transforms (SN) or Functional Data Analysis (FDA). Given the variety of such transforms, without prior knowledge it is not obvious how to find the most suitable representation for a given set of data. The aim of this subject is to investigate potential relations between transform properties and data compressibility on the one hand, and classification/clustering performance on the other hand, especially with respect to robustness to shifts/translations or noise in data features, which matters in experimental applications.

Building on recent work, the first objective is to develop a framework allowing the use of different sparsifying transformations (bases or frames of wavelets and multiscale transformations) at the input of reference SN algorithms. This will make it possible to evaluate the latter on a variety of experimental datasets, with the aim of choosing the most appropriate one, both in terms of performance and usability, since the redundancy of some transformations may hinder their application to large datasets. Particular interest could be placed on complex-like transformations, which may improve either the sparsification or the "invariance properties" of the transformed data. Their importance has been underlined recently for deep convolutional networks.
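To make the compressibility idea above concrete, here is a minimal toy sketch (my illustration, not code from the internship): a piecewise-constant signal becomes sparse under a plain Haar wavelet transform, so a handful of coefficients suffice to reconstruct it exactly.

```python
import math

def haar_forward(x):
    """Full Haar wavelet decomposition of a length-2^k signal.
    Returns [coarsest approximation] + details (coarse to fine)."""
    coeffs = []
    a = list(x)
    while len(a) > 1:
        avg = [(a[2 * i] + a[2 * i + 1]) / math.sqrt(2) for i in range(len(a) // 2)]
        det = [(a[2 * i] - a[2 * i + 1]) / math.sqrt(2) for i in range(len(a) // 2)]
        coeffs = det + coeffs  # prepend so coarser details come first
        a = avg
    return a + coeffs

def haar_inverse(c):
    """Invert haar_forward, level by level."""
    a = c[:1]
    rest = c[1:]
    while rest:
        det, rest = rest[:len(a)], rest[len(a):]
        nxt = []
        for av, d in zip(a, det):
            nxt.append((av + d) / math.sqrt(2))
            nxt.append((av - d) / math.sqrt(2))
        a = nxt
    return a

# A piecewise-constant signal: 8 samples, but only 2 nonzero Haar coefficients.
x = [3.0] * 4 + [7.0] * 4
c = haar_forward(x)
nonzero = sum(1 for v in c if abs(v) > 1e-9)
x_rec = haar_inverse(c)
print(nonzero)  # 2 of the 8 coefficients carry all the information
```

Real wavelet frames (and the redundant, complex-like transforms mentioned above) generalize this single orthonormal basis, but the principle is the same: regular data concentrate their energy in few coefficients.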
Then, starting from real data, the trainee will develop realistic models reproducing the expected behaviors in the data, for instance those related to shifts or noise. Finally, the relative clustering/classification performances will be assessed with respect to different transformation choices, and their impact on both realistic models and real data. Particular interest could be placed on either transform properties (redundancy, frame bounds, asymptotic properties) or the resulting multiscale statistics of the data.
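The robustness-to-shifts point has a simple toy counterpart (again my illustration, not the internship's methodology): taking the magnitude of a complex transform discards phase, so the DFT magnitudes of a signal are unchanged by a circular shift, even though the raw samples differ.

```python
import cmath

def dft_mag(x):
    """Magnitudes of the discrete Fourier transform (naive O(N^2) version)."""
    n = len(x)
    return [abs(sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                    for t in range(n)))
            for k in range(n)]

signal = [0.0, 1.0, 3.0, 1.0, 0.0, 0.0, 0.0, 0.0]
shifted = signal[2:] + signal[:2]  # circular shift by two samples

m1 = dft_mag(signal)
m2 = dft_mag(shifted)
# The magnitude spectra coincide up to floating-point error.
print(max(abs(a - b) for a, b in zip(m1, m2)))
```

Scattering networks push this further by cascading wavelet filters and modulus nonlinearities, yielding features that are stable to small deformations, not just exact shifts.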
Hoping you can still share this information, given your venture at http://www.lighton.io/
Sure, Laurent! The more qualified people around Paris, the better (a rising tide lifts all boats).
Join the CompressiveSensing subreddit or the Google+ Community or the Facebook page and post there!
Liked this entry? Subscribe to Nuit Blanche's feed, there's more where that came from. You can also subscribe to Nuit Blanche by Email, explore the Big Picture in Compressive Sensing or the Matrix Factorization Jungle and join the conversations on compressive sensing, advanced matrix factorization and calibration issues on LinkedIn.