Monday, March 27, 2017

CSjobs: Postdocs, Signal and Image Processing Institute, University of Southern California

Justin just sent me the following:

Hi Igor,
We're currently looking for postdocs with a strong background in computational imaging/compressive sensing to work on a major new funded project. Would you be willing to advertise this on your blog? A description of the position can be found below.
Thanks, and much appreciated!
--
Justin Haldar
Assistant Professor
Signal and Image Processing Institute
Departments of Electrical Engineering
and Biomedical Engineering
University of Southern California

Sure Justin !

A press release for the project can be found at: ISI Selected to Participate in New IARPA (RAVEN) Program

The job description is given below:

Post-Doctoral Research Associate
Signal and Image Processing Institute
University of Southern California
Several Postdoctoral Research Associate positions are available immediately for an exciting new government funded project with the goal of 3D coherent x-ray imaging of silicon integrated circuits at <10 nm resolution. The position will focus on algorithm development and software implementations for image reconstruction from sparsely-sampled data, as well as computational modeling, simulation, system evaluation, and analysis of experimental data. Successful candidates will work as part of an interdisciplinary multi-institution research team including USC, Northwestern University, Argonne National Labs, and the Paul Scherrer Institute, and will contribute to the preparation of research articles.
Required Qualifications: PhD in Electrical Engineering, Statistics, Computer Science, or Physics. Programming experience, preferably including Matlab, Python, and C++. Experience and publications in at least one of the following areas: computational imaging, 3D tomographic image reconstruction, inverse problems, low-dimensional signal representations (sparsity, low-rank, etc.), numerical optimization, diffractive optics, optical simulation, coherent diffraction imaging, phaseless imaging, and ptychography.
Successful applicants will join the Signal and Image Processing Institute in the Department of Electrical Engineering and work with a team of faculty including Richard Leahy, Anthony Levi, Justin Haldar and Mahdi Soltanolkotabi.
The University of Southern California strongly values diversity and is committed to equal opportunity in employment. Women and men, and members of all racial and ethnic groups, are encouraged to apply.
Send applications to:
Richard M. Leahy, Ph.D.
Professor and Director
Signal and Image Processing Institute
3740 McClintock Ave, EEB400
University of Southern California
Los Angeles, CA 90089-2564
http://neuroimage.usc.edu
leahy@sipi.usc.edu


Join the CompressiveSensing subreddit or the Google+ Community or the Facebook page and post there !
Liked this entry ? subscribe to Nuit Blanche's feed, there's more where that came from. You can also subscribe to Nuit Blanche by Email, explore the Big Picture in Compressive Sensing or the Matrix Factorization Jungle and join the conversations on compressive sensing, advanced matrix factorization and calibration issues on Linkedin.

Data Science Summer School, Ecole Polytechnique, France, August 28th- September 1st, 2017




Julie tells me of this event:

The data science initiative of Ecole Polytechnique organizes a Data Science Summer School from August 28 to September 1, 2017: http://www.ds3-datascience-polytechnique.fr/ The primary focus of the event is to provide a series of courses and talks covering the latest advances in the field, presented by leading experts of the area, with a special session on Data Science for Smart Grids and several networking opportunities. The event is targeted at MSc2 and PhD students, postdocs, academics, members of public institutions, and professionals.
Courses:
  • Yoshua BENGIO deep learning
  • Pradeep RAVIKUMAR graphical models
  • Peter RICHTÁRIK optimization models
  • Csaba SZEPESVÁRI bandits
Talks: 
  • Cédric ARCHAMBEAU 
  • Olivier BOUSQUET 
  • Damien ERNST 
  • Laura GRIGORI
  • Sean MEYN 
  • Sebastian NOWOZIN 
  • Stuart RUSSELL 
Key Dates: 
  • Application deadline: Apr. 20, 2017. 
  • Notification of acceptance by May 7, 2017. 
  • Event: Monday, Aug. 28 - Friday, Sept. 1, 2017 






CSjobs: Internship (Spring/Summer/Fall 2017), IFP Energies nouvelles, France

Laurent just sent me the following:

Dear Igor


I have a late internship proposal at IFP Energies nouvelles. I would be delighted if you could advertise it. The update page is:
http://www.laurent-duval.eu//lcd-2017-intern-sparse-regression-dim-reduction.html
and the pdf file is here:
http://www.laurent-duval.eu//Documents/IFPEN_2017_SUBJ_Robust-sparse-regression.pdf


A text (same as the webpage if you need html code): 
Sparse regression and dimension reduction for sensor measurements and data normalization

The instrumental context is that of multiple 1D data or measurements y_m related to the same phenomenon x, corrupted by random effects n_m and a different scaling parameter a_m, due to uncontrolled sensor calibrations or measurement variability. The model is thus:
y_m(k) = a_m x(k) + n_m(k).

The aim of the internship is to robustly estimate the scaling parameters a_m (with confidence bounds) in the presence of missing data or outliers, for potentially small, real-life signals x with large amplitude variations. The estimation should be as automated as possible, based on data properties and priors (e.g. sparsity, positivity), so that it can be used by non-expert users. Signals under study include, for instance, vibration, analytical chemistry, and biological data. Of particular interest for this internship is the study and performance assessment of robust loss or penalty functions (around the l2,1-norm), such as R1-PCA or low-rank decompositions.
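To make the model concrete, here is a small numerical sketch of the estimation problem. The synthetic data and the simple alternating scheme are illustrative assumptions, not the internship's prescribed method: the per-sensor scalings a_m are fit with an L1-flavored iteratively reweighted least squares (IRLS) step against a median-based estimate of the shared signal, so that a handful of outliers does not bias the result.

```python
import numpy as np

# Toy instance of y_m = a_m * x + n_m with outliers in one channel.
rng = np.random.default_rng(0)
K, M = 200, 5
x_true = np.maximum(rng.standard_normal(K), 0)   # sparse-ish, positive signal
a_true = rng.uniform(0.5, 2.0, size=M)           # unknown sensor scalings
Y = a_true[:, None] * x_true + 0.05 * rng.standard_normal((M, K))
Y[2, :10] += 5.0                                 # gross outliers in channel 2

def robust_scale(y, x, iters=20, eps=1e-6):
    """IRLS estimate of a in y ≈ a*x under an L1-like loss."""
    a = (y @ x) / (x @ x)                        # least-squares initialization
    for _ in range(iters):
        w = 1.0 / np.maximum(np.abs(y - a * x), eps)  # downweight big residuals
        a = (w * y @ x) / (w * x @ x)
    return a

# Alternate: estimate the shared signal by a median across rescaled channels,
# then refit each channel's scaling robustly.
a_hat = np.ones(M)
for _ in range(10):
    x_hat = np.median(Y / a_hat[:, None], axis=0)
    a_hat = np.array([robust_scale(Y[m], x_hat) for m in range(M)])

# The scalings are identifiable only up to a global factor; fix it to compare.
a_hat *= np.mean(a_true) / np.mean(a_hat)
```

The L1-style reweighting stands in for the robust losses mentioned above; swapping in an l2,1-type penalty or an R1-PCA/low-rank decomposition would be the natural next step.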


Best


Sure Laurent !


Saturday, March 25, 2017

Saturday Morning Video: #NIPS2016 Symposium, Recurrent Neural Networks and Other Machines that Learn Algorithms

From the page of the minisymposium



Program
Full session videos are available here: Session 1, Session 2, Session 3.
We provide individual videos and slides below. You can also watch this Playlist.
2:00 - 2:20 Jürgen Schmidhuber 
Introduction to Recurrent Neural Networks and Other Machines that Learn Algorithms 
Slides Video
2:20 - 2:40 Paul Werbos 
Deep Learning in Recurrent Networks: From Basics To New Data on the Brain 
Slides Video
2:40 - 3:00 Li Deng 
Three Cool Topics on RNN 
Slides Video
3:00 - 3:20 Risto Miikkulainen 
Scaling Up Deep Learning through Neuroevolution 
Slides Video
3:20 - 3:40 Jason Weston 
New Tasks and Architectures for Language Understanding and Dialogue with Memory 
Slides Video
3:40 - 4:00 Oriol Vinyals 
Recurrent Nets Frontiers 
Slides Unavailable Video
 
4:00 - 4:30 Coffee Break
 
4:30 - 4:50 Mike Mozer 
Neural Hawkes Process Memories 
Slides Video
4:50 - 5:10 Ilya Sutskever 
Meta Learning in the Universe 
Slides Video
5:10 - 5:30 Marcus Hutter 
Asymptotically fastest solver of all well-defined problems 
Slides Video
(unfortunately cannot come - J. Schmidhuber will stand in for him)
5:30 - 5:50 Nando de Freitas 
Learning to Learn, to Program, to Explore and to Seek Knowledge 
Slides Video
5:50 - 6:10 Alex Graves 
Differentiable Neural Computer 
Slides Video
 
6:30 - 7:30 Light dinner break/Posters
 
7:30 - 7:50 Nal Kalchbrenner 
Generative Modeling as Sequence Learning 
Slides Video
7:50 - 9:00 Panel Discussion 
Topic: The future of machines that learn algorithms
Panelists: Ilya Sutskever, Jürgen Schmidhuber, Li Deng, Paul Werbos, Risto Miikkulainen, Sepp Hochreiter 
Moderator: Alex Graves 
Video









Saturday Morning Video: The Role of Multi-Agent Learning in Artificial Intelligence Research at DeepMind


Thore Graepel talks about The Role of Multi-Agent Learning in Artificial Intelligence Research at DeepMind.
The video cannot be embedded, so here is the link: https://youtu.be/CvL-KV3IBcM





Friday, March 24, 2017

Around The Blogs In 78 Hours


Yes, this is an issue, and the blogs are helping to see through some of it:

Jort and his team have released Audioset

Thomas
Bob

Muthu
Laurent
Sanjeev

Mitya

Felix

Adrian

Ferenc

Francois 
Thibaut
Terry

Here is an 'old' blog entry from Dustin on some of Yves' work in compressed sensing
Dustin






Thursday, March 23, 2017

Evolution Strategies as a Scalable Alternative to Reinforcement Learning - implementation -




We explore the use of Evolution Strategies, a class of black box optimization algorithms, as an alternative to popular RL techniques such as Q-learning and Policy Gradients. Experiments on MuJoCo and Atari show that ES is a viable solution strategy that scales extremely well with the number of CPUs available: By using hundreds to thousands of parallel workers, ES can solve 3D humanoid walking in 10 minutes and obtain competitive results on most Atari games after one hour of training time. In addition, we highlight several advantages of ES as a black box optimization technique: it is invariant to action frequency and delayed rewards, tolerant of extremely long horizons, and does not need temporal discounting or value function approximation.
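The core update behind this abstract fits in a few lines. Below is a minimal toy sketch (a hypothetical quadratic objective and made-up hyperparameters, not the paper's MuJoCo/Atari setup): sample Gaussian perturbations of the parameters, evaluate each perturbed candidate's reward, and step along the reward-weighted average perturbation.

```python
import numpy as np

def f(w):
    # Toy stand-in for an RL return: negative squared distance to a target.
    target = np.array([0.5, 0.1, -0.3])
    return -np.sum((w - target) ** 2)

def evolution_strategies(f, dim=3, npop=50, sigma=0.1, alpha=0.03,
                         iters=300, seed=0):
    rng = np.random.default_rng(seed)
    w = rng.standard_normal(dim)                 # initial parameter vector
    for _ in range(iters):
        eps = rng.standard_normal((npop, dim))   # one perturbation per worker
        rewards = np.array([f(w + sigma * e) for e in eps])
        # Standardize rewards so the step size is invariant to reward scale.
        adv = (rewards - rewards.mean()) / (rewards.std() + 1e-8)
        # Stochastic estimate of the gradient of the Gaussian-smoothed objective.
        w = w + alpha / (npop * sigma) * eps.T @ adv
    return w

w = evolution_strategies(f)
```

In the distributed setting described in the abstract, each worker evaluates one perturbation and only scalar rewards (plus shared random seeds) need to be communicated, which is what lets the method scale to thousands of CPUs; this sketch collapses that into a single loop.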






Implementation: Compressed Sensing using Generative Models


Alex just mentioned the availability of the code for the recent Compressed Sensing using Generative Models paper.



It's all here: https://github.com/AshishBora/csgm

Enjoy !




Wednesday, March 22, 2017

Paris Machine Learning Hors Serie #10 : Workshop SPARK (atelier 1)





Leonardo Noleto, data scientist at KPMG, walks us through the process of cleaning and transforming raw data into "clean" data with Apache Spark.

Apache Spark is a general-purpose open-source framework designed for distributed data processing. It extends the MapReduce model with the advantage of processing data in memory and interactively. Spark offers a set of components for data analysis: Spark SQL, Spark Streaming, MLlib (machine learning), and GraphX (graphs).

This workshop focuses on the fundamentals of Spark and its data-processing paradigm through the Python programming interface (more precisely, PySpark).

Installation, configuration, processing on a cluster, Spark Streaming, MLlib, and GraphX will not be covered in this workshop.

The material to install is here.


Objectives

  • Understand the fundamentals of Spark and place it within the Big Data ecosystem;
  • Know how it differs from Hadoop MapReduce;
  • Use RDDs (Resilient Distributed Datasets);
  • Use the most common actions and transformations to manipulate and analyze data;
  • Write a data transformation pipeline;
  • Use the PySpark programming API.


This workshop is the first in a series of two Apache Spark workshops. To attend the upcoming workshops, you must have attended the previous ones or be comfortable with the topics already covered.


What are the prerequisites?


  • Know the basics of Python (or learn quickly via this online course: Python Introduction)
  • Have some exposure to data processing with R, Python, or Bash (why not?)
  • No prior knowledge of distributed processing or Apache Spark is required. This is an introductory workshop. People who already have some experience with Spark (in Scala, Java, or R) may get bored (this is a workshop for beginners).


How should I prepare for this workshop?


  • Bring a reasonably modern laptop with at least 4 GB of memory and a web browser installed. You must be able to connect to the Internet over Wi-Fi.
  • Follow the instructions to prepare for the workshop (Docker installation + the workshop's Docker image).
  • The data to clean is included in the Docker image. The exercises will be provided during the workshop as Jupyter notebooks.
  • The notebook is here: https://goo.gl/emkoee
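The transformation/action pattern the workshop teaches can be previewed without installing Spark. The snippet below is a plain-Python analogue of a typical PySpark RDD cleaning pipeline; the data, field names, and the RDD calls shown in the comments are illustrative assumptions, not the workshop's actual exercises.

```python
from collections import defaultdict

# Hypothetical raw records: "date,event,value", with one malformed row.
raw = [
    "2017-03-22,click,42",
    "2017-03-22,click,oops",      # malformed value, to be filtered out
    "2017-03-23,view,7",
    "2017-03-23,click,3",
]

def parse(line):
    _date, event, value = line.split(",")
    return (event, value)

# rdd = sc.textFile(...).map(parse)        -- lazy transformation in Spark
records = map(parse, raw)

# .filter(lambda kv: kv[1].isdigit())      -- drop malformed rows
clean = filter(lambda kv: kv[1].isdigit(), records)

# .mapValues(int).reduceByKey(add)         -- sum values per event type
totals = defaultdict(int)
for event, value in clean:
    totals[event] += int(value)

# .collect()                               -- the "action" that materializes
result = dict(totals)
print(result)   # {'click': 45, 'view': 7}
```

In real PySpark the transformations are lazy and distributed across the cluster, and nothing executes until an action such as `collect()` is called; the plain-Python version only mimics the shape of the API.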


Summer School "Structured Regularization for High-Dimensional Data Analysis" - IHP Paris - June 19th to 22nd

Gabriel just sent me the following:

Dear Igor,
In case you find it suitable, could you advertise for this summer school ?
All the best
Gabriel

Sure !
=======
The SMF (French Mathematical Society) and the Institut Henri Poincaré are organizing a mathematical summer school on "Structured Regularization for High-Dimensional Data Analysis". This summer school will be an opportunity to bring together students, researchers, and practitioners of high-dimensional data analysis around three courses and four talks on new methods in structured regularization. The mathematical foundations of the event lie between probability, statistics, optimization, and image and signal processing.
More information (including registration, free but mandatory) is available on the webpage: https://regularize-in-paris.github.io/
Courses:
  • Anders Hansen (Cambridge)
  • Andrea Montanari (Stanford)
  • Lorenzo Rosasco (Genova and MIT)
Talks:
  • Francis Bach (INRIA and ENS)
  • Claire Boyer (UPMC)
  • Emilie Chouzenoux (Paris Est)
  • Carlos Fernandez-Granda (NYU)
Organizers:
  • Yohann De Castro (Paris-Sud)
  • Guillaume Lecué (CNRS and ENSAE)
  • Gabriel Peyré (CNRS and ENS)
The program is here: https://regularize-in-paris.github.io/program/


