EN.580.709 - Fall 2019 - Johns Hopkins University
Jeremias Sulam
M 3:00PM - 5:00PM Credits: 2
*Note: this is the Fall '19 course description. Course details for the upcoming Fall '20 offering may be subject to change.
Course Description
Sparse and redundant representations constitute a fascinating area of research in signal and image processing. This is a relatively young field that has taken form over the last 15 years or so, with contributions from harmonic analysis, numerical algorithms and machine learning, and it has been applied widely to a myriad of problems in computer vision and other domains. This course will focus on sparsity as a model for general data, generalizing many other constructions and priors. This idea - that signals can be represented with just a few coefficients - leads to a long series of beautiful (and, surprisingly, solvable) theoretical and numerical problems, and many applications that can benefit directly from the newly developed theory. In this course we will survey this field, starting with the theoretical foundations and systematically covering the knowledge that has been gathered in recent years. The course will touch on theory, numerical algorithms, and applications in image processing and machine learning. Recommended course background: Linear Algebra, Signals and Systems, Numerical Analysis.
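To make this idea concrete, the canonical problem underlying the course (often denoted $(P_0)$) can be stated as follows: given a signal $y \in \mathbb{R}^n$ and a redundant dictionary $D \in \mathbb{R}^{n \times m}$ with $m > n$, seek the sparsest representation

$$ \min_{x \in \mathbb{R}^m} \|x\|_0 \quad \text{subject to} \quad Dx = y, $$

where $\|x\|_0$ counts the nonzero entries of $x$. Basis Pursuit, covered later in the course, replaces the combinatorial $\|x\|_0$ with its convex surrogate $\|x\|_1$.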
Class Material and References
You can also find the draft class notes here.
Content and Syllabus
This course will cover the fundamental ideas behind sparse signal and image modeling, as well as key ideas of sparse priors in machine learning, and their applications. The (tentative) roadmap is the following:
- Course introduction. Intro to underdetermined linear systems of equations, sparsity and math warm-up.
- Uniqueness and Stability.
- Greedy algorithms and their analysis.
- Basis Pursuit - success guarantees and stability. Compressed Sensing. Analysis and convergence of ISTA (see the sketch after this list).
- Intro to statistical learning, sparse linear and logistic regression, variable selection.
- Lasso generalizations, elastic net, matrix and spectral sparsity and applications.
- Dictionary Learning - Early motivations, MOD, K-SVD, Online Dictionary Learning, Double Sparsity.
- Computer Vision Applications: image denoising, inpainting, compression, MCA, Task-Driven Dictionary Learning.
- Convolutional sparse coding and dictionary learning, and their application.
- Multi-layer sparsity-based deep learning, theoretical guarantees and applications.
- Project work.
- Guest Lecture.
- Project Presentations.
- Project Presentations.
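As a concrete illustration of one of the algorithms in the roadmap above, here is a minimal NumPy sketch of ISTA for the Lasso problem $\min_x \frac{1}{2}\|y - Dx\|_2^2 + \lambda \|x\|_1$. This is an illustrative toy example, not course-provided code; the dictionary, signal, and parameter values (`lam`, `n_iter`) are arbitrary choices.

```python
import numpy as np

def ista(D, y, lam, n_iter=200):
    """ISTA for the Lasso: minimize 0.5 * ||y - D x||_2^2 + lam * ||x||_1."""
    # Step size 1/L, with L = ||D||_2^2 an upper bound on the
    # Lipschitz constant of the gradient of the quadratic term.
    L = np.linalg.norm(D, 2) ** 2
    x = np.zeros(D.shape[1])
    for _ in range(n_iter):
        # Gradient step on the smooth term, then soft-thresholding
        # (the proximal operator of the L1 norm).
        z = x + D.T @ (y - D @ x) / L
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)
    return x

# Toy usage: recover a sparse vector from a random overcomplete dictionary.
rng = np.random.default_rng(0)
D = rng.standard_normal((64, 128))
D /= np.linalg.norm(D, axis=0)          # normalize the dictionary atoms
x_true = np.zeros(128)
x_true[rng.choice(128, 5, replace=False)] = rng.standard_normal(5)
y = D @ x_true
x_hat = ista(D, y, lam=0.05)
```

Each iteration alternates a gradient step on the quadratic term with soft-thresholding, which is what makes the iterates sparse; the convergence analysis of this scheme is one of the topics covered in class.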
Course Requirements and Grading
The course will be organized into weekly lectures, and students are expected to participate actively in them. Grading will be assigned as follows:
a) Final exam (40%), intended to serve as a quick check of the content presented in class.
b) Course Project (60%), to be carried out preferably in pairs, based on recently published papers. A list of candidate papers to choose from is given here (others can be accommodated upon request). Evaluation of the project will be based on a final report summarizing some of these papers, their contributions, and your own findings (open questions, simulation results, extensions, etc.), as well as a short presentation to the class. Detailed project guidelines and evaluation criteria can be found here.
References
- Sparse and Redundant Representations: From Theory to Applications in Signal and Image Processing, by M. Elad
- Statistical Learning with Sparsity: The Lasso and Generalizations, by T. Hastie et al.
- A Mathematical Introduction to Compressive Sensing, by S. Foucart and H. Rauhut
- Recent papers
Ethics, personal well-being and classroom climate
Please have a look at the detailed course description for information on these important topics.