Overview: a Sparse Tour of Signal Processing
Signal and Image Processing with Orthogonal Decompositions
- Continuous signals and images
- Orthogonal representations
- decomposition
- energy conservation (Parseval)
- Fourier and wavelets
- 2D wavelet transform
- Linear and nonlinear approximations
- filtering
- thresholding
- compressibility and its relation to smoothness
- Denoising and inverse problems
- signal model
- recovery by sparsity promotion
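As a quick illustration of the first two themes, here is a minimal NumPy sketch (not part of the course materials) that expands a signal in an orthonormal basis, here the orthonormal discrete Fourier basis, and checks perfect reconstruction and energy conservation (Parseval):

```python
import numpy as np

# Expand a signal in an orthonormal basis (here: the orthonormal DFT) and check
# perfect reconstruction and energy conservation (Parseval).
N = 256
t = np.arange(N) / N
f = np.sin(2 * np.pi * 5 * t) + 0.5 * np.cos(2 * np.pi * 13 * t)

coeffs = np.fft.fft(f, norm="ortho")            # inner products <f, psi_k>
f_rec = np.fft.ifft(coeffs, norm="ortho").real  # f = sum_k <f, psi_k> psi_k

print(np.allclose(f, f_rec))                                  # reconstruction
print(np.allclose(np.sum(f**2), np.sum(np.abs(coeffs)**2)))   # Parseval
```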
Reading material
Chapter 1 from Advanced Signal, Image and Surface Processing
Slides
Signal and Image Decomposition in Orthogonal Bases adapted from here.
Recordings
Fourier Processing
- Continuous/discrete Fourier basis
- Sampling
- 2D Fourier transform
- Fourier Approximation
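A minimal NumPy sketch of linear Fourier approximation, keeping only the lowest frequencies of a smooth 1D signal; the test signal and cutoff are illustrative choices, not taken from the lecture:

```python
import numpy as np

# Linear Fourier approximation: keep only the lowest frequencies |k| <= K of a
# smooth 1D signal and measure the relative approximation error.
N, K = 512, 16
t = np.arange(N) / N
f = np.exp(-((t - 0.5) ** 2) / 0.01)            # smooth test signal

F = np.fft.fft(f, norm="ortho")
freqs = np.fft.fftfreq(N, d=1.0 / N)            # integer frequencies -N/2 .. N/2-1
F_low = np.where(np.abs(freqs) <= K, F, 0.0)    # low-pass mask

f_K = np.fft.ifft(F_low, norm="ortho").real
err = np.linalg.norm(f - f_K) / np.linalg.norm(f)
kept = int(np.sum(np.abs(freqs) <= K))
print(f"relative error keeping {kept} coefficients: {err:.2e}")
```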
Reading material
Chapter 2 from Advanced Signal, Image and Surface Processing. If you want more detail read Fourier Transforms from Mathematical Foundations of Data Sciences.
Assignments
See assignment tab.
Slides
Fourier Processing adapted from here.
Wavelet Processing
- 1D Multiresolutions
- detail spaces
- Haar wavelets
- 1D Wavelet transform
- computing wavelet coefficients
- discrete wavelet coefficients
- fast wavelet transform and its inverse
- Filter banks
- approximation filter
- detail filter
- vanishing moments
- Daubechies wavelets
- Extension to 2D
- anisotropic wavelets
- 2D multiresolutions
- 2D wavelet bases
- 2D discrete wavelet coefficients
- fast 2D wavelet transform
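A minimal NumPy sketch of one level of the fast wavelet transform and its inverse, using the Haar filters in 1D; the slides cover the general filter-bank formulation and the 2D case:

```python
import numpy as np

# One level of the fast (Haar) wavelet transform and its inverse.
# The approximation filter averages neighbouring pairs, the detail filter
# differences them; both are scaled so that the transform is orthogonal.
def haar_forward(f):
    f = np.asarray(f, dtype=float)
    a = (f[0::2] + f[1::2]) / np.sqrt(2)   # approximation (coarse) coefficients
    d = (f[0::2] - f[1::2]) / np.sqrt(2)   # detail (wavelet) coefficients
    return a, d

def haar_inverse(a, d):
    f = np.empty(2 * len(a))
    f[0::2] = (a + d) / np.sqrt(2)
    f[1::2] = (a - d) / np.sqrt(2)
    return f

f = np.random.randn(16)
a, d = haar_forward(f)
print(np.allclose(f, haar_inverse(a, d)))   # perfect reconstruction
```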
Reading material
Chapter 3 from Advanced Signal, Image and Surface Processing. If you want more detail read Wavelets from Mathematical Foundations of Data Sciences.
Assignments
See assignment tab.
Slides
Wavelet Processing adapted from here.
Approximation with Orthogonal Decompositions
- Linear and nonlinear approximations
- Sparse approximation in a basis
- hard thresholding
- decay of approximation errors (linear vs. nonlinear)
- Relation between the coefficient decay rate and the approximation error
- Fourier for smooth functions
- Fourier and singularities/edges
- wavelet transform for piece-wise smooth 1D functions
- vanishing moments
- magnitude of wavelet coefficients and its relation to edges
- decay and nonlinear approximation error for piece-wise smooth 1D functions
- Piece-wise smooth 2D functions
- 2D approximations
- decay and nonlinear approximation error for piece-wise smooth 2D functions
- Geometrically regular functions
- space of BV functions
- finite elements
- curvelets, their construction, and properties
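To contrast linear and nonlinear (hard-thresholding) approximation, here is a small NumPy sketch; for simplicity it uses the orthonormal DFT as the basis, whereas the lecture works mainly with wavelets, where the gap between the two is much larger:

```python
import numpy as np

# Linear vs nonlinear M-term approximation in an orthonormal basis.
# Linear: keep a fixed set of low-frequency coefficients (~M of them).
# Nonlinear: keep the M largest-magnitude coefficients (hard thresholding).
N, M = 1024, 64
t = np.arange(N) / N
f = np.sign(np.sin(2 * np.pi * 3 * t))          # piece-wise constant signal with edges

c = np.fft.fft(f, norm="ortho")
freqs = np.fft.fftfreq(N, d=1.0 / N)

c_lin = np.where(np.abs(freqs) <= M // 2, c, 0.0)   # fixed low-pass set
idx = np.argsort(np.abs(c))[::-1][:M]               # adaptive set of largest coefficients
c_nl = np.zeros_like(c)
c_nl[idx] = c[idx]

for name, cc in [("linear   ", c_lin), ("nonlinear", c_nl)]:
    f_M = np.fft.ifft(cc, norm="ortho").real
    print(name, np.linalg.norm(f - f_M) / np.linalg.norm(f))
```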
Reading material
Chapter 4 from Advanced Signal, Image and Surface Processing. If you want more detail read Linear and Non-linear Approximation from Mathematical Foundations of Data Sciences.
Assignments
See assignment tab.
Slides
Approximation and Coding with Orthogonal Decompositions adapted from here.
Linear and Nonlinear Denoising
- Linear denoising
- additive noise model
- linear denoising by smoothing
- Wiener filter and oracle estimation of optimal filter
- Nonlinear denoising
- hard thresholding
- optimal threshold selection
- nonlinear approximation and estimation
- hard vs. soft thresholding
- Translation-invariant thresholding
- translation-invariant wavelets
- optimal threshold
- Other diagonal estimators
- estimators between hard and soft thresholding
- Non-diagonal estimators
- block thresholding
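A minimal NumPy sketch of nonlinear denoising by hard thresholding of Haar wavelet coefficients with the universal threshold \(T = \sigma\sqrt{2\log N}\); a single decomposition level is used only to keep the example short, whereas in practice one thresholds a multi-level, translation-invariant transform:

```python
import numpy as np

# Nonlinear denoising by hard thresholding of Haar wavelet detail coefficients,
# using the universal threshold T = sigma * sqrt(2 log N). One level only.
def haar_forward(f):
    return (f[0::2] + f[1::2]) / np.sqrt(2), (f[0::2] - f[1::2]) / np.sqrt(2)

def haar_inverse(a, d):
    f = np.empty(2 * len(a))
    f[0::2], f[1::2] = (a + d) / np.sqrt(2), (a - d) / np.sqrt(2)
    return f

rng = np.random.default_rng(0)
N, sigma = 1024, 0.2
t = np.arange(N) / N
f0 = np.sign(np.sin(2 * np.pi * 2 * t))          # clean piece-wise constant signal
y = f0 + sigma * rng.standard_normal(N)          # additive Gaussian noise model

a, d = haar_forward(y)
T = sigma * np.sqrt(2 * np.log(N))               # universal threshold
d_hat = d * (np.abs(d) > T)                      # hard thresholding of the details
f_hat = haar_inverse(a, d_hat)

snr = lambda x: 10 * np.log10(np.sum(f0**2) / np.sum((x - f0) ** 2))
print(f"SNR noisy: {snr(y):.1f} dB, denoised: {snr(f_hat):.1f} dB")
```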
Reading material
Chapter 5 from Advanced Signal, Image and Surface Processing. If you want more detail read Denoising from Mathematical Foundations of Data Sciences.
Assignments
See assignment tab.
Slides
Linear and nonlinear denoising adapted from here.
Variational Regularization of Inverse Problems
- Variational priors
- smooth and cartoon
- natural image priors
- discretization
- Variational regularization
- regularized inverse
- pseudo inverse
- Sobolev regularization and inpainting
- total variation regularization and inpainting
- Example from tomography with the Radon transform
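A minimal NumPy sketch of variational (Sobolev) regularization for a 1D inpainting problem, solved by plain gradient descent; the signal, mask, and parameters are illustrative choices, not those used in the lecture:

```python
import numpy as np

# Sobolev-regularized inpainting, solved by gradient descent:
#   min_f  0.5 * ||mask * f - y||^2 + 0.5 * lam * ||grad f||^2
# with a periodic forward-difference gradient in 1D.
N, lam, step, n_iter = 256, 0.2, 0.4, 3000
rng = np.random.default_rng(0)

t = np.arange(N) / N
f0 = np.sin(2 * np.pi * 3 * t)                  # unknown smooth signal
mask = rng.random(N) < 0.3                      # observe ~30% of the samples
y = mask * f0                                   # masked observations

grad = lambda f: np.roll(f, -1) - f             # forward difference
div = lambda g: g - np.roll(g, 1)               # minus the adjoint of grad

f = y.copy()
for _ in range(n_iter):
    data_grad = mask * (mask * f - y)           # gradient of the data-fidelity term
    reg_grad = -div(grad(f))                    # gradient of the Sobolev prior (negative Laplacian)
    f -= step * (data_grad + lam * reg_grad)

print("relative error:", np.linalg.norm(f - f0) / np.linalg.norm(f0))
```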
Reading material
Chapter 6 from Advanced Signal, Image and Surface Processing. If you want more detail read Variational Priors and Regularization from Mathematical Foundations of Data Sciences.
Assignments
See assignment tab.
Slides
Inverse problems Regularization adapted from here.
Sparse Regularization of Inverse Problems
- Linear inverse problems
- denoising
- inpainting
- super-resolution
- Regularization of inverse problems
- regularized inverse
- Lagrangian formulation including the Lagrange multiplier/trade-off parameter \(\lambda\)
- smooth and cartoon priors
- Redundant dictionaries
- Sparse priors
- convex relaxation of the \(\ell_0\)-norm via the \(\ell_1\)-norm
- \(\ell_1\)-regularization and sparse recovery
- noise-free sparsity-promoting regularization
- Iterative soft thresholding
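A minimal NumPy sketch of iterative soft thresholding (ISTA) applied to the Lagrangian formulation \(\min_x \tfrac12\|Ax-y\|^2 + \lambda\|x\|_1\); the matrix, sparsity level, and \(\lambda\) are illustrative choices:

```python
import numpy as np

# Iterative soft thresholding (ISTA) for  min_x 0.5*||A x - y||^2 + lam*||x||_1.
def soft_threshold(x, t):
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def ista(A, y, lam, n_iter=1000):
    step = 1.0 / np.linalg.norm(A, 2) ** 2          # 1 / Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x = soft_threshold(x - step * A.T @ (A @ x - y), step * lam)
    return x

rng = np.random.default_rng(0)
m, n, k = 80, 200, 5
A = rng.standard_normal((m, n)) / np.sqrt(m)        # underdetermined linear operator
x0 = np.zeros(n)
x0[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
y = A @ x0 + 0.01 * rng.standard_normal(m)          # noisy measurements

x_hat = ista(A, y, lam=0.02)
print("relative error:", np.linalg.norm(x_hat - x0) / np.linalg.norm(x0))
```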
Reading material
Chapter 7 from Advanced Signal, Image and Surface Processing. If you want more detail read Inverse Problems and Sparse Regularization from Mathematical Foundations of Data Sciences.
Assignments
See assignment tab.
Slides
Sparse regularization of Inverse Problems adapted from here.
Compressive Sensing
- Classical sampling
- discretization
- point-wise sampling and smoothness
- Compressive acquisition
- example: the single-pixel camera
- physical model
- Inversion and sparsity
- \(\ell_1\) prior
- sparse CS recovery
- Theoretical guarantees
- CS with the restricted isometry property (RIP)
- singular-value distributions
- RIP for Gaussian matrices
- numerics with RIP
- Fourier measurements
- MRI
- radar interferometry
- structured measurements
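A minimal NumPy sketch that probes the restricted isometry property numerically: for a Gaussian measurement matrix, the singular values of random \(k\)-column submatrices should concentrate around 1. Only a few supports are sampled here, so this gives a rough picture, not the actual RIP constant:

```python
import numpy as np

# Empirical look at the restricted isometry property (RIP) of a Gaussian matrix:
# singular values of random k-column submatrices A_S should cluster around 1,
# so that ||A_S x|| ~ ||x|| for all k-sparse x.
rng = np.random.default_rng(0)
m, n, k, trials = 128, 512, 10, 200

A = rng.standard_normal((m, n)) / np.sqrt(m)     # normalized Gaussian measurement matrix

extremes = []
for _ in range(trials):
    S = rng.choice(n, size=k, replace=False)     # random support of size k
    s = np.linalg.svd(A[:, S], compute_uv=False)
    extremes.append((s.min(), s.max()))

extremes = np.array(extremes)
print("smallest singular value over sampled supports:", extremes[:, 0].min())
print("largest  singular value over sampled supports:", extremes[:, 1].max())
```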
Assignments
See assignment tab.
Reading material
Compressive Sensing Chapter from Mathematical Foundations of Data Sciences.
Slides
Compressive Sensing adapted from here.
Bayesian Inference
Reading material
Slides
Statistical Regularization
The review article Solving inverse problems using data-driven models serves as the main reading material. For a mathematical description of the February 6 and 11 lectures, refer to pages 22 to 43 of this article.
Learning in Functional Analytic Regularization
During the lecture of February 13, we start by considering section 5.1.2 Deep direct Bayes estimation on page 84 of Solving inverse problems using data-driven models, up to the paragraph Regularization by learning. We continue with the section Deep direct estimation of higher-order moments on page 99. Next, we consider loop unrolling, described in section 4.9 Data-driven optimization on page 66 up to section 4.10, and later in section 5.1.4 Learned iterative schemes on page 89 up to section 5.1.5. Supervised training, learned priors/regularizers, unsupervised learning, and semi-supervised learning are briefly introduced in section 5.1 Learning an estimator on page 82.
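As a rough illustration of loop unrolling / learned iterative schemes, here is a hypothetical minimal architecture in PyTorch (not one of the networks from the review article): \(K\) gradient steps on the data fidelity, each followed by a small learned correction.

```python
import torch
import torch.nn as nn

# Hypothetical minimal unrolled ("learned iterative") scheme: K gradient steps
# on 0.5*||A x - y||^2, each followed by a small learned correction network.
# The operator A, step size, and correction networks are placeholders.
class UnrolledNet(nn.Module):
    def __init__(self, A, K=5, hidden=64):
        super().__init__()
        self.A, self.K = A, K                        # forward operator (m x n matrix)
        self.step = nn.Parameter(torch.tensor(0.1))  # learned step size
        n = A.shape[1]
        self.correct = nn.ModuleList(
            nn.Sequential(nn.Linear(n, hidden), nn.ReLU(), nn.Linear(hidden, n))
            for _ in range(K)
        )

    def forward(self, y):                            # y: (batch, m)
        x = torch.zeros(y.shape[0], self.A.shape[1])
        for k in range(self.K):
            grad = (x @ self.A.T - y) @ self.A       # gradient of 0.5*||A x - y||^2
            x = x - self.step * grad + self.correct[k](x)
        return x

# Untrained forward pass, just to check the shapes.
A = torch.randn(30, 100) / 30**0.5
net = UnrolledNet(A)
print(net(torch.randn(4, 30)).shape)                 # torch.Size([4, 100])
```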
Learning in Statistical Regularization
Slides
We use a combination of slides developed mainly by Ozan Oktem and Jonas Adler, among others. The slides presented in class can be found here.
Papers
Untrained Neural Networks
- Deep Image Prior
- A Bayesian Perspective on the Deep Image Prior
- DeepRED: Deep Image Prior Powered by RED
- Algorithmic Guarantees for Inverse Imaging with Untrained Network Priors
Loop-Unrolled Neural Networks
- Learning to learn by gradient descent by gradient descent
- Invert to Learn to Invert
- Recurrent Inference Machines for Solving Inverse Problems
- Model based learning for accelerated, limited-view 3D photoacoustic tomography
- Multi-Scale Learned Iterative Reconstruction
- Learned primal-dual reconstruction
- Deep Bayesian Inversion
- Solving ill-posed inverse problems using iterative deep neural networks
- Low Shot Learning with Untrained Neural Networks for Imaging Inverse Problems
Normalizing Flows
- First Papers
- Conditional Normalizing Flows
- Continuous Normalizing Flows
- Applications
- Review papers
- SOTA
Diffusion Models (Equivalent to certain SDE models)
- Denoising Diffusion Probabilistic Models
- Diffusion Models Beat GANs on Image Synthesis
- Deblurring via Stochastic Refinement
Stochastic Differential Equation Generative Models (equivalent to certain diffusion models)
- Neural Ordinary Differential Equations
- FFJORD: Free-Form Continuous Dynamics for Scalable Reversible Generative Models
- Score-Based Generative Modeling through Stochastic Differential Equations
- Solving Inverse Problems in Medical Imaging with Score-Based Generative Models
GANs
- Bayesian Inference with Generative Adversarial Network Priors
- Deep Generative Adversarial Neural Networks for Compressive Sensing MRI
- GAN-based Projector for Faster Recovery with Convergence Guarantees in Linear Inverse Problems
- Solving Linear Inverse Problems Using GAN Priors: An Algorithm with Provable Guarantees
- Stochastic Seismic Waveform Inversion using Generative Adversarial Networks as a Geological Prior
Compressed Sensing
- Learning-Based Compressive Subsampling
- Learning-Based Compressive MRI
- NETT Regularization for Compressed Sensing Photoacoustic Tomography
Misc
- Stein Variational Gradient Descent: A General Purpose Bayesian Inference Algorithm
- Divergence Triangle for Joint Training of Generator Model, Energy-based Model, and Inference Model
- Transport map accelerated Markov chain Monte Carlo
- A transport-based multifidelity preconditioner for Markov chain Monte Carlo
- Uncertainty Quantification with Generative Models
- Variational Inference for Computational Imaging Inverse Problems
- Adversarial Uncertainty Quantification in Physics-Informed Neural Networks
General information on generative neural networks
Slides on generative models
Ingredients of convolutional neural networks
- Lectures (Part 1, Part 2, Part 3) on Convolutional Neural Networks by Sebastian Raschka
- Visual guide to convolution arithmetic in the context of deep learning, and associated paper
Tutorials and codes
- Super simple implementations of generative models in PyTorch and TensorFlow along with related blog posts
Useful other material
- Introduction to Deep Learning and Generative Models course - Sebastian Raschka (Spring 2020)
- Intro to Neural Networks and Machine Learning course - Roger Grosse