Overview: A Sparse Tour of Signal Processing


Signal and Image Processing with Orthogonal Decompositions

  • Continuous signals and images
  • Orthogonal representations
  • decomposition

    • energy conservation (Parseval)
    • Fourier and wavelets
    • 2D wavelet transform
  • Linear and nonlinear approximations

    • filtering
    • thresholding
    • compressibility and its relation to smoothness
  • Denoising and inverse problems

    • signal model
    • recovery by sparsity promotion
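
To make the decomposition and thresholding items above concrete, here is a minimal NumPy sketch (the orthonormal DFT stands in for a generic orthogonal basis, and the signal is synthetic): it checks energy conservation (Parseval) and builds a sparse approximation by keeping only the largest coefficients.

```python
# Minimal sketch: orthogonal decomposition, Parseval, and sparse approximation.
# Assumptions: NumPy only; the orthonormal DFT plays the role of the basis.
import numpy as np

n = 256
t = np.linspace(0.0, 1.0, n, endpoint=False)
x = np.sin(2 * np.pi * 5 * t) + 0.5 * (t > 0.5)               # smooth part plus a jump

c = np.fft.fft(x, norm="ortho")                                # coefficients in an orthonormal basis
print(np.allclose(np.sum(x ** 2), np.sum(np.abs(c) ** 2)))     # energy conservation (Parseval)

M = 20                                                         # keep the M largest coefficients
keep = np.argsort(np.abs(c))[::-1][:M]
c_M = np.zeros_like(c)
c_M[keep] = c[keep]
x_M = np.real(np.fft.ifft(c_M, norm="ortho"))                  # sparse (nonlinear) approximation
print("relative error:", np.linalg.norm(x - x_M) / np.linalg.norm(x))
```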

Reading material

Chapter 1 from Advanced Signal, Image and Surface Processing

Slides

Signal and Image Decomposition in Orthogonal Bases adapted from here.

Recordings


Fourier Processing

  • Continuous/discrete Fourier basis
  • Sampling
  • 2D Fourier transform
  • Fourier Approximation
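
A minimal sketch of the 2D Fourier transform and linear Fourier approximation, assuming a synthetic test image and NumPy's FFT; a fixed block of low frequencies is retained, which is the linear approximation discussed above.

```python
# Minimal sketch: 2D FFT and linear (low-pass) Fourier approximation.
# Assumptions: NumPy only; image, size n, and cutoff k are illustrative.
import numpy as np

n = 128
yy, xx = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
img = np.sin(2 * np.pi * 3 * xx / n) + (np.hypot(xx - n / 2, yy - n / 2) < n / 4)

F = np.fft.fftshift(np.fft.fft2(img, norm="ortho"))            # centred 2D spectrum

k = 16                                                         # half-width of the retained low-pass block
c = n // 2
mask = np.zeros((n, n), dtype=bool)
mask[c - k:c + k, c - k:c + k] = True                          # linear: a fixed low-frequency set
F_lin = np.where(mask, F, 0)
img_lin = np.real(np.fft.ifft2(np.fft.ifftshift(F_lin), norm="ortho"))
print("linear Fourier approximation error:",
      np.linalg.norm(img - img_lin) / np.linalg.norm(img))
```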

Reading material

Chapter 2 from Advanced Signal, Image and Surface Processing. If you want more detail read Fourier Transforms from Mathematical Foundations of Data Sciences.

Assignments

See assignment tab.

Slides

Fourier Processing adapted from here.


Wavelet Processing

  • 1D Multiresolutions
  • detail spaces
    • Haar wavelets
  • 1D Wavelet transform
    • computing wavelet coefficients
    • discrete wavelet coefficients
    • fast wavelet transform and inverse transform
  • Filter banks
    • approximation filter
    • detail filter
    • vanishing moments
    • Daubechies wavelets
  • Extension to 2D: anisotropic wavelets
    • 2D multiresolutions
    • 2D wavelet bases
    • 2D discrete wavelet coefficients
    • fast 2D wavelet transform
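
A minimal sketch of one level of the fast wavelet transform and its inverse, written directly with the Haar approximation (low-pass) and detail (high-pass) filters; it assumes NumPy only and a signal length that is a power of two.

```python
# Minimal sketch: one level of the fast (Haar) wavelet transform and its inverse.
# Assumptions: NumPy only; Haar filters; even signal length.
import numpy as np

def haar_step(x):
    """Split x into approximation and detail coefficients (one scale)."""
    s = 1.0 / np.sqrt(2.0)
    approx = s * (x[0::2] + x[1::2])    # approximation (low-pass) filter + subsampling
    detail = s * (x[0::2] - x[1::2])    # detail (high-pass) filter + subsampling
    return approx, detail

def haar_step_inv(approx, detail):
    """Reconstruct the finer-scale signal from one level of coefficients."""
    s = 1.0 / np.sqrt(2.0)
    x = np.empty(2 * approx.size)
    x[0::2] = s * (approx + detail)
    x[1::2] = s * (approx - detail)
    return x

x = np.random.randn(16)
a, d = haar_step(x)
print(np.allclose(haar_step_inv(a, d), x))                        # perfect reconstruction
print(np.allclose(np.sum(x ** 2), np.sum(a ** 2) + np.sum(d ** 2)))  # orthogonality
```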

Reading material

Chapter 3 from Advanced Signal, Image and Surface Processing. If you want more detail read Wavelets from Mathematical Foundations of Data Sciences.

Assignments

See assignment tab.

Slides

Wavelet Processing adapted from here.


Approximation with Orthogonal Decompositions

  • Linear and nonlinear approximations
    • Sparse approximation in a basis
    • hard thresholding
    • decay of approximation errors (linear vs. nonlinear)
    • Relation between the coefficient decay rate and the approximation error
    • Fourier for smooth functions
    • Fourier and singularities/edges
  • wavelet transform for piecewise-smooth 1D functions
    • vanishing moments
    • magnitude of wavelet coefficients and its relation to edges
    • decay and nonlinear approximation error for piecewise-smooth 1D functions
  • Piecewise-smooth 2D functions
    • 2D approximations: decay and nonlinear approximation error for piecewise-smooth 2D functions
  • Geometrically regular functions
    • space of BV functions
    • finite elements
    • curvelets, their construction, and properties
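
To illustrate linear versus nonlinear approximation, here is a small self-contained sketch using a hand-rolled Haar transform on a synthetic piecewise-smooth signal (NumPy only; the signal length is assumed to be a power of two). Keeping the coarsest M coefficients (linear) is compared with keeping the M largest (nonlinear).

```python
# Minimal sketch: linear vs. nonlinear approximation in a Haar wavelet basis.
# Assumptions: NumPy only; length a power of two; M is illustrative.
import numpy as np

def haar(x):
    """Full orthogonal Haar transform: [coarse approx, details coarse -> fine]."""
    x = x.astype(float)
    coeffs = []
    while x.size > 1:
        s = 1.0 / np.sqrt(2.0)
        a, d = s * (x[0::2] + x[1::2]), s * (x[0::2] - x[1::2])
        coeffs.append(d)
        x = a
    return np.concatenate([x] + coeffs[::-1])

def ihaar(c):
    """Inverse of haar()."""
    x = c[:1].copy()
    pos = 1
    s = 1.0 / np.sqrt(2.0)
    while pos < c.size:
        d = c[pos:pos + x.size]
        y = np.empty(2 * x.size)
        y[0::2], y[1::2] = s * (x + d), s * (x - d)
        x, pos = y, pos + d.size
    return x

n, M = 1024, 64
t = np.linspace(0, 1, n, endpoint=False)
f = np.sin(4 * np.pi * t) + (t > 0.37) - 0.8 * (t > 0.72)        # piecewise smooth

c = haar(f)
lin = c.copy()
lin[M:] = 0                                                       # linear: the M coarsest coefficients
nonlin = np.where(np.abs(c) >= np.sort(np.abs(c))[-M], c, 0)      # nonlinear: the M largest
for name, cc in [("linear", lin), ("nonlinear", nonlin)]:
    print(name, np.linalg.norm(f - ihaar(cc)) / np.linalg.norm(f))
```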

Reading material

Chapter 4 from Advanced Signal, Image and Surface Processing. If you want more detail read Linear and Non-linear Approximation from Mathematical Foundations of Data Sciences.

Assignments

See assignment tab.

Slides

Approximation and Coding with Orthogonal Decompositions adapted from here.


Linear and Nonlinear Denoising

  • Linear denoising
  • additive noise model
    • linear denoising by smoothing
    • Wiener filter and oracle estimation of optimal filter
  • Nonlinear denoising
    • hard thresholding
    • optimal threshold selection
    • nonlinear approximation and estimation
    • hard vs. soft thresholding
  • Translation-invariant thresholding
    • translation-invariant wavelets
    • optimal threshold
  • Other diagonal estimators
    • between hard and soft thresholding
  • Non-diagonal estimators
    • block thresholding
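
A minimal denoising sketch, with SciPy's orthonormal DCT standing in for a wavelet basis and the noise level assumed known: hard and soft thresholding of the coefficients with the universal threshold \(\sigma \sqrt{2 \log n}\).

```python
# Minimal sketch: denoising by hard and soft thresholding in an orthogonal basis.
# Assumptions: NumPy/SciPy only; the DCT stands in for a wavelet basis; sigma known.
import numpy as np
from scipy.fft import dct, idct

rng = np.random.default_rng(0)
n, sigma = 1024, 0.3
t = np.linspace(0, 1, n, endpoint=False)
f = np.sin(2 * np.pi * 4 * t) + (t > 0.5)                   # clean signal with a jump
y = f + sigma * rng.standard_normal(n)                       # additive Gaussian noise

c = dct(y, norm="ortho")                                     # coefficients in an orthonormal basis
T = sigma * np.sqrt(2 * np.log(n))                           # universal threshold

c_hard = np.where(np.abs(c) > T, c, 0.0)                     # hard: keep or kill
c_soft = np.sign(c) * np.maximum(np.abs(c) - T, 0.0)         # soft: shrink towards zero

for name, cc in [("noisy", c), ("hard", c_hard), ("soft", c_soft)]:
    print(name, np.linalg.norm(f - idct(cc, norm="ortho")) / np.linalg.norm(f))
```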

Reading material

Chapter 5 from Advanced Signal, Image and Surface Processing. If you want more detail read Denoising from Mathematical Foundations of Data Sciences.

Assignments

See assignment tab.

Slides

Linear and nonlinear denoising adapted from here.


Variational Regularization of Inverse Problems

  • Variational priors
  • smooth and cartoon
    • natural image priors
    • discretization
  • Variational regularization
    • regularized inverse
    • pseudo inverse
    • Sobolev regularization and inpainting
    • total variation regularization and inpainting
  • Example from tomography with the Radon transform
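
A minimal inpainting sketch under Sobolev regularization, assuming a synthetic cartoon image, a random observation mask, and periodic boundary conditions; the quadratic energy is minimized by plain gradient descent.

```python
# Minimal sketch: Sobolev-regularized inpainting by gradient descent on
#   0.5 * ||M*x - y||^2 + 0.5 * lam * ||grad x||^2, with M the inpainting mask.
# Assumptions: NumPy only; synthetic image; periodic boundaries; lam illustrative.
import numpy as np

rng = np.random.default_rng(0)
n, lam = 64, 0.2
yy, xx = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
img = (np.hypot(xx - n / 2, yy - n / 2) < n / 4).astype(float)   # cartoon image

mask = rng.random((n, n)) > 0.5                                  # observed pixels
y = np.where(mask, img, 0.0)                                     # masked observations

def laplacian(x):
    """Periodic discrete Laplacian."""
    return (np.roll(x, 1, 0) + np.roll(x, -1, 0) +
            np.roll(x, 1, 1) + np.roll(x, -1, 1) - 4 * x)

x = y.copy()
tau = 1.0 / (1.0 + 8.0 * lam)                                    # step size below 2/L
for _ in range(500):
    grad = mask * (x - y) - lam * laplacian(x)                   # gradient of the energy
    x -= tau * grad
print("inpainting error:", np.linalg.norm(x - img) / np.linalg.norm(img))
```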

Reading material

Chapter 6 from Advanced Signal, Image and Surface Processing. If you want more detail read Variational Priors and Regularization from Mathematical Foundations of Data Sciences.

Assignments

See assignment tab.

Slides

Inverse problems Regularization adapted from here.


Sparse Regularization of Inverse Problems

  • Linear inverse problems
    • denoising
    • inpainting
    • super-resolution
  • Regularization of inverse problems
    • regularized inverse
    • Lagrangian formulation, including the Lagrange multiplier / trade-off parameter \(\lambda\)
    • smooth and cartoon priors
  • Redundant dictionaries
  • Sparse priors
    • convex relaxation of the \(\ell_0\)-norm via the \(\ell_1\)-norm
    • \(\ell_1\)-regularization and sparse recovery
    • noise-free sparsity-promoting regularization
  • Iterative soft thresholding
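
A minimal sketch of iterative soft thresholding (ISTA) for \(\ell_1\)-regularized recovery, assuming a random Gaussian matrix as the forward operator and a synthetic sparse signal; the sizes and the parameter \(\lambda\) are illustrative.

```python
# Minimal sketch: ISTA for min_x 0.5*||Phi x - y||^2 + lam*||x||_1.
# Assumptions: NumPy only; Gaussian Phi; synthetic sparse x; lam illustrative.
import numpy as np

rng = np.random.default_rng(0)
m, n, k, lam = 80, 200, 8, 0.05

Phi = rng.standard_normal((m, n)) / np.sqrt(m)                  # forward operator
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
y = Phi @ x_true + 0.01 * rng.standard_normal(m)                # noisy measurements

tau = 1.0 / np.linalg.norm(Phi, 2) ** 2                         # step size 1/L
x = np.zeros(n)
for _ in range(2000):
    z = x - tau * Phi.T @ (Phi @ x - y)                         # gradient step on the data term
    x = np.sign(z) * np.maximum(np.abs(z) - tau * lam, 0.0)     # soft thresholding (prox of lam*||.||_1)
print("recovery error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
```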

Reading material

Chapter 7 from Advanced Signal, Image and Surface Processing. If you want more detail read Inverse Problems and Sparse Regularization from Mathematical Foundations of Data Sciences.

Assignments

See assignment tab.

Slides

Sparse regularization of Inverse Problems adapted from here.


Compressive Sensing

  • Classical sampling
  • discretization
    • point-wise sampling and smoothness
  • Compressive acquisition
    • example: the single-pixel camera
    • physical model
  • Inversion and sparsity
    • \(\ell_1\) prior
    • sparse CS recovery
  • Theoretical guarantees
    • CS recovery via the restricted isometry property (RIP)
    • singular-value distributions
    • RIP for Gaussian matrices
    • numerics with RIP
  • Fourier measurements
    • magnetic resonance imaging (MRI)
    • radar interferometry
    • structured measurements
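
In the spirit of the numerics-with-RIP item above, here is a small NumPy experiment (matrix sizes, sparsity level, and number of trials are illustrative): for a normalized Gaussian measurement matrix, the singular values of random k-column submatrices concentrate near 1, which is the near-isometry on sparse vectors behind CS recovery guarantees.

```python
# Minimal sketch: empirical restricted isometry behaviour of a Gaussian matrix.
# Assumptions: NumPy only; m, n, k, and the number of trials are illustrative.
import numpy as np

rng = np.random.default_rng(0)
m, n, k = 128, 512, 10
A = rng.standard_normal((m, n)) / np.sqrt(m)            # normalized Gaussian measurements

extremes = []
for _ in range(200):
    S = rng.choice(n, k, replace=False)                 # random support of size k
    sv = np.linalg.svd(A[:, S], compute_uv=False)
    extremes.append((sv.min(), sv.max()))
extremes = np.array(extremes)
print("singular values in [%.2f, %.2f]" % (extremes[:, 0].min(), extremes[:, 1].max()))
# Values close to 1 mean A acts as a near-isometry on k-sparse vectors (small RIP constant).
```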

Assignments

See assignment tab.

Reading material

Compressive Sensing Chapter from Mathematical Foundations of Data Sciences.

Slides

Compressive Sensing adapted from here.


Bayesian Inference

Reading material

Slides

Slide deck 1

Statistical Regularization

The review article Solving inverse problems using data-driven models serves as the main reading material. For a mathematical description of the February 6 and 11 lectures, refer to pages 22 to 43 of this article.

Learning in Functional Analytic Regularization

During the lecture of February 13, we start with section 5.1.2, Deep direct Bayes estimation, on page 84 of Solving inverse problems using data-driven models, up to Regularization by learning. We continue with the section Deep direct estimation of higher-order moments on page 99. Next, we consider loop unrolling, described in section 4.9, Data-driven optimization, on page 66 up to section 4.10, and later in section 5.1.4, Learned iterative schemes, on page 89 up to section 5.1.5. Supervised training, learned priors/regularizers, unsupervised learning, and semi-supervised learning are briefly introduced in section 5.1, Learning an estimator, on page 82.
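
To make the loop-unrolling idea concrete, here is a minimal, framework-free sketch of an unrolled proximal-gradient scheme. The per-iteration step sizes and thresholds are fixed here; in a learned iterative scheme they (or a small network replacing the proximal step) would be trained on example pairs. The operator A and all sizes are illustrative assumptions.

```python
# Minimal, framework-free sketch of loop unrolling (K fixed iterations).
# Assumptions: NumPy only; toy linear operator A; step/thresh fixed, not learned.
import numpy as np

rng = np.random.default_rng(0)
m, n, K = 60, 100, 10
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[rng.choice(n, 5, replace=False)] = 1.0
y = A @ x_true

step = np.full(K, 1.0 / np.linalg.norm(A, 2) ** 2)        # would be learnable parameters
thresh = np.full(K, 0.02)                                  # would be learnable parameters

x = np.zeros(n)
for k in range(K):                                         # fixed, unrolled number of iterations
    x = x - step[k] * A.T @ (A @ x - y)                    # data-consistency (gradient) step
    x = np.sign(x) * np.maximum(np.abs(x) - thresh[k], 0)  # hand-crafted prox; a CNN in learned schemes
print("unrolled reconstruction error:",
      np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
```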

Learning in Statistical Regularization

Slides

We use a combination of slides developed mainly by Ozan Oktem, Jonas Adler, and others. The slides presented in class can be found here.


Papers

Untrained Neural Networks

Loop-Unrolled Neural Networks

Normalizing Flows

Diffusion Models (Equivalent to certain SDE models)

Stochastic Differential Equation Generative Models (equivalent to certain diffusion models)

GANs

Compressed Sensing

Misc


General information generative neural networks

Slides on generative models

Ingredients of convolutional neural networks

Tutorials and codes

Useful other material
