October 8, 14:00-15:30,


                   Lecture 1. High Dynamic Range Imaging – Kari Pulli (NOKIA) (download PPT slides)

The eye can adapt to a dynamic range of light intensity from 10^-6 to 10^+8 cd/m^2.

Cameras have much less dynamic range: typically they can handle intensity ratios of only 1:1000, or at most 1:10000. Typical display devices, such as printed paper, can handle even lower ratios, below 1:100. This lecture covers how the human visual system deals with high dynamic range, multi-exposure photography for capturing high-dynamic-range images, various possibilities for displaying images, and tone mapping, which allows HDR images to be shown on low-dynamic-range displays so that viewers can still experience the original view as faithfully as possible. Material: Lecture notes; links to papers, code, tutorials, etc. on the web.
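The multi-exposure idea can be sketched in a few lines: each exposure gives an estimate z/t of scene radiance, the estimates are averaged with weights that distrust under- and over-exposed values, and a global tone-mapping curve compresses the result for a low-dynamic-range display. This is a minimal illustrative sketch, assuming a linear camera response; the triangle weight and the Reinhard-style operator are common textbook choices, not taken from the slides:

```python
# Minimal sketch of multi-exposure HDR merging and global tone mapping
# for a single pixel, assuming a linear camera response.

def triangle_weight(z, z_min=0.05, z_max=0.95):
    """Trust mid-range pixel values; distrust under-/over-exposed ones."""
    if z <= z_min or z >= z_max:
        return 1e-6  # tiny instead of zero, so the estimate stays defined
    return min(z - z_min, z_max - z)

def merge_hdr(values, exposure_times):
    """Weighted average of the per-exposure radiance estimates z / t."""
    num = sum(triangle_weight(z) * (z / t) for z, t in zip(values, exposure_times))
    den = sum(triangle_weight(z) for z in values)
    return num / den

def reinhard_tonemap(radiance):
    """Global Reinhard-style operator: compresses [0, inf) into [0, 1)."""
    return radiance / (1.0 + radiance)

# One pixel seen in three exposures (normalized sensor values, seconds):
radiance = merge_hdr([0.02, 0.40, 0.98], [0.01, 0.1, 1.0])
ldr = reinhard_tonemap(radiance)
```

A real pipeline would first recover the camera response curve, apply this per pixel and per channel, and typically use a more elaborate local tone-mapping operator.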


            Project 1.1.


October 13, 9:45-11:15,


                   Lecture 2. Imaging pipe description: From sensor data to an image – Ossi Kalevo (NOKIA) (download PDF slides)


This lecture covers the end-to-end imaging system. The aim of the lecture is to show what kinds of image processing algorithms and methods are needed when camera-captured images are used in different use cases, in order to provide an excellent user experience and high-quality end results. Algorithm details are not described, since they are discussed in more detail in other lectures. Simple visual example images are shown for most of the methods. In addition to the individual algorithm examples, it is shown how the methods can be connected together and what kinds of systems are needed for a full end-to-end imaging solution. A few alternative architectural solutions and block diagrams are also shown to improve the understanding of imaging systems and the image formats used. Lecture’s materials: Lecture notes.


1.      Introduction to camera system;

2.      Optics related image processing;

3.      Sensor related image processing;

4.      Display related image processing;

5.      General image processing;

6.      Image enhancements and features;

7.      Imaging architectures.
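As a toy illustration of how such processing stages connect into one chain, the sketch below pushes a single raw pixel triple through three representative stages (black-level correction, white balance, gamma encoding). The constants and the stage order are illustrative placeholders, not the actual pipeline discussed in the lecture:

```python
# Toy three-stage imaging chain for one pixel; all constants are illustrative.

def black_level(raw, pedestal=64, max_code=1023):
    """Subtract the sensor pedestal and normalize 10-bit raw to [0, 1]."""
    return max(raw - pedestal, 0) / (max_code - pedestal)

def white_balance(rgb, gains=(2.0, 1.0, 1.5)):
    """Per-channel gains, as an AWB algorithm would estimate (here fixed)."""
    return tuple(min(c * g, 1.0) for c, g in zip(rgb, gains))

def gamma_encode(rgb, gamma=2.2):
    """Display-referred gamma encoding for an sRGB-like output."""
    return tuple(c ** (1.0 / gamma) for c in rgb)

# Run the stages in order, as a block diagram would connect them:
rgb_linear = tuple(black_level(x) for x in (200, 400, 300))
rgb_out = gamma_encode(white_balance(rgb_linear))
```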


October 13, 12:15-13:45,


                   Lecture 3. Control of Exposure and Colour for Digital Cameras – Jarno Nikkanen (NOKIA)

(download PDF slides, part 1) (download PDF slides, part 2)


The main topics of the lecture are Automatic Exposure Control (AEC) and Automatic White Balancing (AWB) algorithms for consumer-level digital cameras such as mobile camera phones. Additional pre- and post-processing algorithms that are important for achieving the desired exposure and colours are discussed as well. These include linearization, vignetting elimination, colour conversion, contrast and brightness enhancement, and colour appearance modeling algorithms.


1.      Overview of the system from colour and exposure point of view

2.      Linearization

3.      Vignetting elimination

4.      Automatic Exposure Control

5.      Automatic White Balancing

6.      Brightness and contrast enhancement

7.      Colour conversion

8.      Colour Appearance Modeling
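As one concrete example of an AWB principle, the classic gray-world algorithm assumes the scene averages to neutral gray and scales each channel so the channel means match. A minimal sketch (the anchor-on-green choice and the test scene are illustrative, not the algorithm covered in the lecture):

```python
# Gray-world automatic white balancing sketch on a list of RGB tuples.

def gray_world_gains(pixels):
    """Gray-world assumption: the scene averages to neutral gray, so scale
    each channel to equalize the channel means (green as the anchor)."""
    n = len(pixels)
    means = [sum(p[c] for p in pixels) / n for c in range(3)]
    return tuple(means[1] / m for m in means)

def apply_gains(pixels, gains):
    """Multiply every pixel channel-wise by the estimated gains."""
    return [tuple(v * g for v, g in zip(p, gains)) for p in pixels]

# A color-cast scene: red mean 0.2, green mean 0.4, blue mean 0.1
scene = [(0.2, 0.4, 0.1), (0.3, 0.5, 0.2), (0.1, 0.3, 0.0)]
gains = gray_world_gains(scene)
balanced = apply_gains(scene, gains)
```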


Projects 3.1, 3.2, 3.3, 3.4. 

Extra material for Projects 3.1, 3.2, 3.3.

Extra material for Project 3.4.


October 20, 9:45-11:15,


                   Lecture 4. Noise: modeling, estimation and transformations - Alessandro Foi (TUT/SGN) (download PDF slides)


The lecture(s) cover the main theoretical and practical aspects related to the modeling and estimation of noise in digital images.  Generic one-parameter families of distributions are used as the essential mathematical setting for the observed signals. A number of practical methods for noise estimation suitable for the most relevant noise models are presented. Further, nonlinear homomorphic transformations of noisy data are introduced and demonstrated as important preprocessing steps to facilitate further operations and especially filtering. Great emphasis is given to the modeling, estimation, and transformation of raw-data noise. In particular, a complete denoising chain for raw-data from an unknown sensor is illustrated in detail. Matlab software is provided as a companion to each topic.


1.      Modeling.   Independence, whiteness, colored noise, homoskedasticity vs. heteroskedasticity, signal-dependent noise models, standard-deviation curve, Poisson noise, quantization, saturation and clipping. Noise models for raw-data.

2.      Estimation.   Basic estimators: sample standard-deviation, mean and median of absolute deviation, unbiasedness and asymptotic unbiasedness of finite-sample estimators. Estimation in transform domain. Noise estimation from multiple images or from single image: homoskedastic noise, signal-dependent noise. Noise estimation for raw-data.

3.      Transformations.   Normalization and variance stabilization: generic stabilization, Anscombe transformation, debiasing, optimal non-parametric stabilizers. Application to raw-data denoising.
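Topic 3 can be made concrete with the Anscombe transformation, f(z) = 2·sqrt(z + 3/8), which maps Poisson-distributed data to approximately unit variance (for moderate and large mean). The sketch below checks the stabilization numerically; the simple Knuth-style Poisson sampler is included only to keep the demonstration self-contained:

```python
import math
import random

def anscombe(z):
    """Anscombe transformation: Poisson data -> approximately unit variance."""
    return 2.0 * math.sqrt(z + 3.0 / 8.0)

def poisson_sample(lam, rng):
    """Knuth's simple Poisson sampler (adequate for moderate lambda)."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

# The raw variance grows with lambda, but the stabilized variance stays near 1:
rng = random.Random(0)
variances = {}
for lam in (20.0, 100.0):
    stabilized = [anscombe(poisson_sample(lam, rng)) for _ in range(4000)]
    mean = sum(stabilized) / len(stabilized)
    variances[lam] = sum((x - mean) ** 2 for x in stabilized) / len(stabilized)
```

After filtering in the stabilized domain, an exact unbiased inverse (debiasing) is needed to return to the original intensity scale.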


Projects 4.1, 4.2, 4.3.


October 20, 12:15-13:45,


                   Lecture 5. Sampling and interpolation on uniform and non-uniform grids - Atanas Gotchev (TUT/SGN) (download PDF slides)


The lecture will address sampling as a problem of projection onto respective function spaces, with splines and trigonometric polynomials as the most notable examples. Reconstruction is then cast as a frame problem solvable by least-squares-based approaches. Sampling on both uniform and non-uniform grids will be addressed. Recent achievements in so-called 'sampling with finite rate of innovation' will be reviewed and related to the previously described framework. After reviewing the basic theory, practical recipes for performing high-quality and efficient resampling (interpolation) in different cases will be presented.


1.      Basics of sampling of functions of continuous variable. Sampling theorems. Sampling regarded as projection on a certain function space. Reconstruction from samples (interpolation).

2.      Non-ideal sampling and reconstruction. Sampling and reconstruction kernels. Strang-Fix theory and approximation order. Sampling with finite rate of innovation.

3.      Sampling of multi-dimensional signals. Multi-dimensional sampling grids and non-separable kernels.

4.      Non-uniform sampling and interpolation. Sampling theorems. Relations with Frame theory.

5.      Practical schemes for sampling and interpolation. Sampling and interpolation artefacts. Design of efficient interpolation kernels.
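As a small practical example of an efficient interpolation kernel, the sketch below implements Keys' cubic convolution kernel (a = -0.5), a standard choice for high-quality resampling. Border handling by clamping is an illustrative simplification:

```python
import math

def keys_kernel(x, a=-0.5):
    """Keys' cubic convolution kernel; a = -0.5 gives third-order accuracy."""
    x = abs(x)
    if x < 1.0:
        return (a + 2.0) * x ** 3 - (a + 3.0) * x ** 2 + 1.0
    if x < 2.0:
        return a * x ** 3 - 5.0 * a * x ** 2 + 8.0 * a * x - 4.0 * a
    return 0.0

def resample(samples, t):
    """Value at continuous position t: convolve the 4 nearest samples with
    the kernel; borders handled by clamping (a simplification)."""
    i0 = int(math.floor(t))
    total = 0.0
    for i in range(i0 - 1, i0 + 3):
        s = samples[min(max(i, 0), len(samples) - 1)]
        total += s * keys_kernel(t - i)
    return total
```

The kernel interpolates (it reproduces the samples exactly at integer positions) and reproduces polynomials up to degree two in the interior, which is what the Strang-Fix conditions quantify as approximation order.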


Projects 5.1, 5.2.


October 27, 9:45-11:15,


                   Lecture 6. Modern Image Restoration. Part 1. Local techniques. – Vladimir Katkovnik and Karen Egiazarian (TUT/SGN) (download PDF slides)


This part of the course (two lectures) presents modern, advanced developments in image restoration from noisy data. The lectures cover basic theoretical and practical aspects of the techniques. A wide scope of imaging problems is considered, including denoising and deblurring of Gaussian and non-Gaussian images. MATLAB software is provided to illustrate the ideas and the performance of the algorithms.

I (a) Local polynomial approximations

The concept of spatially adaptive local polynomial approximation (LPA) has been developed to deal with image intensity signals. This type of method searches for the largest local neighborhood where the LPA fits the data well. It is applied in a pixel-wise manner and defines a nonlinear, varying-scale (window size and shape) adaptive filter. The adaptation is based on the recent intersection of confidence intervals (ICI) algorithm.

I (b) Transform domain redundant techniques

Full-rank high-order models with a large number of basis functions (typically non-polynomial) are exploited in redundant approximations. For orthogonal basis functions this modeling is treated as a transform-domain representation, with filtering performed in the spectrum (transform) domain. The data are processed in overlapping subsets, i.e. windows or blocks, and multiple estimates are obtained for each pixel. Typically, estimation is composed of three successive steps: first, data windowing (blocking); second, window-wise processing; and third, calculation of the final estimate by aggregating (fusing) the multiple window-wise estimates. It is found that this sort of redundant approximation with multiple estimates for each pixel essentially improves performance and yields the best state-of-the-art algorithms.

The shape-adaptive DCT algorithms are considered in detail.


1) LPA smoothing and differentiating filter design: window function and estimate scale, accuracy, frequency-domain implementation;

2) Adaptive window size selection: ICI algorithm, complexity and implementation;

3) Anisotropic directional LPA: directional image processing, directional LPA, space and frequency domains of the directional LPA filters;

4) Denoising and deblurring algorithms;

5) DCT shape-adaptive algorithms.
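A 1-D toy version of the transform-domain approach described above (overlapping windows, hard thresholding in an orthonormal DCT basis, and aggregation of the multiple per-sample estimates) can be sketched as follows. It is a simplified illustration of the three-step scheme, not the shape-adaptive DCT algorithm itself:

```python
import math

def dct(block):
    """Orthonormal DCT-II of a 1-D block."""
    n = len(block)
    return [(math.sqrt(1.0 / n) if k == 0 else math.sqrt(2.0 / n))
            * sum(block[i] * math.cos(math.pi * k * (2 * i + 1) / (2 * n))
                  for i in range(n))
            for k in range(n)]

def idct(coefs):
    """Inverse of the orthonormal DCT-II above."""
    n = len(coefs)
    return [sum((math.sqrt(1.0 / n) if k == 0 else math.sqrt(2.0 / n))
                * coefs[k] * math.cos(math.pi * k * (2 * i + 1) / (2 * n))
                for k in range(n))
            for i in range(n)]

def denoise_1d(noisy, win=8, thr=3.0):
    """Sliding-window DCT denoising: window the data, hard-threshold each
    window in the transform domain, aggregate overlapping estimates."""
    n = len(noisy)
    acc, cnt = [0.0] * n, [0] * n
    for start in range(n - win + 1):
        coefs = dct(noisy[start:start + win])
        # hard thresholding: keep the DC term and large coefficients only
        coefs = [c if k == 0 or abs(c) > thr else 0.0
                 for k, c in enumerate(coefs)]
        for j, v in enumerate(idct(coefs)):
            acc[start + j] += v
            cnt[start + j] += 1
    return [a / c for a, c in zip(acc, cnt)]

# Constant signal with a small deterministic perturbation:
noisy = [5.0 + (0.5 if i % 2 == 0 else -0.5) for i in range(16)]
denoised = denoise_1d(noisy)
```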


Projects 6.1, 6.2, 6.3, 6.4, 6.5.


October 27, 12:15-13:45,


                   Lecture 7. Image Quality – Tero Vuori (NOKIA) (download PDF slides) (download related paper)


The lecture is an overview of image quality: what it is, and how to measure it. From the outset, it is important to understand that there is no single unique measure or definition of image quality, due to its subjective nature and the complexity of the visual process. The subjective impression of how good an image is varies between human observers. Technically, image quality is the global result of the whole imaging system: from scene, optics, sensor, coding, transmission, decoding, and display to the human eye and brain. So it is a matter of technology, but also of human perception. Image quality is usually divided into subjective and objective image quality. By subjective image quality, we refer to visual image quality: what people see and prefer. Technology is made for humans, so it is important to please the human eye. By objective image quality, we refer to measurement techniques that use some algorithm to evaluate image quality. The algorithms try to see the image as the human eye does, as far as possible. Usually, objective metrics use some target image, and then compare the captured image to the known target.


-       Image quality concept

-       Subjective image quality

-       Objective image quality
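A standard objective, full-reference metric of the kind mentioned above is the peak signal-to-noise ratio (PSNR), which compares the captured image to the known target via the mean squared error. A minimal sketch on flattened pixel lists:

```python
import math

def psnr(reference, test, peak=255.0):
    """Peak signal-to-noise ratio in dB; higher means closer to the reference."""
    mse = sum((r - t) ** 2 for r, t in zip(reference, test)) / len(reference)
    if mse == 0.0:
        return float("inf")  # identical images
    return 10.0 * math.log10(peak ** 2 / mse)
```

PSNR correlates only loosely with what observers prefer, which is exactly why the subjective/objective distinction above matters.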


-       Nyman, J. Radun, T. Leisti, and T. Vuori (2005). From Image Fidelity to Subjective Quality: A Hybrid Qualitative/Quantitative Methodology for Measuring Subjective Image Quality for Different Image Contents. Proc. 12th International Display Workshops (IDW 2005), Takamatsu, Japan.




November 3, 9:45-11:15,


                   Lecture 8. Imaging sensor technology – Juha Alakarhu (NOKIA) (download PDF slides)


- Illumination, photons

- Pixel technology: optical stack, electrical structure

- Color filtering

- Main sensor performance parameters

- System low light performance
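The low-light topic above can be illustrated with a toy SNR model in which photon shot noise has variance equal to the signal (in electrons) and adds to a fixed read noise. The quantum efficiency and read-noise figures below are illustrative placeholders, not values from the slides:

```python
import math

def sensor_snr_db(photons, qe=0.5, read_noise_e=3.0):
    """Toy pixel SNR model: signal electrons vs. shot noise plus read noise.
    qe and read_noise_e are illustrative placeholder values."""
    signal_e = qe * photons
    noise_e = math.sqrt(signal_e + read_noise_e ** 2)  # shot variance = signal
    return 20.0 * math.log10(signal_e / noise_e)
```

In the shot-noise-limited regime SNR grows as the square root of the collected photons, which is why pixel size and exposure time dominate low-light performance.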


November 3, 12:15-13:45,


                   Lecture 9. Color filter array interpolation – Dmitriy Paliy (NOKIA, Helsinki) (download PDF slides)


Simultaneous denoising and interpolation of noisy Bayer-patterned data is considered. The aim is to reconstruct a full-resolution RGB image. The developed technique is specifically targeted at filtering signal-dependent, e.g. Poissonian or heteroskedastic, noise, and it effectively exploits the correlation between the color channels. The joint technique for denoising and interpolation is based on the concepts of local polynomial approximation (LPA) and intersection of confidence intervals (ICI). The filters utilize spatial information from the green, red, and blue color channels. This is done by a linear combination of zero-order smoothing kernels (for the given color) and first-order derivative kernels (for the missing color) designed for the subsampled data grid. With these filters, the denoised and interpolated estimates are obtained by convolutions over the Bayer data. The ICI rule is used for data-adaptive selection of the length of the designed cross-color directional filter. Fusing estimates from multiple directions provides the denoised and interpolated values. The full-size RGB image is obtained by placing these values into the corresponding positions in the image grid. The efficiency of the proposed approach is demonstrated by experimental results with simulated and real camera data.


1) Image formation model: Bayer’s sampling, random noise;

2) Directional LPA filters and interpolators;

3) ICI adaptive window size denoising and interpolation for Poissonian Bayer data;

4) Algorithm’s organization.
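As a baseline contrast to the adaptive technique described above, plain bilinear demosaicking estimates each missing color as the average of its nearest CFA neighbors. A minimal sketch for the green channel on an RGGB Bayer layout (the pattern choice and border handling are illustrative):

```python
# Bilinear green-channel interpolation on an RGGB Bayer mosaic (baseline
# method for comparison; not the LPA-ICI technique from the lecture).

def bayer_color(r, c):
    """RGGB pattern: which color does the CFA sample at (r, c)?"""
    if r % 2 == 0:
        return 'R' if c % 2 == 0 else 'G'
    return 'G' if c % 2 == 0 else 'B'

def interp_green(raw, r, c):
    """Bilinear estimate of the missing green value at an R or B site:
    average of the 4-connected neighbors, which are all green."""
    h, w = len(raw), len(raw[0])
    neigh = [raw[r + dr][c + dc]
             for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1))
             if 0 <= r + dr < h and 0 <= c + dc < w]
    return sum(neigh) / len(neigh)

# 3x3 mosaic: the center (1, 1) is a blue site, its 4 neighbors are green.
raw = [[5, 10, 5],
       [40, 7, 20],
       [5, 30, 5]]
green_at_center = interp_green(raw, 1, 1)
```

Such non-adaptive averaging blurs edges and amplifies noise correlations, which is what the directional, data-adaptive filtering above is designed to avoid.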

Lecture’s materials:

(1) V. Katkovnik, K. Egiazarian, J. Astola, Local Approximation Techniques in Signal and Image Processing (SPIE PRESS, Bellingham, Washington, 2006).

(2) Lecture notes;

(3) Papers;

(4) Matlab software implementing presented algorithms.


Projects 9.1, 9.2.



November 3, 13:45-14:15, Project works assignment



November 10, 9:45-11:15,


                   Lecture 10. Modern Image Restoration. Part 2. Nonlocal techniques. – Vladimir Katkovnik and Karen Egiazarian (TUT/SGN) (download PDF slides)


Usually, an image reconstruction algorithm uses observations in a neighborhood of the pixel of interest. This is the basic point of what we call local techniques. In nonlocal techniques, the algorithm analyses the data "in the large" and collects observations from the whole image, looking for the similar features typical of real-life images. The most advanced modern techniques implement this sort of nonlocal processing. The evolution of nonlocal techniques from sample means to transform-domain processing is considered. The latter algorithms are redundant: the data are processed in overlapping blocks, and the multiple estimates obtained for each pixel are fused (aggregated) into the final image estimate.

Block-Matching with 3D Filtering (BM3D) is considered in detail.


1) From local to nonlocal approximations;

2) Nonlocal means, block-matching filtering;

3) Block matching with collaborative filtering;

4) Performance of the algorithms for denoising and deblurring, on Gaussian and non-Gaussian data.
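The nonlocal-means idea of topic 2 can be sketched in 1-D: every sample is replaced by a weighted average of samples whose surrounding patches look similar, with weights decaying exponentially in the patch distance. The patch size, search radius, and filtering parameter h below are illustrative:

```python
import math

def nl_means_1d(signal, patch=1, search=5, h=0.5):
    """Nonlocal means in 1-D: average samples whose patches resemble the
    patch around the current sample; weights decay with patch distance."""
    n = len(signal)
    out = []
    for i in range(n):
        wsum, vsum = 0.0, 0.0
        for j in range(max(0, i - search), min(n, i + search + 1)):
            dist = 0.0  # squared distance between the two patches
            for k in range(-patch, patch + 1):
                a = signal[min(max(i + k, 0), n - 1)]
                b = signal[min(max(j + k, 0), n - 1)]
                dist += (a - b) ** 2
            w = math.exp(-dist / (h * h))
            wsum += w
            vsum += w * signal[j]
        out.append(vsum / wsum)
    return out

# Two flat regions with small fluctuations; only similar samples average:
noisy = [1.0, 1.2, 0.9, 1.1, 5.0, 5.1, 4.9, 5.2]
denoised = nl_means_1d(noisy)
```

BM3D goes one step further: instead of averaging similar patches directly, it stacks them into 3-D groups and filters the groups collaboratively in a transform domain.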

Lecture’s materials:

(1) V. Katkovnik, K. Egiazarian, J. Astola, Local Approximation Techniques in Signal and Image Processing (SPIE PRESS, Bellingham, Washington, 2006).

(2) Lecture notes;

(3) Papers;

(4) Matlab software implementing presented algorithms.


Projects 10.1, 10.2.


November 10, 12:15-13:45,


                   Lecture 11. 3D image capture and display technology - Atanas Gotchev (TUT/SGN) (download PDF slides)


The lecture will review the state of the art in methods, technologies, and standards for capturing, representing, and displaying 3D moving scenes. It will emphasize the challenging problems from a signal and image processing point of view.


1.      Why 3D? 3D video representation formats: stereo video, multi-view video, dense depth maps. Rendering. Inter-view interpolation.

2.      Capture of 3D scenes. Stereo and multi-view capture. Multi-sensor capture of video and depth. Dense depth map estimation. Depth from stereo. Structure from motion.

3.      3D display technologies. Stereo with glasses. Stereoscopic displays. Auto-stereoscopic display. Display-specific artifacts.

4.      Image processing for auto-stereoscopic displays. Anti-aliasing filtering. Cross-talk mitigation.
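Dense depth-from-stereo (topic 2) ultimately reduces to finding, for each pixel, the horizontal shift (disparity) that best aligns the left and right views; depth is then inversely proportional to disparity. A minimal sum-of-absolute-differences block-matching sketch along one scanline (block size and disparity range are illustrative):

```python
def disparity_1d(left_row, right_row, x, block=2, max_disp=5):
    """SAD block matching: find the disparity d minimizing the cost of
    matching the block around left[x] against the block around right[x - d]."""
    n = len(left_row)
    best_d, best_cost = 0, float("inf")
    for d in range(max_disp + 1):
        cost = 0.0
        for k in range(-block, block + 1):
            xl = min(max(x + k, 0), n - 1)       # clamp at image borders
            xr = min(max(x + k - d, 0), n - 1)
            cost += abs(left_row[xl] - right_row[xr])
        if cost < best_cost:
            best_cost, best_d = cost, d
    return best_d

# A bright feature at right[3] appears at left[5]: the true disparity is 2.
right_row = [0, 0, 0, 9, 0, 0, 0, 0, 0, 0]
left_row = [0, 0, 0, 0, 0, 9, 0, 0, 0, 0]
d = disparity_1d(left_row, right_row, 5)
```

Real dense estimation adds regularization across pixels, since textureless regions make the per-pixel cost ambiguous.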


November 17, 9:45-11:15,


                   Lecture 12.  Automatic focusing system in mobile imaging devices. - Vitali Samurov & Evgeny Krestyannikov (NOKIA) (download PDF slides)


To address a wide spectrum of customer needs, mobile phones are rapidly converging into multipurpose devices that incorporate multiple different products. A digital camera is nowadays often seen as an integral part of any mobile phone. Currently, imaging phones are less capable than their stand-alone counterparts, i.e. digital still cameras. To close this gap, cameraphones must offer many sophisticated features and much technological know-how in order to meet customers' expectations and ensure high-quality photographs.

Autofocus (AF) is one of the most important functions to be included in a camera system. It allows the user to obtain correct focus on the subject of interest (i.e. a "sharp" image) without manually adjusting the lens position.

The lecture gives a brief introduction to automatic focusing algorithms used in modern mobile phones and digital still cameras. The aim is to provide the audience with a basic understanding of how AF works within the camera module, and how all the key components fit together. The main emphasis of the lecture will be on the optical system, the image signal processor (ISP) gathering the needed statistics from the image, and the AF algorithms that ensure the choice of the correct focusing distance.
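The contrast-based AF loop can be sketched as a sweep over lens positions that maximizes a sharpness statistic of the kind an ISP would compute. The toy defocus model below (blur width growing with distance from the in-focus position) is purely illustrative:

```python
# Contrast-based autofocus sketch: sweep lens positions, pick the sharpest.

def focus_measure(row):
    """Contrast statistic: sum of squared first differences (sharper = larger)."""
    return sum((b - a) ** 2 for a, b in zip(row, row[1:]))

def autofocus(capture, positions):
    """Sweep lens positions and keep the one maximizing the focus measure."""
    return max(positions, key=lambda p: focus_measure(capture(p)))

# Toy lens model: defocus blur is a moving average whose width grows with
# the distance from the (hypothetical) in-focus position 3.
scene = [0, 0, 10, 0, 0, 10, 0, 0]

def capture(p):
    w = abs(p - 3) + 1
    return [sum(scene[max(0, i - w + 1):i + 1]) / w for i in range(len(scene))]

best_position = autofocus(capture, range(7))
```

Production algorithms replace the full sweep with hill climbing or coarse-to-fine search to keep the lens motion, and hence the focusing time, short.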


November 17, 12:15-13:45, Project discussions meeting