🔁 Graphical models, Recurrent Neural Networks and the SIFT algorithm for image processing, signal analysis and time series forecasting (MD Course: Intelligent Systems for Pattern Recognition)


dilettagoglia/Signal-Processing


ISPR course

(Intelligent Systems for Pattern Recognition, A.Y. 2019/20, University of Pisa)

Midterm 1 - Assignment 5 (March 2020)

Image processing with SIFT algorithm

  • Select one image from each of the eight thematic subsets (see previous assignment), for a total of 8 images.
  • Extract the SIFT descriptors for the 8 images using the visual feature detector embedded in SIFT to identify the points of interest.
  • Show the resulting points of interest overlaid on the image.
  • Then provide a comparison between two SIFT descriptors that capture completely different information (e.g. a SIFT descriptor from a face portion vs. a SIFT descriptor from a tree image).
  • The comparison can be purely visual: for instance, you can plot the two SIFT descriptors side by side as bar plots (remember that SIFT descriptors are histograms). But you are free to pick any reasonable means of comparing the descriptors (even quantitatively, if you wish); see the sketch after this list.
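
A minimal sketch of this pipeline, assuming OpenCV's SIFT implementation (cv2.SIFT_create) and a hypothetical image path; the notebooks in this repository may differ in the details:

```python
import cv2
import matplotlib.pyplot as plt

# Load one image in grayscale (hypothetical path: replace with an image
# taken from one of the eight thematic subsets).
img = cv2.imread("images/sample.jpg", cv2.IMREAD_GRAYSCALE)

# Detect keypoints and compute the 128-dimensional SIFT descriptors.
sift = cv2.SIFT_create()
keypoints, descriptors = sift.detectAndCompute(img, None)

# Overlay the detected points of interest on the image.
img_kp = cv2.drawKeypoints(
    img, keypoints, None, flags=cv2.DRAW_MATCHES_FLAGS_DRAW_RICH_KEYPOINTS
)
plt.imshow(cv2.cvtColor(img_kp, cv2.COLOR_BGR2RGB))
plt.title(f"{len(keypoints)} SIFT keypoints")
plt.axis("off")
plt.show()

# Visual comparison of two descriptors as bar plots (in the assignment the
# two descriptors would come from two different images, e.g. face vs. tree;
# here both are taken from the same image for brevity).
fig, axes = plt.subplots(1, 2, figsize=(10, 3))
for ax, desc, label in zip(axes, descriptors[:2], ("A", "B")):
    ax.bar(range(128), desc)
    ax.set_title(f"SIFT descriptor {label}")
plt.tight_layout()
plt.show()
```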

Useful sources

Midterm 2 - Assignment 1 (April-May 2020)

Hidden Markov Models for regime detection

  • Fit a Hidden Markov Model with Gaussian emissions to the data in DSET1; it is sufficient to focus on the “Appliances” and “Lights” columns of the dataset, which measure the energy consumption of appliances and lights, respectively, across a period of 4.5 months.
  • Consider the two columns in isolation, i.e. train two separate HMMs, one for appliances and one for lights.
  • Experiment with HMMs with a varying number of hidden states (e.g. at least 2, 3 and 4).
  • Once the HMMs are trained, run the Viterbi algorithm on a reasonably sized subsequence (e.g. 1 month of data) and plot the time series data, highlighting (e.g. with different colours) the hidden state assigned to each time point.
  • Then, try sampling a sequence of at least 100 points from the trained HMMs, show it on a plot, and discuss similarities and differences w.r.t. the ground truth data; see the sketch after this list.
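
A minimal sketch of this pipeline using hmmlearn, assuming DSET1 is the UCI “Appliances energy prediction” dataset stored locally as energydata_complete.csv (a hypothetical filename), with one reading every 10 minutes:

```python
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
from hmmlearn.hmm import GaussianHMM

# Hypothetical local copy of DSET1; readings are taken every 10 minutes.
df = pd.read_csv("energydata_complete.csv")
appliances = df["Appliances"].to_numpy(dtype=float).reshape(-1, 1)

# Fit a Gaussian-emission HMM with a chosen number of hidden states
# (repeat for e.g. 2, 3 and 4 states, and for the lights column).
n_states = 3
model = GaussianHMM(n_components=n_states, covariance_type="diag", n_iter=100)
model.fit(appliances)

# Viterbi decoding on roughly one month of data (144 readings per day).
month = appliances[: 30 * 144]
states = model.predict(month)  # predict() runs the Viterbi algorithm

plt.figure(figsize=(12, 3))
for s in range(n_states):
    idx = np.where(states == s)[0]
    plt.scatter(idx, month[idx, 0], s=2, label=f"state {s}")
plt.legend()
plt.title("Viterbi state assignments (Appliances)")
plt.show()

# Sample a synthetic sequence of 100 points from the trained model and
# plot it against the ground-truth series for a qualitative comparison.
sampled, _ = model.sample(100)
plt.figure(figsize=(12, 3))
plt.plot(sampled[:, 0], label="sampled from HMM")
plt.plot(appliances[:100, 0], label="ground truth")
plt.legend()
plt.show()
```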

Useful sources

Midterm 3 - Assignment 3 (May 2020)

Gated RNN for time series prediction

DATASET

Train a gated recurrent neural network (LSTM) to predict energy expenditure (“Appliances” column) using two approaches:

  • Predict the current energy expenditure given as input the temperature (T_i) and humidity (RH_i) readings from all the i sensors in the house.
  • Set up a one-step-ahead predictor for energy expenditure, i.e. given the current energy consumption, predict its next value.

Show and compare the performance of both methods; a sketch of the one-step-ahead setup follows below.
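
A minimal sketch of the one-step-ahead variant (the second approach) using Keras, assuming the same hypothetical CSV as above and illustrative hyperparameters rather than the ones used in the actual notebook:

```python
import numpy as np
import pandas as pd
import tensorflow as tf

# Hypothetical local copy of the same dataset used in Midterm 2.
df = pd.read_csv("energydata_complete.csv")
series = df["Appliances"].to_numpy(dtype="float32")

# Normalise the series and build (input window, next value) pairs for
# one-step-ahead prediction.
series = (series - series.mean()) / series.std()
window = 24
X = np.stack([series[i : i + window] for i in range(len(series) - window)])
y = series[window:]
X = X[..., np.newaxis]  # shape: (samples, timesteps, features)

split = int(0.8 * len(X))
X_train, X_test = X[:split], X[split:]
y_train, y_test = y[:split], y[split:]

# Small gated RNN; for the first approach the input features at each
# timestep would instead be the T_i / RH_i sensor columns.
model = tf.keras.Sequential([
    tf.keras.layers.LSTM(32, input_shape=(window, 1)),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X_train, y_train, epochs=10, batch_size=64, validation_split=0.1)

print("test MSE:", model.evaluate(X_test, y_test, verbose=0))
```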

Useful sources

Midterm 4 - Final exam (June 2020)

CNN for video processing

A review of the paper:

Zhaofan Qiu, Ting Yao, Tao Mei, “Learning Spatio-Temporal Representation with Pseudo-3D Residual Networks”, Microsoft Research, Beijing, China, 2017, arxiv.org/abs/1711.10305

Further information

Grade obtained at the final exam: 30 cum laude
