Midterm 1 (2025)
Assignment Rules and Execution
The first midterm covers the basic pattern recognition techniques introduced up to Lecture 4. To pass the midterm you should
- perform one (only one) of the assignments described below;
- report your results as a Colab notebook or a 10-slide presentation (both formats are equally fine) and upload it here by the (strict) deadline.
You can use library functions to perform the analysis (e.g. to compute the DFT, to perform ncut clustering, etc.) unless explicitly indicated otherwise. You can use whatever programming language you like, but I strongly suggest using either Python or Matlab, for which you have coding examples. Python will further allow you to deliver your midterm as a Colab notebook, but this is not a requirement (you can deliver a presentation instead).
Your report (irrespective of whether it is a notebook or a presentation) needs to cover at least the following aspects (different assignments might have different additional requirements):
- A title with the assignment number and your name
- The full code to run your analysis (for Colabs) or a few slides (for presentations) with code snippets highlighting the key aspects of your code
- A section reporting results of the analysis and your brief comments on it
- A final section with your personal considerations (fun things, weak aspects, possible ways to enhance the analysis, etc.).
Do not waste time and space describing the dataset or the assignment you are solving, as we are all already familiar with them.
I do not expect state-of-the-art performance: I am more interested in seeing that you have applied the technique correctly and that you can interpret the results, so do not invest heavy effort in over-optimizing your results.
List of Midterm Assignments
Signal processing assignments
Assignment 1
Consider the following dataset: https://archive.ics.uci.edu/ml/datasets/Appliances+energy+prediction#
As you know, auto-regressive models assume weak stationarity. What happens if this assumption does not hold? To study the effect of non-stationarity, we will add a linear trend to the “Appliances” column of the dataset, which measures the energy consumption of appliances across a period of 4.5 months.
- First, preprocess the dataset to remove any trend (if necessary)
- Perform an autoregressive analysis on the clean time series
- Define a linear trend (i.e. a linear increase over time of the time series observations) and add it to the time series
- Perform the autoregressive analysis on the new time series
Show and discuss the results of the analysis with and without the linear trend, motivating your design choices; a sketch of the detrending and trend-injection steps is given below.
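As a reference, here is a minimal sketch of the detrending and trend-injection steps, assuming the "Appliances" column has been loaded into a 1-D numpy array `y` (the variable name and the trend slope are illustrative assumptions, not prescriptions):

```python
import numpy as np

t = np.arange(len(y))                     # integer time index
slope, intercept = np.polyfit(t, y, 1)    # least-squares linear fit
y_clean = y - (slope * t + intercept)     # step 1: remove any existing trend
y_trended = y_clean + 0.05 * t            # step 3: inject a known linear trend
```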
To perform the autoregressive analysis, fit an autoregressive model on the first 3 months of data and estimate performance on the remaining 1.5 months. Remember to update the autoregressive model as you progress through the 1.5 testing months. For instance, if you have trained the model until time T, use it to predict at time T+1. Then, to predict at time T+2, retrain the model using data until time T+1. And so on. You might also experiment with a less computationally heavy retraining schedule (e.g. retraining only when necessary). You can use the autoregressive model of your choice (AR, ARMA, ...).
Hint: in Python, use the ARIMA class of the statsmodels library (set order=(3,0,0) for an AR of order 3); in Matlab you can use the ar function to fit the model and the forecast function to test.
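A minimal sketch of the rolling fit-predict loop with statsmodels, assuming `y` is the (possibly trended) series as a 1-D numpy array and that roughly the first two thirds of the samples cover the 3 training months:

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

split = int(len(y) * 2 / 3)                      # ~3 of the 4.5 months for training
history = list(y[:split])
preds = []
for t in range(split, len(y)):
    fit = ARIMA(history, order=(3, 0, 0)).fit()  # AR(3), refit at each step
    preds.append(fit.forecast(steps=1)[0])       # one-step-ahead prediction
    history.append(y[t])                         # reveal the true value, then move on
rmse = np.sqrt(np.mean((np.array(preds) - y[split:]) ** 2))
print(f"test RMSE: {rmse:.3f}")
```

Refitting at every step is the naive schedule and can be slow given the size of the test set; the lighter retraining schedules mentioned above only change how often the `.fit()` line is re-executed.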
Assignment 2
Consider the following dataset: https://www.kaggle.com/datasets/imsparsh/single-chestmounted-accelerometer
It contains accelerometer time series for 15 participants performing 7 different physical activities.
For this assignment, focus on a single participant (of your choice) and study their 7 different activities using the Continuous Wavelet Decomposition (CWD) approach discussed during the lectures. It suffices to compare the CWDs visually and report your considerations on similarities and differences between the activities.
Note: you need to run the CWD separately for each accelerometer channel. You can choose whichever wavelet family you prefer (picking one at random is also fine). In Python you may want to use PyWavelets; in Matlab, the bundled Wavelet Toolbox (or you can use any other language and library that makes sense to you, but for this assignment I strongly advise Matlab).
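For instance, with PyWavelets a per-channel scalogram can be produced along these lines (the channel array `x`, the scale range, and the Morlet wavelet are assumptions of this sketch, not requirements):

```python
import numpy as np
import pywt
import matplotlib.pyplot as plt

scales = np.arange(1, 128)                    # range of scales to inspect
coefs, freqs = pywt.cwt(x, scales, 'morl')    # CWT of one channel
plt.imshow(np.abs(coefs), aspect='auto', cmap='viridis',
           extent=[0, len(x), scales[-1], scales[0]])
plt.xlabel('sample'); plt.ylabel('scale')
plt.colorbar(label='|coefficient|')
plt.title('one channel / one activity')
plt.show()
```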
Assignment 3
Image processing assignments
All the image processing assignments require you to use the following dataset:
www.kaggle.com/datasets/ztaihong/weizmann-horse-database/data
The horse directory contains 327 pictures of horses; the mask directory contains one image for each picture in the horse directory, reporting the corresponding manual segmentation of the horse.
Assignment 4
Implement the convolution of a set of edge detection filters with an image and apply it to at least three images of your choice from the dataset. Implement the Roberts, Prewitt and Sobel filters (see here, Section 5.2, for a reference) and compare the results (it is sufficient to do so visually). You should not use library functions to perform the convolution or to generate the Sobel filter. Implement your own and show the code!
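A possible from-scratch structure, assuming grayscale images held as float numpy arrays (the kernel definitions below follow the standard textbook ones):

```python
import numpy as np

def conv2d(img, kernel):
    """Naive 'valid' 2-D convolution, no library calls."""
    k = np.flipud(np.fliplr(kernel))          # flip kernel: true convolution
    kh, kw = k.shape
    out = np.zeros((img.shape[0] - kh + 1, img.shape[1] - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * k)
    return out

# Standard kernel pairs (two orthogonal gradient components each).
filters = {
    'roberts': (np.array([[1, 0], [0, -1]]),
                np.array([[0, 1], [-1, 0]])),
    'prewitt': (np.array([[-1, 0, 1], [-1, 0, 1], [-1, 0, 1]]),
                np.array([[-1, -1, -1], [0, 0, 0], [1, 1, 1]])),
    'sobel':   (np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]),
                np.array([[-1, -2, -1], [0, 0, 0], [1, 2, 1]])),
}

def edge_magnitude(img, name):
    gx, gy = (conv2d(img, k) for k in filters[name])
    return np.hypot(gx, gy)                   # combined gradient magnitude
```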
Assignment 5
Extract SIFT descriptors from the horse pictures, using the keypoint detector embedded in SIFT to identify the points of interest. Aggregate all the extracted descriptors into a dataset and run k-means (or any clustering algorithm of your choice) on it to partition the descriptors into clusters. Then analyze the obtained clusters by comparing the descriptors assigned to each cluster with the region of the semantic segmentation they end up in (in other words, compute a confusion matrix between the clusters and the 2 segmentation classes, horse and background). Discuss your findings. The choice of the number of clusters and of the clustering algorithm is up to you (and should be discussed in the report).
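One way to set up the pipeline, assuming OpenCV's SIFT, scikit-learn's KMeans, matching sorted filenames in the horse/ and mask/ directories, and a placeholder K=10 (all assumptions of this sketch, to be replaced by your own choices):

```python
import cv2
import numpy as np
from glob import glob
from sklearn.cluster import KMeans

sift = cv2.SIFT_create()
descs, pixel_class = [], []                  # descriptors + horse/background flag
for img_path, mask_path in zip(sorted(glob('horse/*')), sorted(glob('mask/*'))):
    gray = cv2.imread(img_path, cv2.IMREAD_GRAYSCALE)
    mask = cv2.imread(mask_path, cv2.IMREAD_GRAYSCALE)
    kps, des = sift.detectAndCompute(gray, None)
    if des is None:
        continue
    for kp, d in zip(kps, des):
        col, row = map(int, kp.pt)           # keypoint position is (x, y)
        descs.append(d)
        pixel_class.append(1 if mask[row, col] > 0 else 0)  # 1 = horse

descs = np.array(descs)
K = 10                                       # placeholder: justify your own K
clusters = KMeans(n_clusters=K, random_state=0).fit_predict(descs)

conf = np.zeros((2, K), dtype=int)           # rows: background/horse, cols: cluster
for c, l in zip(clusters, pixel_class):
    conf[l, c] += 1
print(conf)
```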
Assignment 6
Implement the convolution of a Laplacian of Gaussian (LoG) blob detector with an image and apply it to 3-4 images of your choice from the dataset. Do not use library functions to implement the convolution or to generate the LoG filter. Implement your own and show the code (the interesting bits at least)! The function you implement should be able to run the LoG for different choices of the scale parameter, which is passed as an input argument. Show the results of your code on the 3-4 example images for different choices of the scale parameter (sigma).
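A sketch of the scale-parametrised kernel generation (the 3-sigma support size is my own choice; the convolution itself can reuse the hand-written routine from Assignment 4):

```python
import numpy as np

def log_kernel(sigma):
    """Laplacian-of-Gaussian kernel at scale sigma (unnormalised)."""
    half = int(np.ceil(3 * sigma))            # support of ~3 sigma per side
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    r2 = x**2 + y**2
    k = (r2 - 2 * sigma**2) / sigma**4 * np.exp(-r2 / (2 * sigma**2))
    return k - k.mean()                       # zero mean: no response on flat areas

# response = conv2d(img, log_kernel(sigma))   # e.g. for sigma in (1, 2, 4, 8)
```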