
Neurocomputing

Julien Vitay
Chemnitz University of Technology

Abstract
This website contains the materials for the module Neurocomputing, covering the basics of machine learning, deep learning, and neuro-AI.

Lectures

You will find below the links to the slides for each lecture (html and pdf).

1 - Introduction

Slides
1.1 - Introduction
Introduction to the main concepts of machine learning and a showcase of its current applications.
html, pdf
1.2 - (optional) Basics in math
Mathematical background necessary to follow this course.
html, pdf
1.3 - Neurons
Quick journey from biological neurons to artificial neurons.
html, pdf

2 - Linear algorithms

Slides
2.1 - Optimization
Overview of gradient descent and regularization.
html, pdf
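
As a taste of what this lecture covers, here is a minimal sketch of gradient descent on a one-dimensional quadratic loss (the function and learning rate are illustrative choices, not taken from the slides):

```python
# Minimal gradient descent on the quadratic loss f(w) = (w - 3)^2.
def grad(w):
    return 2.0 * (w - 3.0)   # derivative f'(w)

w = 0.0      # initial parameter value
eta = 0.1    # learning rate
for _ in range(100):
    w -= eta * grad(w)       # update rule: w <- w - eta * f'(w)

print(w)     # converges towards the minimum at w = 3
```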
2.2 - Linear regression
Linear regression, multiple linear regression, logistic regression, polynomial regression and how to evaluate them.
html, pdf
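
For a quick preview, the sketch below fits a simple linear regression with scikit-learn on synthetic data (the toy data generation is illustrative):

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Toy data following y = 2x + 1 plus Gaussian noise (illustrative).
rng = np.random.default_rng(42)
X = rng.uniform(0, 10, size=(100, 1))
y = 2 * X[:, 0] + 1 + rng.normal(0, 0.5, size=100)

reg = LinearRegression().fit(X, y)
print(reg.coef_, reg.intercept_)  # should be close to [2.] and 1.
```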
2.3 - Linear classification
Hard linear classification, Maximum Likelihood Estimation, soft linear classification, and multi-class softmax classification.
html, pdf
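
The softmax function at the heart of multi-class classification fits in a few lines of numpy (a sketch; the scores are made up):

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())   # subtract the max for numerical stability
    return e / e.sum()

scores = np.array([2.0, 1.0, 0.1])  # raw class scores (logits)
print(softmax(scores))              # class probabilities summing to 1
```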
2.4 - Learning theory
Vapnik-Chervonenkis dimension, Cover's theorem, feature spaces, and kernel methods.
html, pdf

3 - Deep learning

Slides
3.1 - Feedforward neural networks
Basic neural networks, a.k.a. multi-layer perceptrons (MLP), and the almighty backpropagation algorithm.
html, pdf
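
To give an idea of what backpropagation boils down to, here is a minimal numpy sketch of a one-hidden-layer MLP trained on XOR (layer sizes, learning rate, and number of epochs are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR problem: not linearly separable, so a hidden layer is needed.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

# One hidden layer of 8 tanh units, one sigmoid output unit.
W1 = rng.normal(0, 1, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 1, (8, 1)); b2 = np.zeros(1)
eta = 0.5

for epoch in range(2000):
    # Forward pass
    h = np.tanh(X @ W1 + b1)
    y = 1 / (1 + np.exp(-(h @ W2 + b2)))
    # Backward pass: propagate the output error through the layers
    delta2 = (y - T) * y * (1 - y)           # output layer error
    delta1 = (delta2 @ W2.T) * (1 - h**2)    # backpropagated to the hidden layer
    # Gradient descent updates
    W2 -= eta * h.T @ delta2; b2 -= eta * delta2.sum(axis=0)
    W1 -= eta * X.T @ delta1; b1 -= eta * delta1.sum(axis=0)

print(y.round(2))  # should approach [[0], [1], [1], [0]]
```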
3.2 - Modern neural networks
Advanced methods for training neural networks: optimizers, activation functions, normalization, etc.
html, pdf
3.3 - Convolutional neural networks
CNNs such as AlexNet and its successors (VGG, ResNet, Inception) started the deep learning hype and revolutionized computer vision.
html, pdf
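
A sketch of a small convolutional network in Keras, in the spirit of the MNIST exercise below (the exact architecture is an illustrative choice, not the one from the slides):

```python
import tensorflow as tf

# Small CNN for 28x28 grayscale images (e.g. MNIST): convolutions extract
# local features, pooling downsamples, and a dense softmax layer classifies.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(28, 28, 1)),
    tf.keras.layers.Conv2D(32, 3, activation='relu'),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(64, 3, activation='relu'),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(10, activation='softmax'),
])
model.compile(optimizer='adam', loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])
model.summary()
```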
3.4 - Object detection
Object detection networks (R-CNN, YOLO, SSD) are able to locate objects in an image.
html, pdf
3.5 - Segmentation networks
Segmentation networks (U-Net) can tell which pixels belong to an object.
html, pdf
3.6 - Autoencoders
Autoencoders and variational autoencoders (VAE) can be used to extract latent representations from raw data.
html, pdf
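
As a preview, a plain (non-variational) autoencoder is simply a network trained to reconstruct its input through a low-dimensional bottleneck; a hypothetical Keras sketch with illustrative layer sizes:

```python
import tensorflow as tf

# Plain autoencoder: compress 784-dimensional inputs (e.g. flattened MNIST
# digits) to a 32-dimensional latent code, then reconstruct them.
autoencoder = tf.keras.Sequential([
    tf.keras.Input(shape=(784,)),
    tf.keras.layers.Dense(128, activation='relu'),    # encoder
    tf.keras.layers.Dense(32, activation='relu'),     # latent representation
    tf.keras.layers.Dense(128, activation='relu'),    # decoder
    tf.keras.layers.Dense(784, activation='sigmoid')  # reconstruction
])
autoencoder.compile(optimizer='adam', loss='binary_crossentropy')
# Trained to reproduce its own input: autoencoder.fit(X, X, ...)
```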
3.7 - Restricted Boltzmann machines
RBMs are generative stochastic neural networks that can learn the distribution of their inputs.
html, pdf
3.8 - Generative Adversarial Networks
GANs are generative networks able to generate images from pure noise.
html, pdf
3.9 - Recurrent neural networks
RNNs, especially LSTMs, were long the weapon of choice to process temporal sequences (text, video, etc.).
html, pdf
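
For a flavor of the sentiment analysis exercise below, here is a hypothetical Keras sketch of a small LSTM classifier over integer-encoded text (vocabulary size, sequence length, and layer sizes are illustrative):

```python
import tensorflow as tf

# Small LSTM classifier over integer-encoded text, e.g. for sentiment analysis.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(100,)),                               # sequences padded to length 100
    tf.keras.layers.Embedding(input_dim=10000, output_dim=32),  # vocabulary of 10k tokens
    tf.keras.layers.LSTM(32),                                   # processes the whole sequence
    tf.keras.layers.Dense(1, activation='sigmoid'),             # binary sentiment output
])
model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])
model.summary()
```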

4 - Generative AI

Slides
4.1 - Transformers
The Transformer architecture (Vaswani et al., 2017) used self-attention to replace RNNs and started the second wave of AI hype.
html, pdf
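
The core operation, scaled dot-product attention, fits in a few lines of numpy; this sketch follows the formula from Vaswani et al. (2017), with made-up tensor sizes:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    # Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # row-wise softmax
    return weights @ V

# 5 tokens with 8-dimensional queries/keys/values (illustrative sizes).
rng = np.random.default_rng(0)
Q, K, V = (rng.normal(size=(5, 8)) for _ in range(3))
print(scaled_dot_product_attention(Q, K, V).shape)   # (5, 8)
```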
4.2 - Contrastive learning
Contrastive learning is a form of self-supervised learning that extracts context-relevant representations from raw data.
html, pdf
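
As an illustration, here is a numpy sketch of an InfoNCE-style contrastive loss (as used e.g. in SimCLR); the embeddings are random placeholders:

```python
import numpy as np

def info_nce(z1, z2, temperature=0.1):
    # z1[i] and z2[i] are embeddings of two augmented "views" of the same input.
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)   # L2-normalize
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    logits = z1 @ z2.T / temperature                      # pairwise similarities
    # Cross-entropy with the diagonal (the true pairs) as targets.
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))

rng = np.random.default_rng(0)
z1, z2 = rng.normal(size=(8, 16)), rng.normal(size=(8, 16))
print(info_nce(z1, z2))
```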
4.3 - Vision Transformer
Vision transformers use the Transformer architecture to be the new state of the art in computer vision.
html, pdf
4.4 - Diffusion models
Diffusion models are probabilistic generative models that learn to generate images (Midjourney, Dall-E, etc.) through incremental denoising.
html, pdf
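
The forward (noising) half of a diffusion model has a convenient closed form; a numpy sketch with a standard linear noise schedule (schedule values and data are illustrative):

```python
import numpy as np

# Forward process of a DDPM: thanks to the closed form
# q(x_t | x_0) = N(sqrt(alpha_bar_t) * x_0, (1 - alpha_bar_t) * I),
# a noisy sample at any step t can be drawn directly from the clean data x_0.
rng = np.random.default_rng(0)
T = 1000
betas = np.linspace(1e-4, 0.02, T)      # linear noise schedule (common choice)
alpha_bar = np.cumprod(1.0 - betas)     # cumulative product of (1 - beta_t)

x0 = rng.normal(size=(28 * 28,))        # a "clean" image, flattened (placeholder)
t = 500
eps = rng.normal(size=x0.shape)
x_t = np.sqrt(alpha_bar[t]) * x0 + np.sqrt(1.0 - alpha_bar[t]) * eps
# The reverse model is trained to predict eps from (x_t, t) and denoise step by step.
```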

5 - Neuro-AI

Slides
5.1 - Limits of deep learning
This lecture (provocatively) explains why deep learning-based approaches will never be able to achieve Artificial General Intelligence and why more brain-inspired approaches (neuro-AI) are the next step for AI.
html, pdf
5.2 - Hopfield networks
Hopfield networks implement associative memory, a fundamental aspect of cognition.
html, pdf
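
A complete Hopfield network fits in a dozen lines of numpy: store binary patterns with the Hebbian rule, then recall one from a corrupted cue (network size and number of patterns are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100
patterns = rng.choice([-1, 1], size=(3, N))      # 3 random patterns to store

W = sum(np.outer(p, p) for p in patterns) / N    # Hebbian weight matrix
np.fill_diagonal(W, 0)                           # no self-connections

x = patterns[0].copy()
flip = rng.choice(N, size=20, replace=False)
x[flip] *= -1                                    # corrupt 20% of the bits

for _ in range(10):                              # synchronous updates
    x = np.sign(W @ x)

print(np.mean(x == patterns[0]))                 # overlap with the stored pattern (close to 1)
```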
5.3 - Reservoir Computing
Reservoir Computing (RC) is a paradigm for training recurrent neural networks on time series with much less computation than deep learning approaches.
html, pdf
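
A sketch of an echo state network, one popular flavor of reservoir computing: the random recurrent reservoir stays fixed, and only a linear readout is trained, here by ridge regression (all sizes and constants are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
N, T = 200, 1000
u = np.sin(np.arange(T + 1) * 0.1)               # input signal; target = next value

W_in = rng.uniform(-0.5, 0.5, N)
W = rng.normal(0, 1, (N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # scale spectral radius below 1

states = np.zeros((T, N))
x = np.zeros(N)
for t in range(T):
    x = np.tanh(W @ x + W_in * u[t])             # fixed reservoir dynamics
    states[t] = x

# Ridge regression readout predicting u[t+1] from the reservoir state.
lam = 1e-6
W_out = np.linalg.solve(states.T @ states + lam * np.eye(N), states.T @ u[1:])
print(np.mean((states @ W_out - u[1:]) ** 2))    # small training error
```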
5.4 - Spiking networks
Spiking networks, in addition to being closer to how the brain works, can perform the same computations as deep networks with much less communication, enabling energy-efficient implementations on neuromorphic hardware.
html, pdf
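
The basic building block, a leaky integrate-and-fire neuron, can be simulated in a few lines (all parameter values are illustrative):

```python
# Leaky integrate-and-fire (LIF) neuron: the membrane potential v integrates
# its input with a leak, and a spike is emitted (then v is reset) when it
# crosses the threshold.
dt, tau = 0.1, 10.0           # time step and membrane time constant (ms)
v_thresh, v_reset = 1.0, 0.0
I = 1.2                       # constant input (steady state above threshold)

v = 0.0
spike_times = []
for step in range(2000):      # simulate 200 ms
    v += dt / tau * (I - v)   # Euler step of tau dv/dt = -v + I
    if v >= v_thresh:
        spike_times.append(step * dt)
        v = v_reset           # reset after the spike

print(len(spike_times), "spikes in 200 ms")
```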
5.5 - Beyond deep learning
To conclude, we will look at some of the requirements of general intelligence that still need to be added to our models.
html, pdf

Exercises

You will find below links to download the notebooks for the exercises (which you have to fill in) and their solutions (which you can look at after you have finished the exercise). It is recommended not to look at the solution while doing the exercise, unless you are lost. Alternatively, you can run the notebooks directly on Colab (https://colab.research.google.com/) if you have a Google account.

For instructions on how to install a Python distribution on your computer, check this page.

1 - Introduction to Python
Introduction to the Python programming language. Optional for students already knowing Python.
Notebook: ipynb, colab. Solution: ipynb, colab.
2 - Numpy and Matplotlib
Presentation of the numpy library for numerical computations and matplotlib for visualization. Also optional for students already familiar with these libraries.
Notebook: ipynb, colab. Solution: ipynb, colab.
3 - Linear regression
Implementation of the basic linear regression algorithm in Python and scikit-learn.
Notebook: ipynb, colab. Solution: ipynb, colab.
4 - Multiple Linear regression
MLR on the California Housing dataset using scikit-learn.
Notebook: ipynb, colab. Solution: ipynb, colab.
5 - Cross-validation
Different approaches to cross-validation using scikit-learn.
Notebook: ipynb, colab. Solution: ipynb, colab.
6 - Linear classification
Hard and soft linear classification.
Notebook: ipynb, colab. Solution: ipynb, colab.
7 - Softmax classifier
Softmax classifier for multi-class classification.
Notebook: ipynb, colab. Solution: ipynb, colab.
8 - Multi-layer perceptron
Basic implementation in Python+Numpy of the multi-layer perceptron and the backpropagation algorithm.
Notebook: ipynb, colab. Solution: ipynb, colab.
9 - MNIST classification using keras
Keras tutorial applied to classifying the MNIST dataset with an MLP.
Notebook: ipynb, colab. Solution: ipynb, colab.
10 - Convolutional neural networks
Implementation of a CNN in keras for MNIST.
Notebook: ipynb, colab. Solution: ipynb, colab.
11 - Transfer learning
Leveraging data augmentation and/or pre-trained CNNs (Xception) for learning a small cats vs. dogs dataset.
Notebook: ipynb, colab. Solution: ipynb, colab.
12 - Variational autoencoders
Implementing a VAE in keras.
Notebook: ipynb, colab. Solution: ipynb, colab.
13 - Recurrent neural networks
Sentiment analysis and time series prediction using LSTM layers.
Notebook: ipynb, colab. Solution: ipynb, colab.

Recommended readings

  • Kevin Murphy. Probabilistic Machine Learning: An introduction. MIT Press, 2022. https://probml.github.io/pml-book/book1.html

  • Ian Goodfellow, Yoshua Bengio, and Aaron Courville. Deep Learning. MIT Press, 2016. http://www.deeplearningbook.org.

  • François Chollet. Deep Learning with Python. Manning Publications, 2017. https://www.manning.com/books/deep-learning-with-python.

  • Simon S. Haykin. Neural Networks and Learning Machines, 3rd Edition. Pearson, 2009. http://dai.fmph.uniba.sk/courses/NN/haykin.neural-networks.3ed.2009.pdf.

 

Copyright Julien Vitay - julien.vitay@informatik.tu-chemnitz.de