These are the lecture notes for the module Neurocomputing taught by Dr. Julien Vitay at the Technische Universität Chemnitz, Faculty of Computer Science, Professorship for Artificial Intelligence.
Each section/lecture is accompanied by a set of videos, the slides and lecture notes summarizing the most important points. Some sections are optional: no questions will be asked about them in the exam, but those interested in becoming neural network experts should feel free to study them. The videos are embedded in the lecture notes, but the complete playlist is also available on YouTube.
Exercises are provided as Jupyter notebooks, allowing you to implement the algorithms seen in the lectures in Python at your own pace and to learn machine learning libraries such as tensorflow. For each exercise, a notebook to work on (locally or on Colab) and its solution can be downloaded at the top-right of the page. A video explaining the exercise and another commenting on the solution are also available, with the complete playlist on YouTube.
[Haykin, 2009] Simon S. Haykin. Neural Networks and Learning Machines, 3rd edition. Pearson, 2009. http://dai.fmph.uniba.sk/courses/NN/haykin.neural-networks.3ed.2009.pdf.
[Chollet, 2017a] François Chollet. Deep Learning with Python. Manning Publications, 2017. https://www.manning.com/books/deep-learning-with-python.
[Gerstner et al., 2014] Wulfram Gerstner, Werner Kistler, Richard Naud, and Liam Paninski. Neuronal Dynamics - a Neuroscience Textbook. Cambridge University Press, 2014. https://neuronaldynamics.epfl.ch/index.html.
- 1. Introduction to Python
- 2. Numpy and Matplotlib
- 3. Linear regression
- 4. Multiple linear regression
- 5. Cross-validation
- 6. Linear classification
- 7. Softmax classifier
- 8. Multi-layer perceptron
- 9. MNIST classification using keras
- 10. Convolutional neural networks
- 11. Transfer learning
- 12. Variational autoencoder
- 13. Recurrent neural networks
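As a taste of the early notebooks, a simple linear regression (the topic of exercise 3) can be written in a few lines of NumPy. This is only an illustrative sketch on made-up toy data, not the actual course notebook:

```python
import numpy as np

# Toy data: y = 2*x + 1 plus Gaussian noise (hypothetical example,
# not the dataset used in the course exercises)
rng = np.random.default_rng(42)
X = rng.uniform(0.0, 1.0, 50)
y = 2.0 * X + 1.0 + rng.normal(0.0, 0.1, 50)

# Least-squares fit of y = w*x + b via the normal equations:
# stack x with a column of ones so the intercept b is learned too
A = np.vstack([X, np.ones_like(X)]).T
w, b = np.linalg.lstsq(A, y, rcond=None)[0]

print(f"w = {w:.2f}, b = {b:.2f}")  # should be close to 2 and 1
```

The notebooks walk through such models step by step before moving on to the tensorflow/keras versions used for the neural network exercises.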