Neural networks
This is an old page. Please see the Canvas page for the course starting in the fall of 2023.
New: The lectures on August 31 and September 2 & 3 are via Zoom. You will find the Zoom link on the Canvas page. Please be in Zoom five minutes before the start of each lecture.
This course introduces the use of neural networks in machine learning: deep learning, recurrent networks, reinforcement learning, and other supervised and unsupervised machine-learning algorithms. I introduce these techniques, explain how they rest on the same fundamental principles, and analyse their performance. Guest lectures highlight applications in industry (7.5 credit units).
This course is taught partly online, with lectures, an online forum, and the OpenTA system for graded homework.
Forum. Use the online forum to ask and discuss any questions you may have about the course this fall. Please register for the forum here.
OpenTA. We use the online system OpenTA for exercises, homework, and exam preparation. Go to Canvas to register. Note: GU Canvas for GU students, Chalmers Canvas for Chalmers students. Problems registering or logging in? Please contact Bernhard. Direct link to OpenTA.
Summary. Deep learning and neural networks.
Teachers
Bernhard Mehlig (lectures, examiner)
Aykut Argun (TA)
Anshuman Dubey (TA)
Johan Fries (TA)
Martin Selin (TA)
Navid Mousavi (TA)
Quiz
Test your linear algebra and analysis with the quiz on the OpenTA site.
Course representatives
N.N.
Schedule
See TimeEdit for current schedule.
Contents
1. Introduction.
Part I Hopfield models.
2. Deterministic Hopfield networks.
3. Stochastic Hopfield networks.
4. Stochastic optimisation.
Part II Supervised learning.
5. Perceptrons.
6. Backpropagation.
7. Deep learning.
8. Recurrent networks.
Part III Unsupervised learning.
9. Unsupervised Hebbian learning.
10. Radial basis functions.
11. Reinforcement learning.
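As a taster for Part I, here is a minimal sketch of a deterministic Hopfield network: one pattern is stored with Hebb's rule and recovered from a corrupted copy. This is an illustration only, not course material; all function names are ours.

```python
import numpy as np

def hebbian_weights(patterns):
    """Hebb's rule: w_ij = (1/N) sum_mu x_i^mu x_j^mu, with zero diagonal."""
    N = patterns.shape[1]
    W = patterns.T @ patterns / N
    np.fill_diagonal(W, 0.0)
    return W

def recall(W, state, steps=10):
    """Deterministic synchronous updates s_i <- sgn(sum_j w_ij s_j) until a fixed point."""
    for _ in range(steps):
        new = np.where(W @ state >= 0, 1, -1)
        if np.array_equal(new, state):
            break
        state = new
    return state

# Store one pattern and recover it after flipping one bit.
pattern = np.array([1, -1, 1, 1, -1, -1, 1, -1])
W = hebbian_weights(pattern[None, :])
noisy = pattern.copy()
noisy[0] = -noisy[0]          # corrupt one bit
print(recall(W, noisy))       # converges back to the stored pattern
```

With a single stored pattern the corrupted state is repaired in one synchronous update; with many stored patterns the network's capacity is limited, one of the topics analysed in Part I.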
Literature
B. Mehlig, Machine learning with neural networks, Cambridge University Press (2021)
Further literature
J. Hertz, A. Krogh & J. Palmer, Introduction to the theory of neural computation, Addison-Wesley.
S. Haykin, Neural Networks: a comprehensive foundation, Prentice Hall, New Jersey.
I. Goodfellow, Y. Bengio & A. Courville, Deep Learning, MIT Press.
Examination
The re-exam in January 2021 will be a supervised distance exam via Zoom.
Credits for this course are obtained by solving the homework problems (solutions of examples and programming projects) and by a written examination. There are three sets of homework problems. Each of the three gives at most 4 points. The exam gives at most 12 points, resulting in a maximum of 24 points.
To pass it is necessary to obtain at least 5 points in the written exam, and to have at least 14 points in total.
Passing grades:
Chalmers: 3: >13.5p; 4: >17p; 5: >21.5p
GU: G: >13.5p; VG: >19.5p
ECTS: C: >13.5p; B: >17p; A: >21.5p
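The point arithmetic above can be summarised in a small sketch (the function name is ours and purely illustrative): three homework sets of at most 4 points each plus an exam of at most 12 points, with a pass requiring at least 5 exam points and at least 14 points in total.

```python
def passes(homework_points, exam_points):
    """Pass rule: at least 5 points in the exam and at least 14 points in total."""
    assert len(homework_points) == 3 and all(0 <= h <= 4 for h in homework_points)
    assert 0 <= exam_points <= 12
    total = sum(homework_points) + exam_points
    return exam_points >= 5 and total >= 14

print(passes([4, 4, 4], 5))   # True: 17 points in total, exam requirement met
print(passes([4, 4, 4], 4))   # False: 16 points, but fewer than 5 exam points
```

Note that full homework marks alone are not enough: the exam threshold applies independently of the total.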
OpenTA
This course uses the OpenTA online system developed by Stellan Östlund and Hampus Linander for exercises, homework, and exam preparation.
Rules for homework problems
Same rules as for written exams apply: it is not allowed to copy any material from anywhere unless reference is given. All students must write their own computer programs and submit their own solutions. All submissions are handled via OpenTA. For all homework sets, your program code must be uploaded to OpenTA.
Keep a backup of your solutions to the OpenTA questions: your submitted PDF files as well as the answers you typed in. The system does not store your answers after December 2020. If you take a re-exam in January or August 2021 you will be asked to re-submit all answers and OpenTA scores.
Your OpenTA points are valid for the two re-exams in January and August 2021. Please contact any of the teachers if you need guidance for your exam preparation, or if you have questions about the coming re-exams. To pass the course in future academic years (for instance 2021/2022) you need to redo the OpenTA problems for that academic year.
URKUND. For each homework, concatenate the PDF files you uploaded to OpenTA, and send the PDF file electronically to
For Homework 3 you will submit your solutions as PDF files to OpenTA. The format of the solutions must be as follows. There are four questions worth 1p each. For each 1p-question you must submit at most one A4 page with 12pt single-spaced text and 2cm margins (LaTeX template). Each page may contain at most one Figure and/or one Table, with the corresponding caption, in addition to the text discussing the results shown in the Figure/Table. It is not necessary to write a full page for each problem, but you must explain what you have done and clearly state your answers, results, and conclusions. Where necessary, discuss possible errors and inaccuracies in your results. If you are asked to plot results or make graphs, do this in a Figure with legible axis labels and tick labels. All symbols and lines must be explained in the Figure or its caption. The Figure may consist of separate panels; refer to them as 'left panel', 'right panel', 'bottom panel', etc. (or label them 'a', 'b', ...).
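A document preamble matching these requirements might look as follows. This is a sketch only: the filename results.pdf and the section title are placeholders, and the course's own LaTeX template takes precedence.

```latex
\documentclass[12pt,a4paper]{article}
\usepackage[margin=2cm]{geometry} % 2cm margins on A4 paper
\usepackage{graphicx}
\begin{document}

\section*{Homework 3, Question 1}
% At most one page per 1p-question, with at most one Figure and/or Table.
Text discussing the results shown in the Figure.

\begin{figure}[h]
  \centering
  \includegraphics[width=0.6\textwidth]{results.pdf} % placeholder filename
  \caption{Left panel: \ldots\ Right panel: \ldots}
\end{figure}

\end{document}
```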
Deadlines are sharp. Late submissions are not accepted.
Homework 1. Hopfield model. Deadline TBA
Homework 2. Backpropagation. Deadline TBA
Homework 3. Deep-learning/reinforcement-learning project. Deadline TBA
Written exam
The exam covers the material in the most recent version of the lecture notes as well as in the homework problems. Old exam questions are given in the lecture notes (link on CANVAS).
Date for written exam, deadline for registration for exam. Please see this link. Course code FFR135.
If date & time of the exam collide with another exam you must take, then you must follow the steps outlined here.