Intensive course on neural networks (Göttingen)

This short course introduces the use of artificial neural networks in machine learning. It is aimed at engineers and natural scientists. The focus is on supervised learning with multi-layer perceptron networks, because this method has recently become very popular in science and technology. I describe the network layout, explain how to train such networks with stochastic gradient descent, and summarise recent developments in the field (deep learning). I conclude with a discussion of current questions and applications. The course is based on Chapters 5 to 10 of Machine learning with neural networks. I also offer homework problems that illustrate the learning goals. These will be made available in the online system OpenTA.
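For orientation, the sketch below illustrates the kind of training procedure covered in the course: stochastic gradient descent for a small two-layer perceptron, written in plain NumPy. It is not part of the course material; the toy data, network size, learning rate, and variable names are illustrative assumptions only.

```python
# Minimal sketch (not from the course materials): stochastic gradient
# descent for a two-layer perceptron on a toy binary-classification task.
# All sizes and parameters are illustrative choices.
import numpy as np

rng = np.random.default_rng(0)

# Toy data: two Gaussian clusters with labels 0 and 1.
X = np.vstack([rng.normal(-1, 1, (100, 2)), rng.normal(1, 1, (100, 2))])
y = np.concatenate([np.zeros(100), np.ones(100)])

# One hidden layer with tanh units and a sigmoid output unit.
W1 = rng.normal(0, 0.5, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, (8, 1)); b2 = np.zeros(1)
eta = 0.1  # learning rate

for epoch in range(200):
    for i in rng.permutation(len(X)):          # one pattern at a time
        x, t = X[i], y[i]
        h = np.tanh(x @ W1 + b1)               # hidden activations
        o = 1 / (1 + np.exp(-(h @ W2 + b2)))   # network output
        # Backpropagate the quadratic error 0.5*(o - t)^2.
        delta_o = (o - t) * o * (1 - o)
        delta_h = (W2.flatten() * delta_o) * (1 - h**2)
        W2 -= eta * np.outer(h, delta_o); b2 -= eta * delta_o
        W1 -= eta * np.outer(x, delta_h); b1 -= eta * delta_h

# Fraction of correctly classified training patterns.
out = 1 / (1 + np.exp(-(np.tanh(X @ W1 + b1) @ W2 + b2)))
print("training accuracy:", np.mean((out.flatten() > 0.5) == (y == 1)))
```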

Registration

Note: Please register here before the first lecture. Email Bernhard Mehlig if you encounter problems.

Prerequisites

Basic linear algebra, analysis, and programming. Test your skills with a quiz at the OpenTA site.

Contents and schedule

Thu Jan 11  14:00 - 15:45  Introduction and overview           PDF
            16:15 - 18:00  Perceptrons                         Chapter 5
Fri Jan 12  14:00 - 15:45  Training deep networks              Chapter 6
            16:15 - 18:00  Introduction to exercises           OpenTA
Thu Jan 18  14:00 - 15:45  Training deep networks              Chapter 7
                           Convolutional networks              Chapter 8
            16:15 - 18:00  Unsupervised learning               Chapter 10
Fri Jan 19  14:00 - 15:45  Recurrent networks                  Chapter 9
            16:15 - 18:00  Machine translation & transformers

Literature

B. Mehlig, Machine learning with neural networks, Cambridge University Press (2021)

B. Mehlig, Exercise solutions for Machine learning with neural networks (2023)