Machine learning with neural networks

This textbook is based on lecture notes for a course on machine learning with neural networks for scientists and engineers that I have given at the University of Gothenburg and at Chalmers University of Technology in Gothenburg, Sweden.

The material is organised into three parts: Hopfield networks, supervised learning of labeled data, and learning algorithms for unlabeled data sets. Part I introduces stochastic recurrent networks: Hopfield networks and Boltzmann machines. The analysis of their learning rules sets the scene for the later parts. Part II describes supervised learning with multilayer perceptrons and convolutional neural networks. This part starts with a simple geometrical interpretation of the learning rule and leads to the recent successes of convolutional networks in object recognition, recurrent networks in language processing, and reservoir computers in time-series analysis. Part III explains what neural networks can learn about data that is not labeled. This part begins with a description of unsupervised learning techniques for clustering of data, non-linear projections, and embeddings. A section on autoencoders explains how to learn without labels using convolutional networks, and the last chapter is dedicated to reinforcement learning. The overall goal of the course is to explain the fundamental principles that allow neural networks to learn, emphasising ideas and concepts that are common to all three parts.

Errata (September 22, 2023).

Through the e-learning system OpenTA you can access, solve, and get immediate feedback on exercises from the book.

Solutions for all exercises in Machine learning with neural networks: An introduction for scientists and engineers (Cambridge University Press, 2021). The cover image shows an input pattern designed to maximise the output of the neurons corresponding to one feature map in a given convolution layer of a deep convolutional neural network.
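
The cover pattern is an example of activation maximisation: starting from random noise, one adjusts the input by gradient ascent until the chosen feature map responds as strongly as possible. Below is a minimal sketch of this idea, not code from the book; the small two-layer network, the feature-map index, and the optimisation settings are illustrative assumptions, and PyTorch is used only as one possible framework.

    import torch
    import torch.nn as nn

    torch.manual_seed(0)

    # Hypothetical stand-in for a deep convolutional network; in practice one
    # would load a pretrained model and pick a convolution layer inside it.
    net = nn.Sequential(
        nn.Conv2d(3, 8, kernel_size=3, padding=1),
        nn.ReLU(),
        nn.Conv2d(8, 16, kernel_size=3, padding=1),  # target convolution layer
        nn.ReLU(),
    )
    net.eval()

    feature_map = 5  # index of the feature map to maximise (an assumption)
    activations = {}

    # Record the output of the target layer during the forward pass.
    net[2].register_forward_hook(lambda module, inp, out: activations.update(out=out))

    # Start from random noise and perform gradient ascent on the input pattern.
    x = torch.randn(1, 3, 64, 64, requires_grad=True)
    optimizer = torch.optim.Adam([x], lr=0.1)

    for step in range(200):
        optimizer.zero_grad()
        net(x)
        # Maximising the mean activation of the chosen feature map is the same
        # as minimising its negative.
        loss = -activations["out"][0, feature_map].mean()
        loss.backward()
        optimizer.step()

    # x.detach() now holds an input pattern tuned to excite that feature map.

In practice, such visualisations usually add regularisation (for example, smoothing or jitter on the input) so that the optimised pattern remains interpretable.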
