DigitalTwin: AI and Machine Learning

This is the course page for the Digital Twin course on Artificial Intelligence and Machine Learning (Second Edition).


Lecture 1: 15/10/2020 @ 14:00 (Introduction)

In this first lecture I will discuss some of the early history of Artificial Intelligence, its trials and tribulations through the years, and the recent renaissance it is experiencing in the form of Deep Learning. I will also give a broad overview of the topics that will be covered over the ten lectures.

I will finish up the first lesson with a high-entropy crash course in Python programming. Since all examples, exercises and labs will use Python, attendance at this first laboratory session is highly recommended.

Resources


Lecture 2: 22/10/2020 @ 14:00 (Mathematical foundations)

In this lecture we will have a high-entropy introduction to the mathematical building blocks that will help us understand contemporary Machine Learning. We will see how Linear Algebra is fundamental as a modeling and computational tool, how statistics and probability give us tools to analyze results and qualify outputs, and how multivariate calculus provides the machinery to estimate parameters of our models from data.
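As a small preview of how these pieces fit together, here is a hedged NumPy sketch (not taken from the course materials) that fits a line to noisy synthetic data: linear algebra provides the model and the solver, probability provides the noise model, and minimizing the squared error is where multivariate calculus enters (the least-squares solution sets the gradient of the error to zero):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: y = 2x + 1 plus Gaussian noise (the probabilistic part).
x = np.linspace(0, 1, 50)
y = 2 * x + 1 + rng.normal(scale=0.05, size=x.shape)

# Design matrix with a column of ones for the intercept (the modeling part).
A = np.column_stack([x, np.ones_like(x)])

# Least-squares solution: minimizing squared error is a calculus problem
# whose solution NumPy computes for us.
slope, intercept = np.linalg.lstsq(A, y, rcond=None)[0]
print(slope, intercept)  # close to 2 and 1
```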

Resources


Lecture 3: 29/10/2020 @ 14:00 (Numerical programming)

In this lecture we will continue our study of numerical programming using NumPy, Matplotlib, Scikit-learn, and Pandas. I will start with an overview of the tools we will use for exploratory data analysis and machine learning. Then I will finish with a general, step-by-step recipe we will follow for most of our applied exercises.

Finally, we will have a laboratory session (which will likely bleed over into tomorrow) where we will experiment and gain some experience with a simple linear regression problem.
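The step-by-step recipe applied to a linear regression problem might look roughly like the following Scikit-learn sketch (the data here is synthetic; the lab will use its own dataset):

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

# Step 1: obtain data (synthetic here: y = 3x + 2 plus noise).
rng = np.random.default_rng(42)
X = rng.uniform(0, 10, size=(200, 1))
y = 3.0 * X.ravel() + 2.0 + rng.normal(scale=0.5, size=200)

# Step 2: split into training and held-out test sets.
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Step 3: fit a model on the training data.
model = LinearRegression().fit(X_train, y_train)

# Step 4: evaluate on the test set.
mse = mean_squared_error(y_test, model.predict(X_test))
print(model.coef_[0], model.intercept_, mse)
```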

Resources


Lecture 4: 05/11/2020 @ 14:00 (Supervised learning)

In the first part of today's class we will continue the laboratory we began yesterday. In the second half we will look at a variety of models for supervised regression and classification problems.

Resources


Lecture 5/6: 12-13/11/2020 @ 09:00 (Unsupervised learning)

In the first part of today’s lecture we will finish looking at some techniques for supervised learning, concentrating in particular on techniques that yield probability estimates rather than just “scores” for class membership. Then, we will turn our attention to techniques for unsupervised learning that will help us understand the structure of high-dimensional data and reduce dimensionality in ways that preserve certain characteristics of the original feature distributions.
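Both ideas can be previewed in a few lines of Scikit-learn (a sketch on the iris dataset, chosen only for illustration): `predict_proba` returns a full distribution over classes rather than a single score, and PCA is one example of a dimensionality-reduction technique that preserves directions of maximal variance:

```python
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression

X, y = load_iris(return_X_y=True)

# Probability estimates, not just scores: each row of predict_proba
# is a distribution over the classes and sums to one.
clf = LogisticRegression(max_iter=1000).fit(X, y)
probs = clf.predict_proba(X[:1])
print(probs, probs.sum())

# Unsupervised structure: project the 4-d features onto the
# 2 principal axes of maximal variance.
X_2d = PCA(n_components=2).fit_transform(X)
print(X_2d.shape)  # (150, 2)
```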

Resources


Lecture 7: 19/11/2020 @ 14:00 (The bias/variance tradeoff)

Today’s lecture will bring to a close our treatment of basic data manipulation, visualization, and classical machine learning techniques. We will talk about the bias-variance decomposition and the important topic of model selection. We will see how cross-validation gives us a basic tool for robust model evaluation (often called model scoring) and how to embed cross-validation into a hyperparameter selection procedure.
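The two ideas, cross-validation for robust scoring and its embedding inside hyperparameter selection, can be sketched as follows (models and parameter grid are illustrative assumptions, not the course's own examples):

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import GridSearchCV, cross_val_score
from sklearn.neighbors import KNeighborsClassifier

X, y = load_iris(return_X_y=True)

# Cross-validation: k held-out folds yield a more robust score
# estimate than a single train/test split.
scores = cross_val_score(KNeighborsClassifier(), X, y, cv=5)
print(scores.mean())

# Embedding cross-validation in hyperparameter selection: grid search
# re-runs the full CV loop for every candidate setting.
search = GridSearchCV(KNeighborsClassifier(),
                      {"n_neighbors": [1, 3, 5, 7, 9]}, cv=5)
search.fit(X, y)
print(search.best_params_)
```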

Resources


Lecture 8: 20/11/2020 @ 14:00 (DL I: Keras and MLPs)

In this lecture we will finally see how to build deep and composable models. We will use the Keras/Tensorflow framework to construct Deep Multilayer Perceptron (MLP) networks for regression and classification. Following the lecture we will have a laboratory session where we will build our first deep models.
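Keras hides the details behind layer objects, but the forward pass it implements for an MLP is just alternating affine maps and nonlinearities. A minimal NumPy sketch of that idea, with arbitrarily chosen shapes and random (untrained) weights:

```python
import numpy as np

def relu(z):
    """Elementwise rectified linear unit."""
    return np.maximum(0.0, z)

def softmax(z):
    """Row-wise softmax, shifted for numerical stability."""
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

rng = np.random.default_rng(0)

# One hidden layer: 4 inputs -> 8 hidden units -> 3 output classes.
W1, b1 = rng.normal(size=(4, 8)), np.zeros(8)
W2, b2 = rng.normal(size=(8, 3)), np.zeros(3)

def mlp_forward(X):
    hidden = relu(X @ W1 + b1)        # affine map + nonlinearity
    return softmax(hidden @ W2 + b2)  # class probabilities

X = rng.normal(size=(5, 4))           # a batch of 5 samples
probs = mlp_forward(X)
print(probs.shape)  # (5, 3); each row sums to 1
```

In Keras the same architecture is a `Sequential` stack of two `Dense` layers; training then fits the weights by gradient descent rather than leaving them random.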

Resources


Lecture 9: 26/11/2020 @ 14:00 (DL II: Recurrent Networks)

In this lecture we will see how to build deep models capable of processing variable-length inputs. Recurrent Neural Networks are able to capture long-range dependencies between observations. I will give a very brief overview of recurrent models, especially the Long Short-Term Memory (LSTM) model which is used in a broad variety of applications today. In the laboratory session following the lecture we will experiment with sequential character generation models to demonstrate the power of deep recurrent models.
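Before any recurrent model can be trained for character generation, text must be turned into fixed-width numeric sequences. A hedged sketch of that preprocessing step (the toy string and window length are invented for illustration):

```python
import numpy as np

text = "hello world"
chars = sorted(set(text))
char_to_idx = {c: i for i, c in enumerate(chars)}

# Training pairs for next-character prediction: each window of
# seq_len characters is an input, the following character the target.
seq_len = 3
inputs, targets = [], []
for i in range(len(text) - seq_len):
    inputs.append([char_to_idx[c] for c in text[i:i + seq_len]])
    targets.append(char_to_idx[text[i + seq_len]])

# One-hot encode so each timestep is a vector an RNN/LSTM can consume.
X = np.eye(len(chars))[np.array(inputs)]  # (samples, seq_len, vocab)
y = np.array(targets)
print(X.shape, y.shape)  # (8, 3, 8) (8,)
```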

Resources


Lecture 10: 26/11/2020 @ 14:00 (CNNs)

In this lecture we will see the model everyone has been talking about for the last five years: the Convolutional Neural Network (CNN). We will see how CNNs are different from MLPs, yet also (if you think about them in the right way) kind of the same. We will see how to build CNNs from the basic building blocks of Convolutions, Pooling, and Dense layers. In the laboratory session following the lecture we will experiment with building CNNs (and comparing their performance with MLPs) to classify images.
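The two building blocks that distinguish CNNs from MLPs can be sketched in a few lines of NumPy (a deliberately naive, loop-based illustration of the operations, not an efficient or framework implementation; the example image and kernel are invented):

```python
import numpy as np

def conv2d(image, kernel):
    """Valid 2-d cross-correlation -- the core of a convolutional layer."""
    kh, kw = kernel.shape
    oh, ow = image.shape[0] - kh + 1, image.shape[1] - kw + 1
    out = np.empty((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = (image[i:i + kh, j:j + kw] * kernel).sum()
    return out

def max_pool2d(x, size=2):
    """Non-overlapping max pooling: downsample, keep the strongest response."""
    h, w = x.shape[0] // size, x.shape[1] // size
    return x[:h * size, :w * size].reshape(h, size, w, size).max(axis=(1, 3))

image = np.arange(36, dtype=float).reshape(6, 6)
edge_kernel = np.array([[1.0, -1.0]])  # responds to horizontal changes

feat = conv2d(image, edge_kernel)  # (6, 5) feature map
pooled = max_pool2d(feat)          # (3, 2) after 2x2 pooling
print(feat.shape, pooled.shape)
```

Stacking such convolution and pooling stages, then flattening into dense layers, gives the standard CNN architecture; the "kind of the same" intuition is that a convolution is just an affine layer with shared, spatially local weights.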

Resources


Lecture 10 (supplementary): 27/11/2020 @ 14:00 (Advanced CNNs)

Some extra material on advanced uses of CNNs, including fine-tuning, self-supervision, and few- and zero-shot learning.

Resources