Fri: 11:00 a.m. - 1:00 p.m., Aula E2
Fri: 2:00 p.m. - 4:00 p.m., Aula B2
AI spring? Artificial Intelligence, Machine Learning, Deep Learning: facts, myths and a few reflections.
Fundamentals: Artificial Neural Networks [pdf]
Foundations of machine learning: dataset, representation, evaluation, optimization. Feed-forward neural networks as universal approximators.
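As an illustrative sketch of the universal-approximation idea (not part of the lecture material): a single hidden layer of tanh units, here with fixed random weights and a least-squares linear readout, is already enough to fit a smooth target on an interval.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hidden layer: 50 tanh units with fixed random weights and biases.
x = np.linspace(-np.pi, np.pi, 200)[:, None]          # inputs, shape (200, 1)
hidden = np.tanh(x @ rng.normal(size=(1, 50)) + rng.normal(size=50))

# Only the linear readout is fitted, by least squares.
w, *_ = np.linalg.lstsq(hidden, np.sin(x), rcond=None)
err = np.abs(hidden @ w - np.sin(x)).max()
print(err)   # typically well below 0.1 on this interval
```

Training the hidden weights too (by gradient descent) would need far fewer units; fixing them keeps the sketch convex and short.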
Flow Graphs and Automatic Differentiation [pdf]
Tensorial representation, flow graphs. Automatic differentiation: graphs from graphs.
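A minimal sketch of reverse-mode automatic differentiation on a flow graph (class and method names are invented for the example): each node stores its value, its parents, and the local derivatives, and `backward` pushes gradients back through the graph.

```python
class Var:
    """A node in a flow graph: value, parent edges, local gradients."""
    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents   # pairs (parent, d_output / d_parent)
        self.grad = 0.0

    def __add__(self, other):
        return Var(self.value + other.value, [(self, 1.0), (other, 1.0)])

    def __mul__(self, other):
        return Var(self.value * other.value,
                   [(self, other.value), (other, self.value)])

    def backward(self, seed=1.0):
        # Accumulate d(output)/d(self), then recurse along local gradients.
        # (Naive recursion; real systems traverse a topological order once.)
        self.grad += seed
        for parent, local in self.parents:
            parent.backward(seed * local)

x = Var(3.0)
y = Var(4.0)
z = x * y + x          # dz/dx = y + 1 = 5, dz/dy = x = 3
z.backward()
print(x.grad, y.grad)  # 5.0 3.0
```

Note how the gradient graph is itself built from the forward graph: "graphs from graphs".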
Aside 1: Tensor broadcasting [pdf]
From theoretical tensor algebra to actual computation: operators and automatic broadcasting.
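For instance, NumPy's broadcasting rules align trailing dimensions and stretch size-1 axes, so a column and a row combine into a full grid without an explicit loop (a small illustrative example):

```python
import numpy as np

col = np.arange(3).reshape(3, 1)   # shape (3, 1)
row = np.arange(4) * 10            # shape (4,), treated as (1, 4)
grid = col + row                   # broadcast to shape (3, 4)
print(grid.shape)                  # (3, 4)
print(grid[2, 3])                  # 2 + 30 = 32
```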
Deep Networks [pdf]
Deeper networks: potential advantages and new challenges. Tensorial layerwise representation. Softmax and cross-entropy.
Shannon Entropy (Wikipedia)
Cross Entropy (Wikipedia)
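A sketch of the softmax and cross-entropy computations above, including the usual max-subtraction trick for numerical stability (function names are mine, not the slides'):

```python
import numpy as np

def softmax(z):
    # Subtracting the max leaves the result unchanged but avoids overflow.
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def cross_entropy(p, y):
    # y is the integer index of the true class; clip to avoid log(0).
    return -np.log(np.clip(p[y], 1e-12, None))

logits = np.array([2.0, 1.0, 0.1])
p = softmax(logits)
print(p.sum())              # 1.0: softmax outputs a probability vector
print(cross_entropy(p, 0))  # loss when the true class is 0
```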
Learning as Optimization [pdf]
Vanishing and exploding gradients. First and second order optimization, approximations, optimizers. Further tricks.
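As a first-order example, plain SGD with momentum can be sketched in a few lines (the learning rate, momentum coefficient, and toy quadratic objective are illustrative choices, not values from the lecture):

```python
import numpy as np

def sgd_momentum(grad_fn, w, lr=0.1, beta=0.9, steps=200):
    """Gradient descent with a momentum (velocity) accumulator."""
    v = np.zeros_like(w)
    for _ in range(steps):
        v = beta * v + grad_fn(w)   # accumulate gradient history
        w = w - lr * v              # step along the smoothed direction
    return w

# Minimize f(w) = ||w||^2, whose gradient is 2w; the minimum is the origin.
w = sgd_momentum(lambda w: 2 * w, np.array([5.0, -3.0]))
print(w)   # close to [0, 0]
```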
Aside 2: Exponential Moving Average [pdf]
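The exponential moving average of the aside, with the bias correction used e.g. in Adam, can be sketched as follows (a minimal illustration, not the slides' code):

```python
def ema(values, beta=0.9):
    """Exponential moving average with bias correction."""
    m, out = 0.0, []
    for t, v in enumerate(values, start=1):
        m = beta * m + (1 - beta) * v     # raw EMA starts biased toward 0
        out.append(m / (1 - beta ** t))   # bias-corrected estimate
    return out

# The corrected EMA of a constant sequence is that constant from step 1.
print(ema([1.0, 1.0, 1.0]))   # [1.0, 1.0, 1.0]
```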
Aside 3: Predictors [pdf]
From in-sample optimization to out-of-sample generalization.
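The in-sample vs out-of-sample gap shows up even in a toy experiment: polynomial fits of growing degree always reduce training error, but not necessarily held-out error (the data, degrees, and seed below are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)

# Noisy samples of a quadratic; half used for fitting, half held out.
x = rng.uniform(-1, 1, 40)
y = x ** 2 + rng.normal(scale=0.1, size=40)
x_tr, y_tr, x_te, y_te = x[:20], y[:20], x[20:], y[20:]

def mse(coef, a, b):
    return float(np.mean((np.polyval(coef, a) - b) ** 2))

for deg in (1, 2, 9):
    coef = np.polyfit(x_tr, y_tr, deg)
    # Training error can only go down with degree; test error need not.
    print(deg, mse(coef, x_tr, y_tr), mse(coef, x_te, y_te))
```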
Convolutional Networks [pdf]
Convolutional filter, filter banks, feature maps, pooling, layerwise gradients.
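A direct, unoptimized sketch of "valid" convolution and non-overlapping max pooling (loops kept explicit for clarity; like most DL frameworks, this actually computes cross-correlation):

```python
import numpy as np

def conv2d(image, kernel):
    """'Valid' 2-D cross-correlation of a single-channel image."""
    H, W = image.shape
    kH, kW = kernel.shape
    out = np.zeros((H - kH + 1, W - kW + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kH, j:j + kW] * kernel)
    return out

def max_pool(x, size=2):
    """Non-overlapping max pooling (edges trimmed to a multiple of size)."""
    H, W = x.shape
    return x[:H - H % size, :W - W % size] \
        .reshape(H // size, size, W // size, size).max(axis=(1, 3))

image = np.arange(16.0).reshape(4, 4)
edge = np.array([[1.0, -1.0]])    # horizontal difference filter
fmap = conv2d(image, edge)        # shape (4, 3); constant -1 on this ramp
print(max_pool(fmap))
```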
Aside 4: Hardware for Deep Learning [pdf]
Main differences between CPUs and GPUs, SIMT parallelism, bus-oriented communication, a few caveats.
Deep Convolutional Neural Networks and Beyond [pdf]
Some insight into what happens in convolution layers. DCNN architectures. Transfer learning. Working in reverse: image generation. Generative adversarial networks. Autoencoders and segmentation. Object detection.
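Of the topics above, the autoencoder has a particularly compact linear special case: with a k-dimensional bottleneck and squared error, the optimal linear encoder/decoder span the top-k principal subspace, which can be computed directly by SVD (an illustrative sketch, not the lecture's code):

```python
import numpy as np

rng = np.random.default_rng(2)

# 100 points in R^5 that actually lie on a 2-D subspace.
X = rng.normal(size=(100, 2)) @ rng.normal(size=(2, 5))

U, S, Vt = np.linalg.svd(X, full_matrices=False)
k = 2
code = X @ Vt[:k].T        # encoder: project onto the top-k directions
recon = code @ Vt[:k]      # decoder: map the code back to R^5
print(np.abs(X - recon).max())   # ~0: rank-2 data is reconstructed exactly
```

Nonlinear autoencoders generalize this idea; the bottleneck is what forces a compressed representation.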