Tutorial on deep learning
The tutorial will be an introduction to the general domain of deep learning. Attendees are expected to have a basic knowledge of Python programming and of basic probability concepts, together with linear algebra formalism (vector manipulation). There will be two types of classes: theoretical presentations and computer-lab implementations (in Python). Registration is mandatory to attend the tutorial; please register before the deadline (see the "Dates and deadlines" page).
Presenter: Gabriel Turinici
Title: "Deep learning: from mathematical setting to Python implementations"
Content (if time allows):
1/ Deep learning: major applications, references, culture
2/ Types of approaches: supervised, reinforcement, unsupervised
3/ Neural networks: presentation of objects: neurons, operations, loss function, optimization, architecture
4/ Stochastic optimization algorithms and proof of convergence of SGD
5/ Gradient computation by back-propagation
6/ Implementation in "pure Python" of a dense layer network
7/ Convolutional networks (CNN): filters, layers, architectures
8/ Keras implementation of a CNN
9/ Techniques: regularization, hyperparameters
10/ Unsupervised deep learning: generative networks and generative AI (GAN, VAE), Stable diffusion, LLMs
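To give a flavor of items 3 to 6 of the outline, here is a minimal sketch (in "pure Python", using only the standard library; this is an illustration, not the course's official code) of a one-hidden-layer dense network trained by stochastic gradient descent with hand-coded back-propagation, fitting a toy target y = sin(x):

```python
import math
import random

random.seed(0)

H = 8                                   # hidden-layer width (illustrative choice)
# Parameters of a 1 -> H -> 1 network, small random initialization
W1 = [random.gauss(0, 0.5) for _ in range(H)]
b1 = [0.0] * H
W2 = [random.gauss(0, 0.5) for _ in range(H)]
b2 = 0.0

# Toy data set: 256 points of y = sin(x) on [-3, 3]
xs = [-3 + 6 * i / 255 for i in range(256)]
ys = [math.sin(x) for x in xs]

lr = 0.05                               # learning rate
for step in range(10000):
    i = random.randrange(len(xs))       # SGD: pick one random sample per step
    x, t = xs[i], ys[i]
    # Forward pass
    h = [math.tanh(W1[j] * x + b1[j]) for j in range(H)]
    out = sum(W2[j] * h[j] for j in range(H)) + b2
    # Backward pass (chain rule): loss is the squared error (out - t)^2
    d_out = 2 * (out - t)
    for j in range(H):
        d_h = d_out * W2[j] * (1 - h[j] ** 2)   # tanh'(z) = 1 - tanh(z)^2
        W2[j] -= lr * d_out * h[j]              # SGD parameter updates
        b1[j] -= lr * d_h
        W1[j] -= lr * d_h * x
    b2 -= lr * d_out

def predict(x):
    h = [math.tanh(W1[j] * x + b1[j]) for j in range(H)]
    return sum(W2[j] * h[j] for j in range(H)) + b2

# Mean squared error over the training set after training
mse = sum((predict(x) - t) ** 2 for x, t in zip(xs, ys)) / len(xs)
print(f"final MSE: {mse:.4f}")
```

The same model can then be re-expressed in a few lines of Keras/TensorFlow, which is the transition made in items 7 and 8 of the outline.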
Requirements:
- Python programming
- fair mathematical knowledge of linear algebra and probability
Links to slides and programs used in the course
- presentation document: will be available soon; participants, please contact us in the meantime
- gradient formulas
- pure Python example: working version; solution
- TensorFlow equivalent: Python (txt file), notebook
- non-sequential models: Python (txt file), notebook
- LLM: chat (Python txt file)