
Alfredo Canziani (冷在) @UCupQLyNchb9-2Z5lmUOIijw@youtube.com

39K subscribers

Music, math, and deep learning from scratch


[01:07] Chapter 2, video 4–6
[58:59] 06 – Optimisation and gradient ascent
[59:12] 05 – Multi-class perceptron, binary and multi-class logistic regression
[56:05] 04 – Binary classifier evaluation, binary perceptron
[00:46] Chapter 1, video 1–3
[01:00:36] 03 – Naïve Bayes parameter estimation and Laplace smoothing
[01:06:41] 02 – Discrete probability recap, Naïve Bayes classification
[02:48] 00 – Course introduction
[01:05:08] 01 – Course first part recap, Naïve Bayes intro
[01:43:43] 14 – From latent-variable EBM (K-means, sparse coding) to target prop to autoencoders, step-by-step
[53:14] 07 – Classification, an energy perspective – PyTorch 5-step training code
[01:47:39] 06 – Classification, an energy perspective – Backprop and contrastive learning
[50:30] 05 – Classification, an energy perspective – Notation and introduction
[01:07:19] 03 – Inference with neural nets
[02:09] 00 – Intro to NYU Deep Learning Fall 2022 playlist
[01:05:28] 10P – Non-contrastive joint embedding methods (JEMs) for self-supervised learning (SSL)
[56:52] 09P – Contrastive joint embedding methods (JEMs) for self-supervised learning (SSL)
[02:12:36] 14L – Lagrangian backpropagation, final project winners, and Q&A session
[01:51:32] 13L – Optimisation for Deep Learning
[01:54:23] 07L – PCA, AE, K-means, Gaussian mixture model, sparse coding, and intuitive VAE
[01:54:44] 08L – Self-supervised learning and variational inference
[02:00:29] 09L – Differentiable associative memories, attention, and transformers
[01:14:45] 14 – Prediction and Planning Under Uncertainty
[01:48:54] 06L – Latent variable EBMs for structured prediction
[01:51:31] 05L – Joint embedding method and latent variable energy based models (LV-EBMs)
[01:01:22] 13 – The Truck Backer-Upper
[51:41] 04L – ConvNet in practice
[01:59:48] 03L – Parameter sharing: recurrent and convolutional nets
[01:42:27] 02L – Modules and architectures
[01:51:04] 01L – Gradient descent and the backpropagation algorithm
[01:10:23] 12 – Planning and control
[01:57:56] 12L – Low resource machine translation
[57:34] 11 – Graph Convolutional Networks (GCNs)
[01:36:13] 10L – Self-supervised learning in computer vision
[01:55:04] 11L – Speech recognition and Graph Transformer Networks
[01:12:01] 10 – Self / cross, hard / soft attention and the Transformer
[01:07:51] 09 – AE, DAE, and VAE with PyTorch; generative adversarial networks (GAN) and code
[01:00:35] 08 – From LV-EBM to target prop to (vanilla, denoising, contractive, variational) autoencoder
[56:42] 07 – Unsupervised learning: autoencoding the targets
[50:18] 01 – History and resources
[02:10] Behind the scenes
[01:04:49] 06 – Latent Variable Energy Based Models (LV-EBMs), training
[10:42] 05.2 – But what are these EBMs used for?
[01:01:05] 05.1 – Latent Variable Energy Based Models (LV-EBMs), inference
[01:05:36] 04.2 – Recurrent neural networks, vanilla and gated (LSTM)
[01:09:13] 04.1 – Natural signals properties and the convolution
[01:01:54] 02 – Neural nets: rotation and squashing
[01:05:48] 03 – Tools, classification with neural nets, PyTorch implementation
[01:11:24] Supervised and self-supervised transfer learning (with PyTorch Lightning)
[58:57] Week 15 – Practicum part B: Training latent variable energy based models (EBMs)
[59:05] Week 15 – Practicum part A: Inference for latent variable energy based models (EBMs)
[47:02] Matrix multiplication, signals, and convolutions
[01:11:28] Week 14 – Practicum: Overfitting and regularization, and Bayesian neural nets
[02:07:31] Week 14 – Lecture: Structured prediction with energy based models
[02:00:23] Week 13 – Lecture: Graph Convolutional Networks (GCNs)
[01:10:02] Week 13 – Practicum: Graph Convolutional Neural Networks (GCN)
[01:40:57] Week 12 – Lecture: Deep Learning for Natural Language Processing (NLP)
[01:18:02] Week 12 – Practicum: Attention and the Transformer
[01:53:44] Week 11 – Lecture: PyTorch activation and loss functions
[01:23:19] Week 11 – Practicum: Prediction and Policy learning Under Uncertainty (PPUU)