About half a year ago, I organized all my deep learning-related videos in a handy blog post to have everything in one place.

Since many people liked that post, and because I like to use my winter break to get organized, I thought I could free two birds with one key by compiling the list below.

Here you'll find a list of approximately 90 machine learning lectures I recorded in 2020 and 2021. Once again, I hope this is useful to you!

PS: Of course, all code examples are in Python :)

Table of Contents

Part 1: Introduction

L01 - Course overview, introduction to machine learning

Videos & Material
1 🎥 1.1 Course overview (30:41) 📝 Slides 📝 Notes
2 🎥 1.2 What is Machine Learning (20:13)
3 🎥 1.3 Categories of Machine Learning (15:08)
4 🎥 1.4 Notation (30:07)
5 🎥 1.5 ML applications (16:25)
6 🎥 1.6 ML motivation (33:07)

L02 - Introduction to Supervised Learning and k-Nearest Neighbors Classifiers

Videos & Material
1 🎥 2.1 Introduction to NN (21:00) 📝 Slides 📝 Notes
2 🎥 2.2 Nearest neighbor decision boundary (25:40)
3 🎥 2.3 K-nearest neighbors (14:13)
4 🎥 2.4 Big O of K-nearest neighbors (38:23)
5 🎥 2.5 Improving k-nearest neighbors (26:52)
6 🎥 2.6 K-nearest neighbors in Python (50:12) 🎮 02_knn_demo.ipynb
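
If you'd like a quick taste of what the k-nearest neighbors videos cover, here is a minimal sketch using scikit-learn. (This is my own toy example on the Iris dataset, not necessarily the code from 02_knn_demo.ipynb.)

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

# Load the Iris dataset and hold out 30% as a test set
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=123, stratify=y)

# k=5 neighbors, plain majority voting among the neighbors' labels
knn = KNeighborsClassifier(n_neighbors=5)
knn.fit(X_train, y_train)
test_acc = knn.score(X_test, y_test)
```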

Part 2: Computational foundations

L03 - Using Python

Videos & Material
1 🎥 3.1 (Optional) Python overview (22:57) 📝 Notes
2 🎥 3.2 (Optional) Python setup (19:21)
3 🎥 3.3 (Optional) Running Python code (32:00)

L04 - Introduction to Python’s scientific computing stack

Videos & Material
1 🎥 4.1 Intro to NumPy (31:42) 🎮 04_scipython__code.ipynb
2 🎥 4.2 NumPy Array Construction and Indexing (16:09)
3 🎥 4.3 NumPy Array Math and Universal Functions (24:55)
4 🎥 4.4 NumPy Broadcasting (4:38)
5 🎥 4.5 NumPy Advanced Indexing – Memory Views & Copies (15:15)
6 🎥 4.6 NumPy Random Number Generators (12:39)
7 🎥 4.7 Reshaping NumPy Arrays (10:45)
8 🎥 4.8 NumPy Comparison Operators and Masks (9:13)
9 🎥 4.9 NumPy Linear Algebra Basics (11:46)
10 🎥 4.10 Matplotlib (19:47)
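
To give you an idea of the NumPy topics above, here is a tiny sketch touching broadcasting, comparison masks, and reshaping. (My own example, not taken from the lecture notebook.)

```python
import numpy as np

A = np.arange(6).reshape(2, 3)   # [[0, 1, 2], [3, 4, 5]]

# Broadcasting: the (3,)-shaped vector of column means is stretched
# across both rows of the (2, 3) array
centered = A - A.mean(axis=0)

# Comparison operators produce boolean masks; masks select elements
mask = A > 2
picked = A[mask]                 # the elements 3, 4, 5

# Reshaping returns a view where possible (no data is copied)
v = A.reshape(3, 2)
```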

L05 - Data preprocessing and machine learning with scikit-learn

Videos & Material
1 🎥 5.1 Reading a Dataset from a Tabular Text File (24:11) 📝 Slides 🎮 05-preprocessing-and-sklearn__notes.ipynb
2 🎥 5.2 Basic data handling (30:27)
3 🎥 5.3 Object Oriented Programming & Python Classes (21:47)
4 🎥 5.4 Intro to Scikit-learn (12:19)
5 🎥 5.5 Scikit-learn Transformer API (47:01)
6 🎥 5.6 Scikit-learn Pipelines (26:16)
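
As a flavor of the scikit-learn API videos, here is a minimal pipeline sketch. (The dataset and estimator choices are my own assumptions, not necessarily what the lecture notebook uses.)

```python
from sklearn.datasets import load_wine
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_wine(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=1, stratify=y)

# fit() calls fit_transform() on the scaler, then fit() on the classifier;
# score() calls transform() on the scaler, then predict() on the classifier
pipe = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
pipe.fit(X_train, y_train)
test_acc = pipe.score(X_test, y_test)
```

The nice thing about pipelines is that the scaler is fit only on the training folds, which avoids leaking test-set information into preprocessing.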

Part 3: Tree-based methods

L06 - Decision trees

Videos & Material
1 🎥 6.1 Intro to Decision Trees (25:04) 📝 Slides 📝 Notes
2 🎥 6.2 Recursive algorithms & Big-O (38:19)
3 🎥 6.3 Types of decision trees (27:34)
4 🎥 6.4 Splitting criteria (47:53)
5 🎥 6.5 Gini & Entropy versus misclassification error (21:02)
6 🎥 6.6 Improvements & dealing with overfitting (33:11)
7 🎥 6.7 Code Example (18:44) 🎮 06-trees_demo.ipynb
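
For a quick impression of the decision tree topics above, here is a minimal sketch. (My own toy example, not the code from 06-trees_demo.ipynb.)

```python
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)

# Limiting max_depth is one simple way of dealing with overfitting;
# criterion="gini" is one of the splitting criteria from video 6.4
tree = DecisionTreeClassifier(criterion="gini", max_depth=3, random_state=0)
tree.fit(X, y)
train_acc = tree.score(X, y)
```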

L07 - Ensemble methods

Videos & Material
1 🎥 7.1 Intro to ensemble methods (15:06) 📝 Slides 📝 Notes 🎮 07_code-from-slides.ipynb
2 🎥 7.2 Majority Voting (23:31)
3 🎥 7.3 Bagging (37:45)
4 🎥 7.4 Boosting and AdaBoost (39:39)
5 🎥 7.5 Gradient Boosting (1:04:04)
6 🎥 7.6 Random Forests (32:28)
7 🎥 7.7 Stacking (34:12)
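
To illustrate the majority voting idea from 7.2, here is a small sketch. (My own choice of base classifiers and dataset, not necessarily the lecture's.)

```python
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)

# Hard voting: predict the class label that the majority of the
# three base classifiers agrees on
ensemble = VotingClassifier(
    estimators=[
        ("knn", KNeighborsClassifier(n_neighbors=5)),
        ("tree", DecisionTreeClassifier(max_depth=4, random_state=0)),
        ("forest", RandomForestClassifier(n_estimators=100, random_state=0)),
    ],
    voting="hard",
)
scores = cross_val_score(ensemble, X, y, cv=5)
mean_acc = scores.mean()
```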

Part 4: Model evaluation

L08 - Model evaluation 1 – overfitting

Videos & Material
1 🎥 8.1 Intro to overfitting and underfitting (21:16) 📝 Slides 📝 Notes
2 🎥 8.2 Intuition behind bias and variance (15:34)
3 🎥 8.3 Bias-Variance Decomposition of the Squared Error (30:50)
4 🎥 8.4 Bias and Variance vs Overfitting and Underfitting (7:22)
5 🎥 8.5 Bias-Variance Decomposition of the 0/1 Loss (23:21)
6 🎥 8.6 Different Uses of the Term “Bias” (17:47)
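
The bias-variance decomposition discussed in 8.2 and 8.3 can be illustrated by simulation: fit the same model on many independent training sets and look at how its predictions at one query point scatter around the true function value. Here is a rough sketch (my own setup, using a simple nearest-neighbor-average "model"):

```python
import numpy as np

rng = np.random.default_rng(0)

def true_f(x):
    return np.sin(x)

x_query = 1.0    # point at which we decompose the expected squared error
noise_sd = 0.3
preds = []

# Draw many independent training sets and refit the model each time;
# the "model" predicts the mean target of the 5 nearest training points
for _ in range(500):
    X = rng.uniform(0.0, 2.0 * np.pi, size=50)
    y = true_f(X) + rng.normal(0.0, noise_sd, size=50)
    nearest = np.argsort(np.abs(X - x_query))[:5]
    preds.append(y[nearest].mean())

preds = np.array(preds)
bias_sq = (preds.mean() - true_f(x_query)) ** 2   # squared bias
variance = preds.var()                            # variance across training sets
```

For this smooth target function, the squared bias comes out tiny, and the variance is roughly the label-noise variance divided by the neighborhood size.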

L09 - Model evaluation 2 – confidence intervals

Videos & Material
1 🎥 9.1 Introduction (21:22) 📝 Slides 📝 Notes
2 🎥 9.2 Holdout Evaluation (28:59) 🎮 09-eval2-ci__1_distribution-and-subsampling.ipynb
3 🎥 9.3 Holdout Model Selection (7:13)
4 🎥 9.4 ML Confidence Intervals via Normal Approximation (16:17)
5 🎥 9.5 Resampling and Repeated Holdout (19:27) 🎮 09-eval2-ci__2_holdout-and-repeated-sampling.ipynb

🎮 09-eval2-ci__3_pessimistic-bias-in-holdout.ipynb
6 🎥 9.6 Bootstrap Confidence Intervals (28:32) 🎮 09-eval2-ci__4-confidence-intervals_iris.ipynb

🎮 09-eval2-ci__4-confidence-intervals_mnist.ipynb
7 🎥 9.7 The .632 and .632+ Bootstrap methods (29:16)
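
To illustrate the percentile bootstrap from 9.6, here is a minimal sketch on a made-up sample of accuracy estimates (all numbers are synthetic, purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(123)

# Hypothetical sample of 40 accuracy estimates (e.g., from repeated holdout)
acc = rng.normal(loc=0.85, scale=0.03, size=40)

# Percentile bootstrap: resample with replacement, recompute the statistic,
# and read the CI off the empirical distribution of bootstrap statistics
boot_means = np.array([
    rng.choice(acc, size=acc.size, replace=True).mean()
    for _ in range(2000)
])
lower, upper = np.percentile(boot_means, [2.5, 97.5])  # 95% CI for the mean
```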

L10 - Model evaluation 3 – cross-validation and model selection

Videos & Material
1 🎥 10.1 Cross-validation lecture overview (11:16) 📝 Slides 📝 Notes
2 🎥 10.2 Hyperparameters (17:50)
3 🎥 10.3 k-fold CV for model evaluation (27:40)
4 🎥 10.4 k-fold CV for model eval. code examples (21:13) 🎮 10_04_kfold-eval.ipynb
5 🎥 10.5 k-fold CV for model selection (17:27)
6 🎥 10.6 k-fold CV for model selection code examples (25:14) 🎮 10_06_kfold-sele.ipynb
7 🎥 10.7 k-fold CV 1-standard error method (12:26)
8 🎥 10.8 k-fold CV 1-standard error method code example (9:10) 🎮 10_08_1stderr.ipynb
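
The k-fold CV model selection workflow from 10.5 and 10.6 boils down to something like this sketch (my own toy example, not the code from the notebooks):

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

X, y = load_iris(return_X_y=True)

# Model selection: pick the hyperparameter value with the best
# average performance across the 10 folds
mean_scores = {}
for k in (1, 3, 5, 7, 9):
    scores = cross_val_score(KNeighborsClassifier(n_neighbors=k), X, y, cv=10)
    mean_scores[k] = scores.mean()

best_k = max(mean_scores, key=mean_scores.get)
```

(The 1-standard-error method from 10.7 would additionally prefer the simplest model whose score is within one standard error of the best.)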

L11 - Model evaluation 4 – algorithm selection

Videos & Material
1 🎥 11.1 Lecture Overview (12:37) 📝 Slides 📝 Notes
2 🎥 11.2 McNemar’s Test for Pairwise Classifier Comparison (20:45)
3 🎥 11.3 Multiple Pairwise Comparisons (7:12)
4 🎥 11.4 Statistical Tests for Algorithm Comparison (8:15)
5 🎥 11.5 Nested CV for Algorithm Selection (17:19)
6 🎥 11.6 Nested CV for Algorithm Selection Code Example (24:34) 🎮 11-eval4-algo__nested-cv_compact.ipynb

🎮 11-eval4-algo__nested-cv_verbose1.ipynb

🎮 11-eval4-algo__nested-cv_verbose2.ipynb
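
Nested CV, as covered in 11.5 and 11.6, wraps hyperparameter tuning (inner loop) inside an outer evaluation loop. Here's a compact sketch with scikit-learn; the parameter grid and dataset are my own choices, not taken from the nested-cv notebooks:

```python
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import GridSearchCV, cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)

# Inner loop: tune max_depth via grid search (2-fold to keep it cheap);
# outer loop: 5-fold CV gives a performance estimate untouched by the tuning
inner = GridSearchCV(
    DecisionTreeClassifier(random_state=0),
    param_grid={"max_depth": [2, 4, 6]},
    cv=2,
)
outer_scores = cross_val_score(inner, X, y, cv=5)
```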

L12 - Model evaluation 5 – evaluation and performance metrics

Videos & Material
1 🎥 12.0 Lecture Overview (7:56) 📝 Slides
2 🎥 12.1 Confusion Matrix (28:08) 🎮 12_1_confusion-matrix.ipynb
3 🎥 12.2 Precision, Recall, and F1 Score (11:47) 🎮 12_2_pre-recall-f1.ipynb
4 🎥 12.3 Balanced Accuracy (9:38) 🎮 12_3_balanced-acc.ipynb
5 🎥 12.4 Receiver Operating Characteristic (18:37) 🎮 12_4_roc.ipynb
6 🎥 12.5 Extending Binary Metric to Multiclass Problems (21:11)
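
The metrics from 12.1 and 12.2 fit in one tiny sketch; the labels below are made up so the numbers are easy to verify by hand:

```python
from sklearn.metrics import (confusion_matrix, f1_score,
                             precision_score, recall_score)

y_true = [0, 0, 0, 0, 1, 1, 1, 1, 1, 1]
y_pred = [0, 0, 0, 1, 0, 1, 1, 1, 1, 1]

cm = confusion_matrix(y_true, y_pred)        # rows: true class, columns: predicted
precision = precision_score(y_true, y_pred)  # TP / (TP + FP) = 5/6
recall = recall_score(y_true, y_pred)        # TP / (TP + FN) = 5/6
f1 = f1_score(y_true, y_pred)                # harmonic mean of precision and recall
```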

Part 5: Dimensionality reduction

L13 - Feature selection

Videos & Material
1 🎥 13.0 Introduction to Feature Selection (16:09) 📝 Slides
2 🎥 13.1 The Different Categories of Feature Selection (11:38)
3 🎥 13.2 Filter Methods for Feature Selection – Variance Threshold (19:52) 🎮 01_variance-threshold.ipynb
4 🎥 13.3.1 L1-regularized Logistic Regression as Embedded Feature Selection (23:32) 🎮 02_lasso-path.ipynb
5 🎥 13.3.2 Decision Trees & Random Forest Feature Importance (39:42) 🎮 03_random-forest.ipynb
6 🎥 13.4.1 Recursive Feature Elimination (28:51) 🎮 04_recursive-feature-elimination.ipynb
7 🎥 13.4.2 Feature Permutation Importance (16:55)
8 🎥 13.4.3 Permutation importance code example (27:37) 🎮 05_permutation-importance.ipynb

🎮 06_random_feat_as_control.ipynb

🎮 07_perm-imp-with-correlated-feats.ipynb
9 🎥 13.4.4 Sequential feature selection (29:59)
10 🎥 13.4.5 Sequential feature selection code example (23:35) 🎮 08_sequential-feature-selection.ipynb
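
Sequential (forward) feature selection, as in 13.4.4 and 13.4.5, can be sketched with scikit-learn's SequentialFeatureSelector. (Note: this is my own minimal example; the lecture notebooks may use a different implementation, e.g., mlxtend.)

```python
from sklearn.datasets import load_wine
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.neighbors import KNeighborsClassifier

X, y = load_wine(return_X_y=True)   # 13 features

# Forward selection: start with no features, greedily add the feature
# that improves 5-fold CV performance the most, stop at 4 features
sfs = SequentialFeatureSelector(
    KNeighborsClassifier(n_neighbors=3),
    n_features_to_select=4,
    direction="forward",
    cv=5,
)
sfs.fit(X, y)
selected_mask = sfs.get_support()   # boolean mask over the 13 features
```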

L14 - Feature extraction

TBD: I plan to add more videos in the future when time permits. You can subscribe to my YouTube channel to get notified.

Part 6: Bayesian methods

L15 - Introduction to Bayesian methods for machine learning

TBD

L16 - Applying naive Bayes

TBD