OpenMLU Newsletter - Issue #1
By Sanny Kim • Issue #1 • View online
Hi everyone,
This is the first edition of the OpenMLU newsletter 🥳. With this newsletter, I’m hoping to curate and highlight ML resources that will help you become a better ML learner and practitioner.
The first theme of this newsletter will be hidden gems 💎. Everyone has heard about deeplearning.ai, fast.ai or CS231n, but did you know you can access Stanford’s CS224w Machine Learning with Graphs or download the book Elements of Causal Inference entirely for free? Keep reading to learn about twenty of the best ML resources that have flown under the radar and deserve more love.
If you have any questions, feedback or suggestions for the next issue, feel free to let me know! You can always reach me via 📩.

This edition will focus on Causal Inference, Computer Vision, Deep Learning, Graphs, ML Engineering and Robotics. Next time the focus will be on topics like GOFAI, NLP and Reinforcement Learning. 
Causal Inference
An introduction to causal inference and some of its connections to ML.
  • Elements of Causal Inference (2017): A textbook that introduces the reader to the world of causality and some of its connections to Machine Learning. 200 pages of content on the cause-effect problem, multivariate causal models, hidden variables, time series and more. Alternatively, this 4-part lecture series by Peters covers many of the same topics as the book. And for a more up-to-date survey of Causality x ML, Schölkopf’s paper will be your best bet.
  • MLSS Africa (2019): In addition to a collection of other great talks, this Machine Learning Summer School has recorded tutorials on Causal Discovery by Bernhard Schölkopf and Causal Inference in Everyday ML by Ferenc Huszár. For an even more recent causality tutorial by Schölkopf, head to this year’s virtual MLSS recordings.
  • Duke Causal Inference bootcamp (2015): Over 100 videos to understand ideas like counterfactuals, instrumental variables, differences-in-differences, regression discontinuity design, etc. Imo the most approachable and complete video series on Causal Inference (although it’s definitely rooted in an Economics perspective rather than CS/ML, i.e. a lot closer to Gary King’s work than Bernhard Schölkopf’s).
  • Online Causal Inference Seminar (2020-present): For a collection of talks on current research, check out this virtual seminar. Talks by researchers like Andrew Gelman, Caroline Uhler or Ya Xu will give you an overview of the frontiers of causal inference in both industry and academia.
Computer Vision
The syllabus for TUM's Advanced Computer Vision course.
  • UW The Ancient Secrets of CV (2018): Created by the first author of YOLO, this is likely the most well-rounded computer vision course, as it teaches not only the deep learning side of CV but also “older” methods like SIFT and optical flow.
  • UMichigan Deep Learning for CV (2019): An evolution of the beloved CS231n, this course is taught by one of its former head instructors, Justin Johnson. Similar in many ways, the UMichigan version is more up-to-date and includes lectures on Transformers, 3D and video, plus Colab/PyTorch homework.
  • TUM Advanced Deep Learning for Computer Vision (2020): This course is great for anyone who has already taken an intro CV or DL course and wants to explore ideas like neural rendering, interpretability and GANs further. Taught by Laura Leal-Taixé and Matthias Niessner.
  • MIT Vision Seminar (2020-present): A set of recorded talks by various vision researchers presenting their current projects and thoughts. Devi Parikh’s talk on language, vision and applications of ML in creative pursuits, as well as Yuval Bahat’s talk on explorable super-resolution and some of its potential applications, were quite fun.
Deep Learning
Apparently a homage to the HBO series Chernobyl.
  • Stanford Analyses/Theories of Deep Learning (2017 & 2019): Whether it’s ML from a robustness perspective, overparameterization of neural nets or deep learning through random matrix theory, Stats 385 has a myriad of fascinating talks on theoretical deep learning. It’s a shame most of these fantastic lectures only have a few hundred views.
  • Princeton IAS’ Workshops (2019-2020): The Institute for Advanced Study has held a series of workshops on topics such as new directions in ML as part of its Special Year on Optimization, Statistics and Theoretical Machine Learning. Most of these wonderful talks can be found on their YouTube channel.
  • TUM Intro to DL (2020): If the advanced CV course is a bit too difficult for you, this course (taught by the same professors) is the corresponding prerequisite you can take before starting the advanced version.
  • MIT Embodied Intelligence Seminar (2020-present): Similar to MIT’s Vision Seminar, but organized by MIT’s embodied intelligence group. Oriol Vinyals’ talk on the Deep Learning toolkit was really neat, as it was basically a bird’s-eye view of Deep Learning and its different submodules.
Graphs
Hidden on this unassuming site, you can download all the lectures from CS224w.
  • Stanford Machine Learning with Graphs (2019): The best-kept secret in this issue is Stanford’s CS224w. While some of the lectures sporadically appear on YouTube, you can watch every lecture simply by going to the website above. It covers topics like networks, data mining and graph neural networks. Taught by Jure Leskovec and Michele Catasta.
  • CMU Probabilistic Graphical Models (2020): If you want to learn more about PGMs, this course is the way to go. From the basics of graphical models to approximate inference to deep generative models, RL, causal inference and applications, it covers a lot of ground for just one course. Taught by Eric Xing.
ML Engineering
Time to train GPT-100.
  • Stanford Massive Computational Experiments, Painlessly (2018): Have you ever felt confused about cluster computing, containers or scaling experiments in the cloud? Then this is the right place for you. As the name indicates, you’ll come out of the course with a much better understanding of cloud computing, distributed tools and research infrastructure.
  • Full Stack Deep Learning (2019): This course is basically a bootcamp to learn best practices for your ML projects. From infrastructure to data management to model debugging to deployment, if there is one course you need to take to become a better ML Engineer, this is it.
Robotics
A massive collection of videos to learn robotics.
  • QUT Robot Academy (2017): A lot of robotics material online is concerned with the software side of the field, whereas this course (taught by Peter Corke) will teach you more about the basics of body dynamics, kinematics and joint control. Complementary resources that dive deeper into these concepts are Kevin Lynch’s 6-part MOOC (2017) and corresponding book (2019) on robot motion, kinematics, dynamics, planning, control and manipulation.
  • MIT Underactuated Robotics (2019): In this course Russ Tedrake will teach you about nonlinear dynamics and control of underactuated mechanical systems from a computational perspective. Throughout the lectures and readings you will apply newly acquired knowledge through problems expressed in the context of differential equations, ML, optimization, robotics and programming.
  • UC Berkeley Advanced Robotics (2019): With a bigger focus on ML, Pieter Abbeel guides you through the foundations of MDPs, Motion Planning, Particle Filters, Imitation Learning, Physics Simulations and many other topics. Particularly recommended to anyone with an interest in RL x Robotics.
Cheers
If you’ve made it this far, thanks for reading! The second issue is basically done as well, so expect the next edition to come out soon. I’ll try to send out a new issue once a month after that. If there’s anything you’d like to let me know, just hit reply on this issue 😄.
Stay safe everyone!
Sanny Kim