New YouTube video live! This one introduces Automatic Differentiation, a technique that allows us to scale up the computation of derivatives to machine-learning scale.
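To give a flavor of what AutoDiff looks like in practice, here's a minimal, hypothetical PyTorch sketch (not the video's exact demo) that differentiates y = x² at x = 5:

```python
import torch

# Track operations on x so gradients can flow back to it
x = torch.tensor(5.0, requires_grad=True)
y = x ** 2     # forward pass; PyTorch records the computation graph
y.backward()   # reverse-mode autodiff computes dy/dx
print(x.grad)  # tensor(10.), matching the analytic derivative 2x = 10
```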
A new video for my "Calculus for ML" course is published on YouTube every Wednesday. The playlist is here.
More detail about my broader "ML Foundations" curriculum and all of the associated open-source code is available on GitHub here.
Automatic Differentiation – Segment 3 of Subject 3, "Limits & Derivatives" – Machine Learning Foundations
Automatic Differentiation is a computational technique that allows us to move beyond calculating derivatives by hand and scale up the calculation of derivatives to the massive scales that are common in machine learning.
The YouTube videos in this segment, which we'll release every Wednesday, introduce AutoDiff in the two most important Python AutoDiff libraries: PyTorch and TensorFlow.
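For a taste of the TensorFlow side, here's a minimal sketch (illustrative, not the segment's actual demo) computing dy/dx for y = x² with tf.GradientTape:

```python
import tensorflow as tf

x = tf.Variable(5.0)
with tf.GradientTape() as tape:
    y = x ** 2               # operations on the Variable are recorded by the tape
dy_dx = tape.gradient(y, x)  # autodiff yields dy/dx = 2x
print(dy_dx)                 # tf.Tensor(10.0, shape=(), dtype=float32)
```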
My growing "Calculus for ML" course is available on YouTube here.
More detail about my broader "ML Foundations" curriculum and all of the associated open-source code is available on GitHub here.
The Power Rule on a Function Chain — Topic 61 of Machine Learning Foundations
This is the FINAL video (of nine) in my Machine Learning Foundations series on the Derivative Rules. It merges the Power Rule and the Chain Rule into a single easy step.
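For reference, the combined rule in symbols: if $y = [u(x)]^n$, then

$$\frac{dy}{dx} = n\,[u(x)]^{n-1}\,\frac{du}{dx}$$

For example, $\frac{d}{dx}(3x + 1)^2 = 2(3x + 1) \cdot 3 = 18x + 6$.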
Next begins a chunk of long, meaty videos on Automatic Differentiation — i.e., using the PyTorch and TensorFlow libraries to, well, automatically differentiate equations (e.g., ML models) instead of needing to do it painstakingly by hand.
Because these forthcoming videos are so meaty, we're moving from a twice-weekly publishing schedule to a weekly one: Starting next week, we'll publish a new video to YouTube every Wednesday.
My growing "Calculus for ML" course is available on YouTube here.
More detail about my broader "ML Foundations" curriculum and all of the associated open-source code is available on GitHub here.
Advanced Exercises on Derivative Rules — Topic 60 of Machine Learning Foundations
Having now covered the product rule, quotient rule, and chain rule, we're well-prepared for advanced exercises that confirm your comprehension of all of the derivative rules in my Machine Learning Foundations series.
There’s just one quick derivative rule left after this — one that conveniently combines two of the rules we’ve already covered — and then we’re ready to move on to the next segment of videos on Automatic Differentiation with PyTorch and TensorFlow.
New videos are published every Monday and Thursday to my "Calculus for ML" course, which is available on YouTube here.
More detail about my broader "ML Foundations" curriculum and all of the associated open-source code is available on GitHub here.
The Chain Rule for Derivatives — Topic 59 of Machine Learning Foundations
Today's video introduces the Chain Rule — arguably the single most important differentiation rule for ML. It facilitates several of the most ubiquitous ML algorithms, such as gradient descent and backpropagation.
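In symbols, if $y = f(u)$ and $u = g(x)$, then

$$\frac{dy}{dx} = \frac{dy}{du} \cdot \frac{du}{dx}$$

For example, if $y = u^2$ and $u = 3x + 1$, then $\frac{dy}{dx} = 2u \cdot 3 = 6(3x + 1)$.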
Gradient descent and backprop will be covered in great detail later in my "Machine Learning Foundations" video series. This video is critical for understanding those applications.
New videos are published every Monday and Thursday to my "Calculus for ML" course, which is available on YouTube.
More detail about my broader "ML Foundations" curriculum and all of the associated open-source code is available on GitHub.
The Quotient Rule for Derivatives — Topic 58 of Machine Learning Foundations
This is the penultimate Derivative Rule and then we're moving onward to AutoDiff with TensorFlow and PyTorch! The Quotient Rule is analogous to the Product Rule introduced on Monday but is for division instead of multiplication.
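In symbols, for functions $u$ and $v$ of $x$ (with $v \neq 0$):

$$\frac{d}{dx}\left(\frac{u}{v}\right) = \frac{v\,\frac{du}{dx} - u\,\frac{dv}{dx}}{v^2}$$

Unlike with the Product Rule, the order of the terms in the numerator matters here.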
New videos are published every Monday and Thursday. The playlist for my "Calculus for ML" course is here.
More detail about my broader "ML Foundations" series and all of the associated open-source code is available on GitHub here.
The Product Rule for Derivatives
Today's video is on the Product Rule, a relatively advanced Derivative Rule. Only a couple such rules remain and then we move onward to Automatic Differentiation with PyTorch and TensorFlow.
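In symbols, for functions $u$ and $v$ of $x$:

$$\frac{d}{dx}(uv) = u\,\frac{dv}{dx} + v\,\frac{du}{dx}$$

As a quick sanity check, $\frac{d}{dx}(x^2 \cdot x^3) = x^2 \cdot 3x^2 + x^3 \cdot 2x = 5x^4$, which matches $\frac{d}{dx}x^5$.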
New videos are published every Monday and Thursday. The playlist for my "Calculus for ML" course is here.
More detail about my broader "ML Foundations" series and all of the associated open-source code is available on GitHub here.
Exercises on Derivative Rules — Topic 56 of Machine Learning Foundations
Today's YouTube video uses five fun exercises to test your understanding of the derivative rules we’ve covered so far: the Constant Rule, Power Rule, Constant-Multiple Rule, and Sum Rule.
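As a quick refresher before you attempt the exercises, those four rules are:

$$\frac{d}{dx}c = 0, \qquad \frac{d}{dx}x^n = nx^{n-1}, \qquad \frac{d}{dx}\left[c\,f(x)\right] = c\,f'(x), \qquad \frac{d}{dx}\left[f(x) + g(x)\right] = f'(x) + g'(x)$$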
New videos are published every Monday and Thursday. The playlist for my "Calculus for ML" course is here.
More detail about my broader "ML Foundations" series and all of the associated open-source code is available on GitHub here.
If you’d like a detailed walkthrough of the solutions to all the exercises in this video, check out my Udemy course, Mathematical Foundations of Machine Learning. See jonkrohn.com/udemy
The Constant Multiple Rule for Derivatives
Continuing my short series on Differentiation Rules, today’s video covers the Constant Multiple Rule. This rule is often used in conjunction with the Power Rule, which was covered in the preceding video, released on Monday.
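In symbols, for any constant $c$:

$$\frac{d}{dx}\left[c\,f(x)\right] = c\,\frac{d}{dx}f(x)$$

Combined with the Power Rule, for example, $\frac{d}{dx}\,4x^3 = 4 \cdot 3x^2 = 12x^2$.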
New videos are published every Monday and Thursday. The playlist for my "Calculus for ML" course is here.
More detail about my broader "ML Foundations" series and all of the associated open-source code is available on GitHub here.
The Power Rule for Derivatives
On Thursday, I published a video on the Constant Rule, the first video in a series on Differentiation Rules. Today, we continue the series with the Power Rule, arguably the most common and most important of all the rules.
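In symbols, the Power Rule states:

$$\frac{d}{dx}x^n = n\,x^{n-1}$$

For example, $\frac{d}{dx}x^3 = 3x^2$.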
New videos are published every Monday and Thursday. The playlist for my "Calculus for ML" course is here.
More detail about my broader "ML Foundations" series and all of the associated open-source code is available on GitHub here.
The Derivative of a Constant
This and the next several videos will provide you with clear and colorful examples of all of the most important differentiation rules. We kick these rules off with the Constant Rule.
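The rule itself could hardly be simpler: for any constant $c$,

$$\frac{d}{dx}c = 0$$

A constant doesn't change as $x$ changes, so its slope is zero everywhere.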
The derivative rules are critical to machine learning as they allow us to find the derivatives of cost functions. These cost-function derivatives are concatenated into the "gradient" that we descend to allow ML models to learn.
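In notation, for a cost $C$ and model parameters $w_1, \dots, w_n$, that gradient is the vector of partial derivatives:

$$\nabla C = \left[\frac{\partial C}{\partial w_1}, \frac{\partial C}{\partial w_2}, \dots, \frac{\partial C}{\partial w_n}\right]^T$$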
New videos are published every Monday and Thursday. The playlist for my "Calculus for ML" course is here.
More detail about my broader "ML Foundations" series and all of the associated open-source code is available on GitHub here.
Derivative Notation
In today's YouTube video, we detail all of the most common notation for derivatives. This lays the foundation for a fun, immediately forthcoming series of videos covering all of the major differentiation rules. Enjoy!
New videos are published every Monday and Thursday. The playlist for my "Calculus for ML" course is here.
More detail about my broader "ML Foundations" series and all of the associated open-source code is available on GitHub here.
How Derivatives Arise from Limits
In today's video, we use hands-on code demos in Python to find the slopes of curves with the Delta Method. While finding these slopes, we derive together — from first principles — the most important Differential Calculus formula.
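That formula is the limit definition of the derivative:

$$\frac{dy}{dx} = \lim_{\Delta x \to 0} \frac{f(x + \Delta x) - f(x)}{\Delta x}$$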
This video is part of a thematic segment of videos on Differentiation. In the forthcoming videos, we’ll cover derivative notation and a series of useful rules for differentiation.
New videos are published every Monday and Thursday. The playlist for my "Calculus for ML" course is here.
More detail about my broader "ML Foundations" series and all of the associated open-source code is available on GitHub here.
The Delta Method
In today's video, we use a Python code demo to develop a working understanding of the Delta Method, a centuries-old technique that enables us to determine the slope of a curve.
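To give a flavor of the approach, here's a minimal sketch (illustrative, not necessarily the video's exact demo) that approximates the slope of $y = x^2$ at $x = 2$ by shrinking delta:

```python
def f(x):
    return x ** 2

# Slope between x = 2 and a nearby point, i.e., "rise over run";
# as delta shrinks, the estimate approaches the true slope, 4
for delta in (1.0, 0.1, 0.001):
    slope = (f(2 + delta) - f(2)) / delta
    print(f"delta = {delta}: slope ~ {slope}")
```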
This video is the first in a thematic segment of videos on Differentiation. In Thursday's video, we'll build on what we covered today to derive — and deeply understand — the most common, most important equation in differential calculus.
New videos are published every Monday and Thursday. The playlist for my "Calculus for ML" course is here.
More detail about my broader "ML Foundations" series and all of the associated open-source code is available on GitHub here.
Exercises on Limits
Final YouTube video from my thematic segment on Limits out today! It's a handful of comprehension exercises. Starting Thursday, we'll begin releasing videos from a new Calculus segment, on derivatives and differentiation.
We release new videos from my "Calculus for Machine Learning" course on YouTube every Monday and Thursday. The playlist is here.
Calculating Limits
Today's video introduces Limits, a key stepping stone toward understanding Differential Calculus. This one has lots of interactive Python code demos and paper-and-pencil exercises to ensure learning the subject is both engaging and fun.
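As an illustrative sketch in the spirit of those demos (hypothetical, not necessarily the video's exact code), here we numerically approach $x = 1$ for $f(x) = \frac{x^2 - 1}{x - 1}$, which is undefined at $x = 1$ itself yet has a limit of 2 there:

```python
def f(x):
    return (x ** 2 - 1) / (x - 1)  # undefined at x = 1 itself

# Approach x = 1 from below and from above; both sides converge toward 2
for x in (0.9, 0.99, 0.999, 1.001, 1.01, 1.1):
    print(f"x = {x}: f(x) = {f(x)}")
```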
We release new videos from my "Calculus for Machine Learning" course on YouTube every Monday and Thursday. The playlist is here.
Linear Algebra for Machine Learning: Complete Math Course on YouTube
At a high level, my ML Foundations content can be broken into four subject areas: linear algebra, calculus, probability/stats, and computer science. The first quarter of the content, on linear algebra, stands alone as its own discrete course and is available on YouTube today.
The playlist for my complete Linear Algebra for Machine Learning course is on YouTube here. There are a total of 48 videos in the course, each of which is provided in this blog post. Click through for all the detail!
The History of Data
Last month, I thought I was taking a risk by doing an episode on the History of Algebra, but it was an unusually popular episode! To follow up, today's Five-Minute Friday is on the four-billion-year History of Data — hope you enjoy it 😁
You can watch or listen here.