Math for Machine Learning
Math 295 in Spring 2025
with professor Mark McClure
Handy links
- Syllabus
- Schedule
- The textbook
- Colab
- MyOpenMath
- The Math Lab!
Handouts
Colab notebooks
Demos
Class notes
Daily class notes with pointers to sources
Sources
The overall structure of these class notes largely follows the text Math for Machine Learning. The specific content and notation, though, follows a couple of open source textbooks fairly closely and takes some material directly from those texts per their open source licenses.
Here are a few more details and acknowledgements for those texts:
Linear Algebra from FCLA
The material on linear algebra is modeled after A First Course in Linear Algebra (FCLA) by Rob Beezer. The notation and definitions largely follow those of FCLA, and some of the examples are taken directly from FCLA, per its GNU Free Documentation License (GFDL).
If you find yourself wondering about some of the linear algebra terminology, you might consult FCLA’s list of notation and its list of other definitions.
Calculus from APEX
I plan to base some of the calculus material on Apex Calculus per its Creative Commons license.
Presentations and labs
- Day 0: Introduction
- Presentation
Comments
Today, we’ll look at a few groovy demos, get an overview of the course objectives, and look at the syllabus.
- Day 1: Single variable calculus
- Presentation
Comments
Today, we’ll look at a summary of the single variable calculus that you learn in Calculus I, with an emphasis on the things we’ll need for this class. In addition, we’ll discuss some techniques of numerical analysis that are part of single variable calculus, but not emphasized in most Calculus I courses.
Note that MyOpenMath HW over this material was assigned on the first day of class.
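One numerical technique in this vein is approximating a derivative with a difference quotient. Here's a minimal sketch of the central difference formula \(f'(x) \approx (f(x+h) - f(x-h))/(2h)\); the sample function and step size are illustrative choices, not taken from the course materials.

```python
# Central difference approximation of f'(x); the step size h
# and the test function are illustrative choices.

def central_difference(f, x, h=1e-5):
    """Approximate f'(x) using a symmetric difference quotient."""
    return (f(x + h) - f(x - h)) / (2 * h)

# Example: the derivative of x**2 at x = 3 should be close to 6.
approx = central_difference(lambda x: x**2, 3.0)
print(approx)
```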
- Day 2: Intro lab on Numerical Calculus
- The lab V1 and The lab V2
Comments
Today, we’ll mostly just play on the computer a bit together. After I give a little introductory spiel, you’ll read through the notebook and try both exercises.
You will learn some valuable tools here, like how to perform numerical integration and optimization on the computer with Python. Like most labs, this first one is meant to be low stress. If you come to class, work on the lab, and show me some progress, you’ll earn 20 points!
Note: There’s a second version of the lab that’s scaled back a bit from the first. In particular, the second version:
- has a more detailed description of basic Python syntax,
- removes all reference to integration and the normal distribution,
- replaces the integration material with a discussion of the bisection method for root finding, and
- uses fewer libraries; in particular, it drops references to symbolic and graphics libraries.
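For reference, the bisection method mentioned in the second version can be sketched in a few lines of plain Python; the tolerance and the sample function below are illustrative choices, not taken from the lab itself.

```python
# Bisection method for root finding: repeatedly halve an interval
# [a, b] on which f changes sign.

def bisect(f, a, b, tol=1e-10):
    """Find a root of f in [a, b], assuming f(a) and f(b) differ in sign."""
    if f(a) * f(b) > 0:
        raise ValueError("f(a) and f(b) must have opposite signs")
    while b - a > tol:
        m = (a + b) / 2
        if f(a) * f(m) <= 0:
            b = m  # the sign change is in the left half
        else:
            a = m  # the sign change is in the right half
    return (a + b) / 2

# Example: approximate sqrt(2) as the root of x**2 - 2 on [1, 2].
root = bisect(lambda x: x**2 - 2, 1.0, 2.0)
print(root)
```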
- Day 3: Multivariable calculus
- Presentation (With rotatable 3D images)
- Static presentation (Non-rotatable 3D images)
Comments
Today we’ll talk about the basics of multivariable calculus and doing mathematics in higher dimensions. That includes a careful discussion of three-dimensional space and the optimization of multivariable functions using partial derivatives. We’ll also use this material to introduce one of our major machine learning techniques, namely linear regression.
Note that there is a 10 problem MyOpenMath HW over this material worth 13 points.
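As a small taste of the connection between partial derivatives and regression: setting the partial derivatives of the sum of squared errors to zero yields closed-form formulas for the slope and intercept of the best-fit line. Here's a minimal sketch in plain Python; the data points are made up for illustration.

```python
# Simple linear regression via the closed-form least squares formulas,
# obtained by setting the partial derivatives of the error to zero.

def fit_line(xs, ys):
    """Return (slope, intercept) minimizing sum((y - (m*x + b))**2)."""
    n = len(xs)
    xbar = sum(xs) / n
    ybar = sum(ys) / n
    m = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / \
        sum((x - xbar) ** 2 for x in xs)
    b = ybar - m * xbar
    return m, b

# Points lying exactly on y = 2x + 1 should recover m = 2, b = 1.
m, b = fit_line([0, 1, 2, 3], [1, 3, 5, 7])
print(m, b)
```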
- Day 4: Systems and Matrices
- Presentation
Comments
Today we jump into linear algebra as a general tool to solve the kinds of equations that arise in linear regression.
There is a 4 problem MyOpenMath HW over this material worth 12 points.
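On the computational side, solving a linear system like the ones arising here is a one-liner with NumPy; the 2x2 system below is an illustrative example, not one from the homework.

```python
# Solving the linear system Ax = b numerically with NumPy.
import numpy as np

# The system  x + 2y = 5,  3x + 4y = 11  has solution x = 1, y = 2.
A = np.array([[1.0, 2.0], [3.0, 4.0]])
b = np.array([5.0, 11.0])
x = np.linalg.solve(A, b)
print(x)
```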
- Day 5: Vector Spaces - \(\mathbb R^n\) and More!
- Presentation
Comments
Today, we’ll dive deeply into the vector space \(\mathbb R^n\) and discuss the abstract concept of a vector space in general.
In lieu of HW, you should focus on the review sheet!
- Day 8: Linear transformations
- Presentation
Comments
After getting the exam back, we’ll jump into linear transformations on general vector spaces and on Euclidean space and see how far we get.
There is a 10 point MyOpenMath assignment with 6 problems over this material.
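In Euclidean space, a linear transformation is just multiplication by a matrix, and linearity means \(T(au + bv) = aT(u) + bT(v)\). Here's a minimal sketch of that check; the rotation matrix and vectors are illustrative choices.

```python
# A linear transformation on R^2 as matrix multiplication,
# together with a check of the linearity property.
import numpy as np

A = np.array([[0.0, -1.0], [1.0, 0.0]])  # rotation by 90 degrees

def T(v):
    return A @ v

u = np.array([1.0, 2.0])
v = np.array([3.0, -1.0])

# Linearity: T(2u + 3v) should equal 2*T(u) + 3*T(v).
lhs = T(2 * u + 3 * v)
rhs = 2 * T(u) + 3 * T(v)
print(lhs, rhs)
```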
- Day 9: Matrix inverses
- Presentation
Comments
Today, we’re going to discuss matrix inverses and related things like determinants. This will give us an opportunity to review linear independence and non-singular matrices as well.
There is a 9 point MyOpenMath assignment with three questions over this material.
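The basic relationships are easy to check on the computer: a matrix is non-singular exactly when its determinant is nonzero, and then its inverse satisfies \(A A^{-1} = I\). A minimal sketch with NumPy, using an illustrative matrix:

```python
# Matrix inverses and determinants with NumPy.
import numpy as np

A = np.array([[2.0, 1.0], [5.0, 3.0]])

d = np.linalg.det(A)     # nonzero, so A is non-singular
Ainv = np.linalg.inv(A)

# A @ Ainv should be the identity matrix.
print(d)
print(A @ Ainv)
```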
- Day 10: Geometry of Determinants
- Presentation
Comments
Today, we’re going to try to understand why determinants behave the way they do and we’re going to use some geometry to do it!
There is a 9 point MyOpenMath assignment with four questions over this material.
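The key geometric fact in two dimensions: the absolute value of the determinant of a 2x2 matrix is the area of the parallelogram spanned by its columns. A minimal sketch with illustrative vectors:

```python
# |det([u v])| equals the area of the parallelogram spanned by u and v.
import numpy as np

u = np.array([3.0, 0.0])
v = np.array([1.0, 2.0])

area = abs(np.linalg.det(np.column_stack([u, v])))
print(area)  # base 3 and height 2, so the area is 6
```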
- Day 11: Norms and inner products
- Presentation
Comments
Today, we’re going to introduce the dot product and its generalization, the inner product. We’ll use these to think about the geometry of a vector space.
There is an 8 point MyOpenMath assignment with three questions over this material.
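The dot product is what gives a vector space its geometry: it induces the norm \(\|v\| = \sqrt{v \cdot v}\) and the angle between vectors via \(\cos\theta = (u \cdot v)/(\|u\|\,\|v\|)\). A minimal sketch with illustrative vectors:

```python
# The dot product, the norm it induces, and the angle between vectors.
import numpy as np

u = np.array([1.0, 0.0])
v = np.array([1.0, 1.0])

dot = np.dot(u, v)
norm_v = np.linalg.norm(v)                      # sqrt(2)
cos_theta = dot / (np.linalg.norm(u) * norm_v)
theta = np.arccos(cos_theta)                    # pi/4, i.e. 45 degrees
print(dot, norm_v, theta)
```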
- Day 12: Orthogonal Projection
- Presentation
Comments
Today, we’re going to see how to project points in a vector space orthogonally (or perpendicularly) onto a subspace. We’ll also get very applied when we see how this provides a fresh look at the least squares problem and regression.
There’s no MyOpenMath assignment today. There is a problem sheet, though, that mostly focuses on problems that might make it on the exam next week.
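Concretely, projection onto the column space of a matrix \(A\) is given by \(P = A(A^TA)^{-1}A^T\), and applying \(P\) to a vector \(b\) produces the closest point to \(b\) in the subspace, which is exactly the least squares picture. A minimal sketch, with an illustrative matrix and vector:

```python
# Orthogonal projection onto the column space of A via
# P = A (A^T A)^{-1} A^T.
import numpy as np

A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0]])   # column space = the xy-plane in R^3
b = np.array([2.0, 3.0, 4.0])

P = A @ np.linalg.inv(A.T @ A) @ A.T
proj = P @ b
print(proj)   # the projection drops the z-component
```

A quick sanity check on the design: the residual `b - proj` should be orthogonal to the subspace, i.e. `A.T @ (b - proj)` should vanish.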
- Day 13: Linear algebra and regression lab
- Massey Ratings (Context)
- Linear algebra, regression, and Massey rating (The lab)
Comments
We’ve got our second lab today, which will cover the basics of numerical linear algebra with Python and the use of linear regression in sports rankings. First we’ll go over a demo on Massey ratings and then we’ll jump into the lab.
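To give the flavor of the Massey approach: each game contributes an equation of the form (winner's rating) minus (loser's rating) equals the point differential, the normal equations collect these into a system \(Mr = p\), and since \(M\) is singular one row is replaced by the constraint that the ratings sum to zero. The three-team schedule below is made up for illustration and is not from the lab.

```python
# Massey ratings for a made-up three-team round robin:
#   team 0 beats team 1 by 3, team 1 beats team 2 by 2,
#   team 0 beats team 2 by 5.
import numpy as np

# Massey matrix: diagonal = games played, off-diagonal = -(games between).
M = np.array([[ 2.0, -1.0, -1.0],
              [-1.0,  2.0, -1.0],
              [-1.0, -1.0,  2.0]])
p = np.array([8.0, -1.0, -7.0])   # net point differentials

# M is singular, so replace the last equation with sum(r) = 0.
M[2] = [1.0, 1.0, 1.0]
p[2] = 0.0

r = np.linalg.solve(M, p)
print(r)   # ratings ordered: team 0 strongest, team 2 weakest
```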
- Day 14: Logistic regression
- Presentation
Comments
Today, we’re going to switch from linear regression to logistic regression, a technique for dealing with categorical outputs.
There’s no MyOpenMath assignment today; this stuff will appear in computer work a little down the road. There is, however, a review sheet for the exam this Friday.
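For a preview of the upcoming computer work: in one variable, the logistic model \(\sigma(mx + b)\) with \(\sigma(t) = 1/(1+e^{-t})\) maps inputs to probabilities of a 0/1 outcome, and the parameters can be fit by gradient descent on the log loss. A minimal sketch in plain Python; the toy data, learning rate, and iteration count are illustrative choices.

```python
# One-variable logistic regression fit by gradient descent on the log loss.
import math

def sigmoid(t):
    return 1 / (1 + math.exp(-t))

# Toy data: outcomes switch from 0 to 1 somewhere around x = 2.5.
xs = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
ys = [0, 0, 0, 1, 1, 1]

m, b = 0.0, 0.0
rate = 0.1
for _ in range(5000):
    # Gradient of the log loss with respect to m and b.
    grad_m = sum((sigmoid(m * x + b) - y) * x for x, y in zip(xs, ys))
    grad_b = sum((sigmoid(m * x + b) - y) for x, y in zip(xs, ys))
    m -= rate * grad_m
    b -= rate * grad_b

# The fitted curve should give low probability at x = 0, high at x = 5.
print(sigmoid(m * 0 + b), sigmoid(m * 5 + b))
```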