Class presentations
You can find the links to our daily class presentations below. Generally, I’ll type these up in sufficient detail for the interested student to follow along. Of course, I didn’t invent any of this, and there are plenty of other references for the material:
- Open Intro Statistics: a free and open source introduction to elementary statistics
- APEX Calculus: a free, open source, and fairly standard Calculus textbook
- A First Course in Linear Algebra: a free and open source text in Linear Algebra
- Mathematics for Machine Learning: a text on the mathematics of machine learning made freely available by Cambridge University Press
- The Elements of Statistical Learning: a foundational text
- An Introduction to Statistical Learning: a toned-down version of the previous text, including code in R and Python
Of course, I learned most of this material many years ago and some of the presentations bear my own preferences. All mistakes are my own.
Daily class presentations
- Mon, Jan 12: Intro and KNN
- Presentation
Comments
Today, we’ll look at a couple of groovy demos and jump right into the K-Nearest Neighbor algorithm. We’ll also get an overview of the course objectives and take a look at the syllabus.
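For the curious, here’s a minimal sketch of the K-Nearest Neighbor idea in Python. This isn’t the code we’ll use in class; the `knn_predict` helper and the toy data are made up for illustration:

```python
import numpy as np

def knn_predict(X_train, y_train, x, k=3):
    """Classify x by majority vote among its k nearest training points."""
    # Euclidean distance from x to every training point
    dists = np.linalg.norm(X_train - x, axis=1)
    # Labels of the k closest points
    nearest = y_train[np.argsort(dists)[:k]]
    # Majority vote
    values, counts = np.unique(nearest, return_counts=True)
    return values[np.argmax(counts)]

# Two small clusters of labeled points (made-up data)
X = np.array([[0.0, 0.0], [0.1, 0.2], [0.2, 0.1],
              [1.0, 1.0], [0.9, 1.1], [1.1, 0.9]])
y = np.array([0, 0, 0, 1, 1, 1])

print(knn_predict(X, y, np.array([0.15, 0.1])))   # → 0
print(knn_predict(X, y, np.array([1.05, 0.95])))  # → 1
```

A point near the first cluster gets label 0 and a point near the second gets label 1, which is all KNN is really doing.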
- Wed, Jan 14: A quick review of Calc I
- Presentation
Comments
While Calc I is a requirement for this course, it’s not a bad idea to refresh ourselves on some of the topics that we learn in that class. We’ll do exactly that today, with a focus on derivatives and optimization.
- Fri, Jan 16: An overview of Calc III
- Presentation
Comments
Today, we’ll take a look at derivatives and optimization of multivariable functions, which is a small part of what’s covered in Calculus III.
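To give a flavor of the multivariable version of optimization, here’s a quick sketch (my own toy example, not from the slides) that finds the critical point of a simple quadratic surface by setting both partial derivatives to zero:

```python
import sympy as sp

x, y = sp.symbols('x y')
f = x**2 + 2*y**2 - 2*x - 8*y + 3  # a made-up quadratic surface

# Critical points occur where the gradient vanishes
grad = [sp.diff(f, v) for v in (x, y)]
crit = sp.solve(grad, (x, y))
print(crit)  # → {x: 1, y: 2}
```

Since the function is a sum of upward-opening parabolas in `x` and `y`, this critical point is the minimum.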
- Wed, Jan 21: Linear systems and matrices
- Presentation
Comments
Last time, we learned that optimization of multivariable functions leads naturally to systems of equations; when your function is quadratic, the system is linear. Today, we’ll recap that and push forward to studying linear systems in general.
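As a concrete sketch of that connection (a made-up example, not from the presentation): minimizing a quadratic of the form f(x) = ½ xᵀAx − bᵀx amounts to solving the linear system Ax = b, since the gradient is Ax − b.

```python
import numpy as np

# Minimizing f(x) = 0.5 x^T A x - b^T x for symmetric positive definite A
# means solving the linear system A x = b, since grad f = A x - b.
A = np.array([[4.0, 1.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0])

x = np.linalg.solve(A, b)
print(x)

# Check: the gradient vanishes at the solution
print(np.allclose(A @ x - b, 0))  # → True
```

The same pattern shows up later in linear regression, where the normal equations are exactly such a system.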
- Fri, Jan 23: Reduced row echelon form
- Presentation
Comments
Last time, we talked about linear systems and how they lead to the concept of a matrix. Today, we’ll focus on matrices themselves and their reduced row echelon form.
- Mon, Feb 2: Linear transformations and inverse matrices
- Presentation
Comments
We’ve moved from systems of equations to their matrix representation. Now, we’re going to focus on matrix multiplication and how it can be used to describe certain types of functions called linear transformations. A key step along the way is the construction of the inverse transformation.
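Here’s a small numerical illustration (my own toy example) of the key property of the inverse: applying a transformation and then its inverse returns every vector to where it started.

```python
import numpy as np

# An invertible linear transformation and its inverse
A = np.array([[2.0, 1.0],
              [1.0, 1.0]])
A_inv = np.linalg.inv(A)

v = np.array([3.0, -1.0])
print(A_inv @ (A @ v))                    # recovers v
print(np.allclose(A_inv @ A, np.eye(2)))  # → True
```

In other words, A⁻¹A is the identity matrix, which represents the transformation that does nothing at all.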
- Wed, Feb 4: Determinants
- Presentation
Comments
Today, we’re going to talk about the determinant: a single number that tells us how a linear transformation induced by a square matrix distorts the space on which it acts. Among other things, this gives us a simple test for singularity.
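As a quick numerical illustration (toy matrices of my own choosing, not from the slides):

```python
import numpy as np

# The determinant measures how the matrix scales area:
# the unit square maps to a parallelogram of area |det A|.
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
print(np.linalg.det(A))  # ≈ 5

# A singular matrix collapses the plane onto a line, so its determinant is 0;
# this is the simple test for singularity.
S = np.array([[1.0, 2.0],
              [2.0, 4.0]])
print(np.linalg.det(S))  # ≈ 0
```

Note that the second matrix is singular because its second column is twice the first.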
- Fri, Feb 6: Subspaces
- Presentation
Comments
Today, we’ll talk about the concept of a subspace and show that the range of a linear transformation is a subspace equal to the span of the columns of the corresponding matrix. Ultimately, we’ll solve the corresponding linear regression problem by projecting onto the range.
- Mon, Feb 9: Orthogonal projection via the dot product
- Presentation
Comments
Today, we’ll introduce the dot product along with a very important application: orthogonal projection.
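Here’s a quick sketch of the one-dimensional case (a made-up example; the `project` helper is mine): the projection of b onto the line spanned by a is (a·b / a·a) a, and the leftover part of b is orthogonal to a.

```python
import numpy as np

def project(b, a):
    """Orthogonal projection of b onto the line spanned by a."""
    return (np.dot(a, b) / np.dot(a, a)) * a

a = np.array([1.0, 2.0])
b = np.array([3.0, 1.0])
p = project(b, a)

print(p)                 # → [1. 2.]
print(np.dot(b - p, a))  # → 0.0 (the residual is orthogonal to a)
```

That orthogonality of the residual is exactly what makes the projection the closest point to b on the line.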
- Wed, Feb 11: Intro to Data
- Colab Notebook
Comments
Today, we’re going to take a look at a Colab Notebook that describes how we think about data and how to work with it using Python.
- Fri, Feb 13: Linear regression theory and practice
- General examples
- Massey ratings
Comments
Today, we’ll summarize general linear regression from the perspective of linear algebra and take a look at a bunch of examples.
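As a tiny sketch of the linear-algebra perspective (a made-up example, not one of the in-class examples): fitting a line amounts to projecting the observation vector y onto the column space of a design matrix X, which the normal equations XᵀXc = Xᵀy accomplish.

```python
import numpy as np

# Fit y ≈ c0 + c1*x by solving the normal equations X^T X c = X^T y
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.0, 3.0, 5.0, 7.0])  # made-up data lying exactly on y = 1 + 2x

# Design matrix: a column of ones (intercept) and the x values (slope)
X = np.column_stack([np.ones_like(x), x])
c = np.linalg.solve(X.T @ X, X.T @ y)
print(c)  # → [1. 2.]
```

Since the data lie exactly on a line here, the recovered coefficients are the exact intercept and slope; with noisy data, the same computation gives the least-squares fit.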
- Mon, Feb 16: Practical issues
- Presentation
Comments
Today, we’ll take a look at some of the practical issues that arise when you work with messy, real-world data. We’ll do so in the context of a Kaggle competition that we’ll try ourselves next week.
- Mon, Feb 23: Integration
- Presentation
Comments
We’re going to shift gears today, step back into calculus, and review integration. Our main application of integration will be to help us understand probability theory.
- Fri, Feb 27: Numerical integration and probability
- The normal density function
- Integration is hard
- Riemann type sums
Comments
Today, we’re going to get an overview of where we are and start the transition into probability theory so that we can do some logistic regression.
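To tie the two threads together, here’s a small sketch (my own, not from the linked notes) of why numerical integration matters for probability: the standard normal density has no elementary antiderivative, but a Riemann-type sum still approximates the probabilities it defines.

```python
import numpy as np

def riemann(f, a, b, n=10_000):
    """Left-endpoint Riemann sum approximating the integral of f over [a, b]."""
    x = np.linspace(a, b, n, endpoint=False)
    dx = (b - a) / n
    return np.sum(f(x)) * dx

# The standard normal density
phi = lambda x: np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)

# P(-1 < Z < 1) for a standard normal Z
print(riemann(phi, -1, 1))  # ≈ 0.6827, the first number of the 68-95-99.7 rule
```

With ten thousand subintervals, the sum matches the familiar 68% to several decimal places, even though no antiderivative formula exists.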