
Class presentations

You can find links to our daily class presentations below. Generally, I’ll type these up in sufficient detail for the interested student to follow along. Of course, I didn’t invent any of this material; here are some other references for it:

  • Open Intro Statistics A free and open source introduction to elementary statistics
  • APEX Calculus A free, open source, and fairly standard Calculus textbook
  • A First Course in Linear Algebra A free and open source text in Linear Algebra
  • Mathematics for Machine Learning A text on the mathematics of machine learning made freely available by Cambridge University Press
  • The Elements of Statistical Learning A foundational text
  • An Introduction to Statistical Learning A toned-down version of the previous text including code in R and Python

Of course, I learned most of this material many years ago and some of the presentations bear my own preferences. All mistakes are my own.

Daily class presentations

  • Mon, Jan 12: Intro and KNN
    • Presentation
    • Comments
      Today, we’ll look at a couple of groovy demos and jump right into the K-nearest neighbor algorithm. We’ll also get an overview of the course objectives and take a look at the syllabus.
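
      In case a concrete taste helps, here’s a minimal K-nearest neighbor classifier sketched with NumPy; the tiny two-class data set is made up purely for illustration.

        import numpy as np

        def knn_classify(X_train, y_train, x, k=3):
            # Distance from the query point x to every training point.
            dists = np.linalg.norm(X_train - x, axis=1)
            # Take the labels of the k closest points and vote.
            nearest = y_train[np.argsort(dists)[:k]]
            return np.bincount(nearest).argmax()

        X = np.array([[1.0, 1.0], [1.5, 2.0], [5.0, 5.0], [6.0, 5.5]])
        y = np.array([0, 0, 1, 1])
        print(knn_classify(X, y, np.array([1.2, 1.5])))  # expect class 0
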
  • Wed, Jan 14: A quick review of Calc I
    • Presentation
    • Comments

      While Calc I is a requirement for this course, it’s not a bad idea to refresh ourselves on some of the topics that we learn in that class. We’ll do exactly that today, with a focus on derivatives and optimization.
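
      As a tiny refresher in code, here’s how SymPy can find and classify the critical points of a function; the function itself is made up for illustration.

        from sympy import symbols, diff, solve

        x = symbols('x')
        f = x**3 - 3*x + 1                 # an example function

        crit = solve(diff(f, x), x)        # critical points, where f'(x) = 0
        print(crit)                        # [-1, 1]
        # Second derivative test: f''(-1) = -6 (local max), f''(1) = 6 (local min).
        print([diff(f, x, 2).subs(x, c) for c in crit])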

  • Fri, Jan 16: An overview of Calc III
    • Presentation
    • Comments

      Today, we’ll take a look at derivatives and optimization of multivariable functions, which is a small part of what’s covered in Calculus III.
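
      As a quick illustration, here’s SymPy finding the critical point of a made-up function of two variables by setting both partial derivatives to zero.

        from sympy import symbols, diff, solve

        x, y = symbols('x y')
        f = x**2 + x*y + y**2 - 3*x        # an example function of two variables

        # A critical point occurs where the gradient vanishes.
        grad = [diff(f, x), diff(f, y)]
        print(solve(grad, [x, y]))         # {x: 2, y: -1}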

  • Wed, Jan 21: Linear systems and matrices
    • Presentation
    • Comments

      Last time, we learned that optimization of multivariable functions leads naturally to systems of equations; when your function is quadratic, the system is linear. Today, we’ll recap that and push forward to studying linear systems in general.
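
      Here’s a small sketch in NumPy, using a made-up two-equation example.

        import numpy as np

        # The system 2x + y = 5, x - 3y = -1 written in matrix form Ax = b.
        A = np.array([[2.0, 1.0], [1.0, -3.0]])
        b = np.array([5.0, -1.0])
        print(np.linalg.solve(A, b))       # [2. 1.]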

  • Fri, Jan 23: Reduced row echelon form
    • Presentation
    • Comments

      Last time, we talked about linear systems and how they lead to the concept of a matrix. Today, we’ll focus on matrices themselves and their reduced row echelon form.
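
      If you’d like to experiment, SymPy will compute the reduced row echelon form of a matrix; the matrix below is made up for illustration.

        from sympy import Matrix

        A = Matrix([
            [1, 2, -1, 3],
            [2, 4,  0, 10],
            [1, 2,  1, 7],
        ])
        rref, pivots = A.rref()
        print(rref)      # the reduced row echelon form of A
        print(pivots)    # the pivot columns: (0, 2)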

  • Mon, Feb 2: Linear transformations and inverse matrices
    • Presentation
    • Comments

      We’ve moved from systems of equations to their matrix representation. Now, we’re going to focus on matrix multiplication and how it can be used to describe certain types of functions called linear transformations. A key step along the way is the construction of the inverse transformation.
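
      As a quick sketch with a made-up matrix: applying the matrix is applying the transformation, and the inverse matrix undoes it.

        import numpy as np

        A = np.array([[2.0, 1.0], [1.0, 1.0]])   # a linear transformation of the plane
        v = np.array([1.0, 3.0])

        w = A @ v                       # apply the transformation: [5. 4.]
        A_inv = np.linalg.inv(A)        # the inverse transformation
        print(A_inv @ w)                # recovers v: [1. 3.]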

  • Wed, Feb 4: Determinants
    • Presentation
    • Comments

      Today we’re going to talk about the determinant: a single number that tells us how a linear transformation induced by a square matrix distorts the space on which it acts. Among other things, this gives us a simple test for singularity.
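
      For a quick numerical check with made-up matrices:

        import numpy as np

        A = np.array([[2.0, 1.0], [1.0, 1.0]])   # det = 1, so A is invertible
        B = np.array([[2.0, 4.0], [1.0, 2.0]])   # det = 0, so B is singular
        print(np.linalg.det(A), np.linalg.det(B))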

  • Fri, Feb 6: Subspaces
    • Presentation
    • Comments

      Today we’ll talk about the concept of a subspace and show that the range of a linear transformation is a subspace that’s equal to the span of the columns of the corresponding matrix. Ultimately, we’ll solve the corresponding linear regression problem by projecting onto the range.
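
      Here’s the key computation in NumPy, on a made-up example: projecting b onto the range of X amounts to solving the normal equations X^T X beta = X^T b.

        import numpy as np

        # Project b onto the span of the columns of X by solving
        # the normal equations  X^T X beta = X^T b.
        X = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0]])
        b = np.array([1.0, 2.0, 2.0])

        beta = np.linalg.solve(X.T @ X, X.T @ b)
        print(beta)       # the least squares coefficients
        print(X @ beta)   # the projection of b onto the range of X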

  • Mon, Feb 9: Orthogonal projection via the dot product
    • Presentation
    • Comments

      Today we’ll introduce the dot product with a very important application: orthogonal projection.
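
      In code, with made-up vectors, the projection of b onto the line spanned by a is (a·b / a·a) a:

        import numpy as np

        a = np.array([3.0, 1.0])
        b = np.array([2.0, 4.0])

        proj = (np.dot(a, b) / np.dot(a, a)) * a
        print(proj)                   # [3. 1.]
        print(np.dot(b - proj, a))    # the residual is orthogonal to a: 0.0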

  • Wed, Feb 11: Intro to Data
    • Colab Notebook
    • Comments

      Today, we’re going to take a look at a Colab Notebook that describes how we think about data and how to work with it using Python.
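
      To give the flavor, here’s the sort of thing we’ll do with pandas; the little data frame is made up for illustration.

        import pandas as pd

        df = pd.DataFrame({
            'height': [65, 70, 62, 68],
            'weight': [150, 180, 120, 165],
            'sex':    ['F', 'M', 'F', 'M'],
        })
        print(df.describe())                        # summary statistics
        print(df.groupby('sex')['height'].mean())   # group-wise means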

  • Fri, Feb 13: Linear regression theory and practice
    • General examples
    • Massey ratings
    • Comments

      Today, we’ll summarize general linear regression from the perspective of linear algebra and take a look at a bunch of examples.
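
      As a minimal sketch of the practice side, here we fit a line to synthetic data with NumPy’s least squares solver.

        import numpy as np

        rng = np.random.default_rng(1)
        x = np.linspace(0, 10, 50)
        y = 2.0 * x + 1.0 + rng.normal(0, 1, size=x.shape)   # a noisy line

        X = np.column_stack([np.ones_like(x), x])   # design matrix [1, x]
        coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
        print(coeffs)   # approximately [1, 2]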

  • Mon, Feb 16: Practical issues
    • Presentation
    • Comments

      Today, we’ll take a look at some of the practical issues that arise when you do actual work with messy, real-world data. We’ll do so in the context of a Kaggle competition that we’ll try ourselves next week.
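
      Here’s a tiny, made-up example of the kind of cleanup we’ll be doing with pandas.

        import numpy as np
        import pandas as pd

        df = pd.DataFrame({
            'age':  [22, np.nan, 35, 29],
            'fare': ['7.25', '71.28', 'unknown', '8.05'],
        })
        df['fare'] = pd.to_numeric(df['fare'], errors='coerce')   # bad strings become NaN
        df['age'] = df['age'].fillna(df['age'].median())          # impute missing ages
        print(df)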

  • Mon, Feb 23: Integration
    • Presentation
    • Comments

      We’re going to shift gears today, step back into calculus and review integration. Our main application of integration will be to help us understand probability theory.
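
      For a quick refresher in code, SymPy computes integrals exactly; the second example previews the sort of integral that shows up in probability.

        from sympy import symbols, integrate, exp, oo

        x = symbols('x')
        print(integrate(x**2, (x, 0, 1)))       # 1/3
        print(integrate(exp(-x), (x, 0, oo)))   # 1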

  • Wed, Feb 25: Lab 1 on Practical Regression
    • Lab 1
    • Comments

      We’ve got our first lab today on Practical Regression. We’ll actually apply these ideas to a real-world problem!

  • Fri, Feb 27: Numerical integration and probability
    • Presentation
    • Comments

      Today, we’re going to take an overview of where we are and start the transition into probability theory so that we can do some logistic regression.
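
      As a preview of where we’re headed, here’s SciPy numerically integrating the standard normal density; about 68% of the probability lies within one standard deviation of the mean.

        import numpy as np
        from scipy.integrate import quad

        f = lambda x: np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)   # standard normal density
        prob, err = quad(f, -1, 1)
        print(prob)   # about 0.6827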

  • Mon, Mar 2: Discrete probability
    • Presentation
    • Comments

      Recently, we reviewed the basics of integration and then we saw an overview of how that might be applied to probability theory. Today, we start probability theory in earnest with a discussion of discrete probability.
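
      As a small example of the kind of computation we’ll do, here’s the binomial probability of exactly 7 heads in 10 fair coin flips.

        from math import comb

        n, k, p = 10, 7, 0.5
        print(comb(n, k) * p**k * (1 - p)**(n - k))   # 120/1024, about 0.117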

  • Wed, Mar 4: Continuous probability
    • Presentation
    • Comments

      Last time, we started probability theory in earnest with a discussion of discrete probability. Today, we’ll discuss continuous probability with an emphasis on the normal distribution.
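
      In code, SciPy’s normal distribution makes these computations easy; the mean of 100 and standard deviation of 15 below are just an example.

        from scipy.stats import norm

        # The probability that a normal variable with mean 100 and
        # standard deviation 15 lands between 85 and 115.
        print(norm.cdf(115, loc=100, scale=15) - norm.cdf(85, loc=100, scale=15))   # about 0.6827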

  • Fri, Mar 6: Maximum likelihood and logistic regression
    • Presentation
    • Comments

      We’ve been working to get to this point where we can understand the basics of logistic regression and that’s what we’ll do today. We’ll apply it when we get back together after Spring break.
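
      For a taste of what’s coming, here’s scikit-learn fitting a logistic regression (by maximum likelihood, with some regularization by default) to a made-up one-dimensional data set.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        X = np.array([[0.5], [1.0], [1.5], [3.0], [3.5], [4.0]])
        y = np.array([0, 0, 0, 1, 1, 1])

        model = LogisticRegression().fit(X, y)
        print(model.predict_proba([[2.0]]))   # [P(class 0), P(class 1)]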

  • Mon, Mar 16: Recap and practical logistics
    • Presentation
    • The Titanic Notebook
    • The NCAA Competition:
      • NCAA Brackets
      • Kaggle Brackets
      • Kaggle’s Competition
    • Comments

      Seeing as how we just returned from Spring break, today we’re going to take a moment to review some of the stuff we’ve learned before we jump into applying it in a practical setting.

  • Wed, Mar 18: Lab 2 - Logistic regression for Kaggle’s NCAA bracket competition
    • Kaggle’s competition
    • Tournaments
    • NCAA brackets
    • Kaggle prediction brackets
    • The lab notebook
  • Fri, Mar 20: Eigenspaces
    • Presentation
    • Comments

      We’ve pretty much finished up linear and logistic regression. Today, we’ll discuss the very basics of eigenvalues and eigenvectors, which is the last stuff on our exam next week. We should also be sure to take a quick look at the Review for Exam III.
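
      As a quick numerical illustration with a made-up matrix:

        import numpy as np

        A = np.array([[2.0, 1.0], [1.0, 2.0]])
        vals, vecs = np.linalg.eig(A)
        print(vals)                                    # the eigenvalues, 3 and 1
        # Check the defining equation A v = lambda v for the first pair.
        print(A @ vecs[:, 0] - vals[0] * vecs[:, 0])   # essentially zero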

  • Mon, Mar 23: Eigenrating
    • Notes
    • Comments

      Now that we’ve learned about eigenvalues and eigenvectors, we’re going to see our first data-based application, namely eigenrating.

      While this material is again expressed in terms of the ranking of sports teams, it’s worth mentioning that the same technique also has applications to networks. When we apply this basic idea to the world wide web, in fact, we get Google’s PageRank algorithm!
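
      Here’s a sketch of the idea on a tiny, made-up web of three pages: power iteration drives an initial guess toward the dominant eigenvector, which serves as the rating.

        import numpy as np

        # Column j gives the probabilities of moving from page j to each page.
        M = np.array([
            [0.0, 0.5, 1.0],
            [0.5, 0.0, 0.0],
            [0.5, 0.5, 0.0],
        ])
        r = np.ones(3) / 3          # start with equal ratings
        for _ in range(100):
            r = M @ r               # repeatedly apply the matrix
        print(r)                    # the dominant eigenvector, i.e. the ratings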

  • Mon, Mar 30: Diagonalization
    • Presentation
    • Notes on power iteration
    • Comments

      Today, we’re going to discuss a major application of eigenvalues and eigenvectors, namely diagonalization of a matrix. This allows us to relate many matrices to a canonical form whose action is easier to understand.

      Among other things, this yields an algorithm to compute the dominant eigenvector of a matrix.
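
      Here’s a small check of the idea in NumPy, using a made-up symmetric matrix: once we have A = P D P^(-1), powers of A are easy.

        import numpy as np

        A = np.array([[2.0, 1.0], [1.0, 2.0]])
        vals, P = np.linalg.eig(A)           # the columns of P are eigenvectors
        D = np.diag(vals)

        print(P @ D @ np.linalg.inv(P))                # reproduces A
        # Powers are now easy: A^5 = P D^5 P^(-1).
        print(P @ np.diag(vals**5) @ np.linalg.inv(P))
        print(np.linalg.matrix_power(A, 5))            # agrees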

  • Wed, Apr 1: Principal Component Analysis
    • Presentation
    • Comments

      Today, we’ll discuss principal component analysis (or PCA), which is a dimensionality reduction technique that’s a bit more advanced than simple variable selection.
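
      Here’s a bare-bones sketch of PCA via the singular value decomposition, applied to synthetic data.

        import numpy as np

        rng = np.random.default_rng(0)
        X = rng.normal(size=(100, 2)) @ np.array([[3.0, 1.0], [1.0, 1.0]])

        Xc = X - X.mean(axis=0)             # center each column
        U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
        print(Vt)                           # rows are the principal directions
        print(S**2 / (len(X) - 1))          # variance along each component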

  • Fri, Apr 3: Networks
    • Presentation
    • Comments

      Networks model pairwise relationships between discrete elements within a system. We’ve used them a little bit before but, today, we’ll solidify our understanding of these important models so that we can use them in our study of neural networks.
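
      For a taste of computing with networks, here’s a tiny, made-up graph in NetworkX.

        import networkx as nx

        G = nx.Graph()
        G.add_edges_from([(1, 2), (1, 3), (2, 3), (3, 4)])

        print(G.degree(3))                         # the degree of node 3
        print(nx.adjacency_matrix(G).todense())    # its adjacency matrix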

  • Mon, Apr 6: Expression Graphs
    • Presentation
    • Comments

      Last time, we learned a bit about networks and graph theory. We ended with abstract syntax trees, which provide a graph-theoretic way to view algebraic expressions. Since trees can be manipulated as data structures, this yields a framework for us to perform algebra on the computer.

      Today, we’re going to take a look at an alternative way to represent algebraic expressions as graphs: the so-called expression graph, also called a computation graph. This alternative is better suited for efficient numerical computation of values and derivatives.
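
      Here’s a bare-bones sketch of the idea: a tiny expression graph for f(x, y) = x*y + y that evaluates forward and then differentiates by sweeping backward through the graph. It’s a toy, but it’s the same mechanism that powers automatic differentiation.

        class Node:
            def __init__(self, value, parents=()):
                # parents is a list of (node, local derivative) pairs.
                self.value, self.parents, self.grad = value, parents, 0.0

        def add(a, b):
            return Node(a.value + b.value, [(a, 1.0), (b, 1.0)])

        def mul(a, b):
            return Node(a.value * b.value, [(a, b.value), (b, a.value)])

        def backward(node, seed=1.0):
            # Accumulate d(output)/d(node) via the chain rule.
            node.grad += seed
            for parent, local in node.parents:
                backward(parent, seed * local)

        x, y = Node(3.0), Node(4.0)
        f = add(mul(x, y), y)            # f = x*y + y = 16
        backward(f)
        print(f.value, x.grad, y.grad)   # 16.0, df/dx = 4.0, df/dy = 4.0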

  • Wed, Apr 8: Neural Networks
    • Presentation
    • Colab notebook for classification plots
    • Comments

      Now that we’ve talked about both general networks and expression graphs, we’re in a good spot to discuss neural networks. We’ll focus today on the foundational feed-forward neural network: how we represent it, how we compute with it, how we code it, how it’s optimized, and what we can do with it. We’ll discuss variations on that foundational architecture a bit later.
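
      To make this concrete, here’s a minimal sketch of the forward pass of a one-hidden-layer feed-forward network with randomly chosen weights; training those weights is the optimization story we’ll discuss.

        import numpy as np

        rng = np.random.default_rng(0)
        W1, b1 = rng.normal(size=(4, 2)), np.zeros(4)   # hidden layer: 2 inputs -> 4 units
        W2, b2 = rng.normal(size=(1, 4)), np.zeros(1)   # output layer: 4 -> 1

        def sigmoid(z):
            return 1 / (1 + np.exp(-z))

        def forward(x):
            h = np.maximum(0, W1 @ x + b1)   # hidden layer with ReLU activation
            return sigmoid(W2 @ h + b2)      # output in (0, 1), suitable for classification

        print(forward(np.array([0.5, -1.0])))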

 

Written, owned, hosted, and copyright 2025, Mark McClure