Class Notes

Daily class notes with pointers to sources

Sources

The overall structure of these class notes largely follows the text Math for Machine Learning. The specific content and notation, though, follows a couple of open source textbooks fairly closely and takes some material directly from those texts per their open source licenses.

Here are a few more details and acknowledgements for those texts:

Linear Algebra from FCLA

The material on linear algebra is modeled after A First Course in Linear Algebra (FCLA) by Rob Beezer. The notation and definitions largely follow those of FCLA, and some of the examples are taken directly from FCLA, per its GNU Free Documentation License (GFDL).

If you find yourself wondering about some of the linear algebra terminology, you might consult FCLA’s list of notation and its list of other definitions.

Calculus from APEX

I plan to base some of the calculus material on APEX Calculus per its Creative Commons license.

Presentations and labs

  • Mon, Jan 13: Introduction
    • Presentation
    • Comments
      Today, we’ll look at a few groovy demos, get an overview of the course objectives, and look at the syllabus.
  • Wed, Jan 15: Single variable calculus
    • Presentation
    • Comments

      Today, we’ll look at a summary of the single variable calculus that you learn in Calculus I, with an emphasis on the things we’ll need for this class. In addition, we’ll discuss some techniques of numerical analysis that are part of single variable calculus but not emphasized in most Calculus I courses.

      Note that the MyOpenMath HW over this material was assigned on the first day of class.

  • Fri, Jan 17: Intro lab on Numerical Calculus
    • The lab V1 and The lab V2
    • Comments

      Today, we’ll mostly just play on the computer a bit together. After I give a little introductory spiel, you’ll read through the notebook and try both exercises.

      You will learn some valuable tools here, like how to perform numerical integration and optimization on the computer with Python. Like most labs, this first one is meant to be low stress. If you come to class, work on the lab, and show me some progress, you’ll earn 20 points!


      Note: There’s a second version of the lab which is scaled back a bit from the first. In particular,

      • The second version has a more detailed description of basic Python syntax,
      • removes all references to integration and the normal distribution,
      • replaces the integration material with a discussion of the bisection method for root finding (sketched below), and
      • uses fewer libraries; in particular, it drops the references to symbolic and graphics libraries.
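
      If you’d like a head start, here’s a minimal sketch of the kinds of computations the two versions touch on, using SciPy. The functions here are just illustrative choices, not necessarily the ones in the lab.

      ```python
      import numpy as np
      from scipy.integrate import quad
      from scipy.optimize import minimize_scalar

      # An illustrative function; the lab has its own examples.
      def f(x):
          return x**2 * np.exp(-x)

      # Numerical integration (the V1 topic): integrate f over [0, 2].
      area, err = quad(f, 0, 2)
      print(area)

      # Numerical optimization (both versions): minimize f near the origin.
      result = minimize_scalar(f, bracket=(-1, 0, 1))
      print(result.x)

      # Bisection root finding (the V2 topic): repeatedly halve an
      # interval on which g changes sign.
      def bisect(g, a, b, tol=1e-10):
          while b - a > tol:
              m = (a + b) / 2
              if g(a) * g(m) <= 0:
                  b = m
              else:
                  a = m
          return (a + b) / 2

      print(bisect(lambda x: x**2 - 2, 0, 2))  # approximates sqrt(2)
      ```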
  • Wed, Jan 22: Multivariable calculus
    • Presentation (With rotatable 3D images)
    • Static presentation (Non-rotatable 3D images)
    • Comments

      Today we’ll talk about the basics of multivariable calculus and doing mathematics in higher dimensions. That includes a careful discussion of three-dimensional space and the optimization of multivariable functions using partial derivatives. We’ll also use this material to introduce one of our major machine learning techniques, namely linear regression.

      Note that there is a 10-problem MyOpenMath HW over this material worth 13 points.
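
      To preview how the optimization story meets regression, here’s a little sketch with made-up data: setting the partial derivatives of the sum of squared errors to zero yields two linear equations for the slope and intercept.

      ```python
      import numpy as np

      # Made-up data for illustration.
      x = np.array([0., 1., 2., 3., 4.])
      y = np.array([1.1, 1.9, 3.2, 3.9, 5.1])

      # Minimizing E(m, b) = sum((m*x_i + b - y_i)^2) by setting the
      # partial derivatives dE/dm and dE/db to zero gives two linear
      # equations in the slope m and intercept b:
      A = np.array([[np.sum(x**2), np.sum(x)],
                    [np.sum(x),    len(x)  ]])
      rhs = np.array([np.sum(x * y), np.sum(y)])
      m, b = np.linalg.solve(A, rhs)
      print(m, b)  # agrees with np.polyfit(x, y, 1)
      ```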

  • Fri, Jan 24: Systems and Matrices
    • Presentation
    • Comments

      Today we jump into linear algebra as a general tool to solve the kinds of equations that arise in linear regression.

      There is a 4-problem MyOpenMath HW over this material worth 12 points.
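
      On the computational side, NumPy solves a square linear system \(Ax = b\) in one call. A made-up example:

      ```python
      import numpy as np

      # Solve A x = b for a small made-up system.
      A = np.array([[2., 1., -1.],
                    [1., 3.,  2.],
                    [3., 1.,  4.]])
      b = np.array([1., 5., 11.])

      x = np.linalg.solve(A, b)
      print(x)
      print(np.allclose(A @ x, b))  # sanity check
      ```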

  • Mon, Jan 27: Vector Spaces - \(\mathbb R^n\) and More!
    • Presentation
    • Comments

      Today, we’ll dive deeply into the vector space \(\mathbb R^n\) and discuss the abstract concept of a vector space in general.

      In lieu of HW, you should focus on the review sheet!

  • Mon, Feb 3: Linear transformations
    • Presentation
    • Comments

      After getting the exam back, we’ll jump into linear transformations on general vector spaces and on Euclidean space and see how far we get.

      There is a 10-point MyOpenMath assignment with 6 problems over this material.
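
      As a quick numerical sanity check (not part of the assignment), multiplication by a matrix really does respect sums and scalar multiples:

      ```python
      import numpy as np

      rng = np.random.default_rng(0)
      A = rng.standard_normal((3, 3))  # defines the transformation T(x) = A x
      x = rng.standard_normal(3)
      y = rng.standard_normal(3)
      a, b = 2.0, -3.0

      # Linearity: T(a x + b y) = a T(x) + b T(y)
      print(np.allclose(A @ (a * x + b * y), a * (A @ x) + b * (A @ y)))  # True
      ```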

  • Wed, Feb 5: Matrix inverses
    • Presentation
    • Comments

      Today, we’re going to discuss matrix inverses and related things like determinants. This will give us an opportunity to review linear independence and non-singular matrices as well.

      There is a 9-point MyOpenMath assignment with three questions over this material.
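
      Here’s a small NumPy sketch of these ideas, with made-up matrices:

      ```python
      import numpy as np

      A = np.array([[1., 2.],
                    [3., 5.]])

      print(np.linalg.det(A))                   # -1.0: nonzero, so A is non-singular
      A_inv = np.linalg.inv(A)
      print(np.allclose(A @ A_inv, np.eye(2)))  # True

      # A singular matrix has determinant 0 and no inverse.
      S = np.array([[1., 2.],
                    [2., 4.]])                  # second row is twice the first
      print(np.linalg.det(S))                   # 0.0 (up to round-off)
      ```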

  • Fri, Feb 7: Geometry of Determinants
    • Presentation
    • Comments

      Today, we’re going to try to understand why determinants behave the way they do and we’re going to use some geometry to do it!

      There is a 9-point MyOpenMath assignment with four questions over this material.
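
      For a quick preview of the geometric idea: the absolute value of a \(2\times 2\) determinant is the area of the parallelogram spanned by the matrix’s columns, which is exactly the factor by which the matrix scales areas. A made-up example:

      ```python
      import numpy as np

      A = np.array([[3., 1.],
                    [1., 2.]])

      # A maps the unit square to the parallelogram spanned by its
      # columns; the area of that parallelogram is |det(A)|.
      print(abs(np.linalg.det(A)))        # 5.0

      # Cross-product formula for the same area, as a check.
      u, v = A[:, 0], A[:, 1]
      print(abs(u[0]*v[1] - u[1]*v[0]))   # 5.0
      ```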

  • Mon, Feb 10: Norms and inner products
    • Presentation
    • Comments

      Today, we’re going to introduce the dot product and its generalization, the inner product. We’ll use these to think about the geometry of a vector space.

      There is an 8-point MyOpenMath assignment with three questions over this material.
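
      A quick NumPy illustration with made-up vectors (which happen to be orthogonal):

      ```python
      import numpy as np

      u = np.array([1., 2., 2.])
      v = np.array([2., 1., -2.])

      dot = np.dot(u, v)                             # inner (dot) product
      norm_u = np.linalg.norm(u)                     # length of u
      norm_v = np.linalg.norm(v)

      # Angle between u and v from cos(theta) = (u . v) / (|u| |v|)
      theta = np.arccos(dot / (norm_u * norm_v))
      print(dot, norm_u, norm_v, np.degrees(theta))  # 0.0 3.0 3.0 90.0
      ```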

  • Wed, Feb 12: Orthogonal Projection
    • Presentation
    • Comments

      Today, we’re going to see how to project points in a vector space orthogonally (or perpendicularly) onto a subspace. We’ll also get very applied when we see how this provides a fresh look at the least squares problem and regression.

      There’s no MyOpenMath assignment today. There is a problem sheet, though, that mostly focuses on problems that might make it onto the exam next week.
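
      Here’s a bare-bones sketch of the computation with made-up data: solving the normal equations \(A^T A\, x = A^T b\) and checking that the residual is orthogonal to the columns of \(A\).

      ```python
      import numpy as np

      # Project b orthogonally onto the column space of A by solving the
      # normal equations A^T A x = A^T b; A x is then the projection.
      A = np.array([[1., 0.],
                    [1., 1.],
                    [1., 2.]])
      b = np.array([1., 2., 4.])

      x = np.linalg.solve(A.T @ A, A.T @ b)
      proj = A @ x
      print(x)      # least-squares coefficients (intercept, slope)
      print(proj)   # the point of col(A) closest to b

      # The residual is orthogonal to the columns of A.
      print(np.allclose(A.T @ (b - proj), 0))  # True
      ```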

  • Fri, Feb 14: Linear algebra and regression lab
    • Massey Ratings (Context)
    • Linear algebra, regression, and Massey rating (The lab)
    • Comments

      We’ve got our second lab today, which will cover the basics of numerical linear algebra with Python and the use of linear regression in sports rankings. First, we’ll go over a demo on Massey ratings and then we’ll jump into the lab.
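
      If you want a feel for the computation before class, here’s a bare-bones sketch of Massey’s idea with a made-up mini-season; the lab’s notebook does this more carefully with real data.

      ```python
      import numpy as np

      # Made-up mini-season: (winner, loser, point_margin) among 4 teams.
      games = [(0, 1, 3), (1, 2, 7), (0, 2, 10), (3, 0, 2), (3, 2, 5)]
      n = 4

      # Massey's method: one equation per game, r_winner - r_loser ≈ margin.
      X = np.zeros((len(games), n))
      y = np.zeros(len(games))
      for k, (w, l, margin) in enumerate(games):
          X[k, w], X[k, l] = 1, -1
          y[k] = margin

      # The normal equations X^T X r = X^T y are singular (ratings are only
      # determined up to a constant), so impose sum(r) = 0 via the last row.
      M, p = X.T @ X, X.T @ y
      M[-1, :] = 1
      p[-1] = 0

      ratings = np.linalg.solve(M, p)
      print(ratings)
      ```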

  • Mon, Feb 17: Logistic regression
    • Presentation
    • Comments

      Today, we’re going to switch from linear regression to logistic regression - a technique for dealing with categorical outputs.

      There’s no MyOpenMath assignment today; this stuff will appear in computer work a little down the road. There is, however, a review sheet for the exam this Friday.
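
      For the curious, here’s a minimal sketch of the fitting idea with made-up pass/fail data (this is not the notebook we’ll use): gradient ascent on the log-likelihood of the logistic model.

      ```python
      import numpy as np

      def sigmoid(z):
          # squashes a linear score into a probability in (0, 1)
          return 1 / (1 + np.exp(-z))

      # Made-up data: x = hours studied, y = passed (1) or not (0).
      x = np.array([0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 3.5, 4.0])
      y = np.array([0,   0,   0,   1,   0,   1,   1,   1  ])

      # Gradient ascent on the log-likelihood of the logistic model.
      b0, b1, lr = 0.0, 0.0, 0.1
      for _ in range(5000):
          p = sigmoid(b0 + b1 * x)
          b0 += lr * np.sum(y - p)         # partial derivative in b0
          b1 += lr * np.sum((y - p) * x)   # partial derivative in b1

      print(b0, b1)
      print(sigmoid(b0 + b1 * 2.25))  # predicted probability near the middle
      ```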

  • Mon, Feb 24: Practical regression
  • Wed, Feb 26: Issues in practical regression
    • Presentation
    • Comments

      Last time, we looked at a Colab notebook illustrating how to actually do linear regression in a real-world context. We saw that there’s a lot to contend with that we haven’t even dealt with yet.

      Today, we’ll discuss some of the mathematical issues raised by that notebook.

      There’s no MyOpenMath assignment today; this stuff will appear in our lab this Friday, though!

  • Fri, Feb 28: Practical regression lab
  • Mon, Mar 3: Eigenspaces
    • Presentation
    • Comments

      Today, we’re going to meet eigenvalues and eigenvectors!

      There is a MyOpenMath HW over this material.
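
      In NumPy terms, with a made-up matrix:

      ```python
      import numpy as np

      A = np.array([[2., 1.],
                    [1., 2.]])

      # eig returns the eigenvalues and a matrix whose columns are
      # corresponding eigenvectors.
      vals, vecs = np.linalg.eig(A)
      print(vals)  # 3 and 1 (in some order)
      for lam, v in zip(vals, vecs.T):
          print(np.allclose(A @ v, lam * v))  # A v = lambda v, so True
      ```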

  • Wed, Mar 5: Eigenrating and prediction
    • Demo
    • Comments

      Today, we’re going to use eigenvalues and the dominant eigenvector to make some predictions in sports!

      There’s no HW but there will be a lab on Friday.
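
      Here’s a sketch of one standard way to compute a dominant eigenvector, power iteration, using a made-up nonnegative matrix; whatever matrix the demo builds, the iteration is the same.

      ```python
      import numpy as np

      # A made-up nonnegative "score" matrix; entry (i, j) might record
      # how team i fared against team j.
      A = np.array([[0., 2., 1.],
                    [1., 0., 3.],
                    [2., 1., 0.]])

      # Power iteration: repeatedly applying A (and renormalizing)
      # converges to the dominant eigenvector when the dominant
      # eigenvalue is unique, as Perron-Frobenius guarantees here.
      v = np.ones(3)
      for _ in range(100):
          v = A @ v
          v /= np.linalg.norm(v)

      print(v)            # approximate dominant eigenvector (the ratings)
      print((A @ v) / v)  # each entry approximates the dominant eigenvalue
      ```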

  • Fri, Mar 7: Eigenrating Lab
    • Updated demo
    • Lab
    • Comments

      Today, we’re going to finish up our look at the (updated) Eigenrating page and push that further into a lab that you’ll actually turn in!

  • Wed, Mar 19: Diagonalization
    • Presentation
    • Comments

      Just before Spring break, we learned about eigenvalues and eigenvectors of \(n\times n\) matrices and how they help us identify various subspaces of \(\mathbb R^n\) that are invariant under the action of the matrix. Today, we’re going to discuss how that allows us to express matrices in various forms, depending upon the basis that we use to describe \(\mathbb R^n\).

      There is a MyOpenMath HW over this material.
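
      A small numerical sketch with a made-up matrix:

      ```python
      import numpy as np

      A = np.array([[4., 1.],
                    [2., 3.]])

      # Diagonalization: A = P D P^(-1), with eigenvectors in the
      # columns of P and eigenvalues on the diagonal of D.
      vals, P = np.linalg.eig(A)
      D = np.diag(vals)
      print(np.allclose(A, P @ D @ np.linalg.inv(P)))  # True

      # One payoff: powers are cheap, since A^k = P D^k P^(-1).
      print(np.allclose(np.linalg.matrix_power(A, 5),
                        P @ np.diag(vals**5) @ np.linalg.inv(P)))  # True
      ```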

  • Mon, Mar 24: Principal Component Analysis
    • Presentation
    • Comments

      Today, we’re going to talk about Principal Component Analysis - a more powerful way to reduce the dimension of a problem than simple variable reduction.
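
      Here’s a minimal sketch of the computation with synthetic data: the principal components are the eigenvectors of the covariance matrix of the centered data, ordered by decreasing eigenvalue.

      ```python
      import numpy as np

      rng = np.random.default_rng(1)

      # Synthetic 2-D data with one dominant direction.
      t = rng.standard_normal(200)
      data = np.column_stack([t, 0.5 * t + 0.1 * rng.standard_normal(200)])

      # PCA: eigenvectors of the covariance matrix of the centered data,
      # ordered by decreasing eigenvalue, are the principal components.
      centered = data - data.mean(axis=0)
      vals, vecs = np.linalg.eigh(np.cov(centered, rowvar=False))
      order = np.argsort(vals)[::-1]
      vals, vecs = vals[order], vecs[:, order]

      print(vals)        # variance along each principal component
      print(vecs[:, 0])  # first component, roughly parallel to (1, 0.5)

      # Dimension reduction: project onto the first component only.
      reduced = centered @ vecs[:, [0]]
      print(reduced.shape)  # (200, 1)
      ```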