Final Review
Our class final exams are set according to UNCA’s Final Exam Schedule. According to that schedule, our final exam will be this Friday, December 5, at 8:00 AM.
This problem sheet consists primarily of problems taken directly from past exams, together with a few new ones. You should treat this review sheet like the past review sheets. The final exam will have significant similarities to this review sheet, so if you can do well on this review sheet, you should be able to do well on the final.
Please study for the final! Don’t just look at the review sheet and figure that you already know how to do the problems. In my experience, final exams can have a significant impact on final grades.
Exam 1
- Write down the precise, mathematical definition of each of the following.
- Linear combination of a set of vectors
- Span of a set of vectors
- Write down a componentwise proof of the fact that vector addition is commutative. That is, if \(\mathbf{u}, \mathbf{v}\in\mathbb{R}^n\), then \[\mathbf{u} + \mathbf{v} = \mathbf{v} + \mathbf{u}.\]
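As a reminder, a componentwise proof works one coordinate at a time and reduces the claim to commutativity of addition in \(\mathbb R\); the key step looks like
\[
\mathbf{u} + \mathbf{v}
= \begin{bmatrix} u_1 + v_1 \\ \vdots \\ u_n + v_n \end{bmatrix}
= \begin{bmatrix} v_1 + u_1 \\ \vdots \\ v_n + u_n \end{bmatrix}
= \mathbf{v} + \mathbf{u},
\]
where the middle equality uses commutativity of addition of real numbers in each component.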
Consider the matrix \[ \left(\begin{array}{rrrrr} -2 & -2 & 2 & 0 & -2 \\ 1 & 2 & 0 & -1 & -2 \\ 1 & 0 & -3 & -3 & 2 \\ 1 & 2 & 0 & -1 & 1 \end{array}\right) \] whose reduced row echelon form is \[ \left(\begin{array}{rrrrr} 1 & 0 & 0 & 9 & 0 \\ 0 & 1 & 0 & -5 & 0 \\ 0 & 0 & 1 & 4 & 0 \\ 0 & 0 & 0 & 0 & 1 \end{array}\right). \] Suppose the column vectors of the original matrix are \(\mathbf{v}_1\), \(\mathbf{v}_2\), \(\mathbf{v}_3\), \(\mathbf{v}_4\), and \(\mathbf{v}_5\), numbered left to right.
- Find a non-trivial linear combination of the columns that sums to the zero vector \(\mathbf{0}\).
“Non-trivial” simply means that we’re not interested in the situation where all coefficients are zero.
- Find a linearly independent subset of those column vectors whose span is the same as the span of the full set of column vectors.
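If you want to check your work on this problem, SymPy's `rref` confirms both the reduced form and the pivot columns; the dependence relation is then read off the free column. This is a sketch for checking only (assuming the columns are numbered \(\mathbf{v}_1\) through \(\mathbf{v}_5\) left to right):

```python
from sympy import Matrix

A = Matrix([
    [-2, -2,  2,  0, -2],
    [ 1,  2,  0, -1, -2],
    [ 1,  0, -3, -3,  2],
    [ 1,  2,  0, -1,  1],
])

# rref() returns the reduced row echelon form and the pivot column indices.
R, pivots = A.rref()
print(pivots)  # (0, 1, 2, 4): column 4 (i.e. v4) is the only free column

# Column 4 of R has entries (9, -5, 4, 0), so v4 = 9 v1 - 5 v2 + 4 v3,
# giving the non-trivial combination 9 v1 - 5 v2 + 4 v3 - v4 = 0.
combo = 9*A.col(0) - 5*A.col(1) + 4*A.col(2) - A.col(3)
assert combo == Matrix([0, 0, 0, 0])
```

The pivot columns \(\{\mathbf{v}_1, \mathbf{v}_2, \mathbf{v}_3, \mathbf{v}_5\}\) give a linearly independent subset with the same span.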
Let \[ \mathbf{x} = \left(\begin{array}{r} 0 \\ -1 \\ -1 \end{array}\right), \: \mathbf{y} = \left(\begin{array}{r} 1 \\ 3 \\ -1 \end{array}\right), \: \text{and } \mathbf{z} = \left(\begin{array}{r} 0 \\ -2 \\ 1 \end{array}\right). \] Also let \[ \mathbf{u} = \left(\begin{array}{r} 1 \\ 3 \\ -4 \end{array}\right). \]
- Write down the matrix \(A\) whose columns are vectors \(\mathbf{x}\), \(\mathbf{y}\), \(\mathbf{z}\), and \(\mathbf{u}\).
- Find the reduced row echelon form of \(A\).
- Use your reduced row echelon form to express \(\mathbf{u}\) as a linear combination of \(\mathbf{x}\), \(\mathbf{y}\), and \(\mathbf{z}\) or explain why there is no such linear combination.
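To check your row reduction here, you can run the augmented matrix \([\,\mathbf{x}\;\mathbf{y}\;\mathbf{z}\mid\mathbf{u}\,]\) through SymPy; the last column of the reduced form gives the coefficients, if they exist:

```python
from sympy import Matrix

# Augmented matrix [x | y | z | u]
M = Matrix([
    [ 0,  1,  0,  1],
    [-1,  3, -2,  3],
    [-1, -1,  1, -4],
])
R, _ = M.rref()
print(R)
# The last column reads (2, 1, -1), so u = 2x + y - z.
```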
Exam 2
Write down careful and complete definitions of the following:
- Eigenvalue/Eigenvector pair for a matrix \(A\)
- Similarity of matrices
- Diagonalizable matrix
- Write down a \(3\times3\) matrix that
- stretches by the factor \(2\) in the direction of the vector \(\begin{bmatrix}2&1&1\end{bmatrix}^{\mathsf T}\),
- reflects in the direction \(\begin{bmatrix}2&-1&1\end{bmatrix}^{\mathsf T}\), and
- compresses by the factor \(1/2\) in the direction \(\begin{bmatrix}0&-1&1\end{bmatrix}^{\mathsf T}\).
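One way to check a candidate answer: if \(S\) has the three given directions as columns and \(D\) is the diagonal matrix of factors \(2\), \(-1\), \(1/2\), then \(M = SDS^{-1}\) does exactly what is asked. A SymPy sketch for verifying this (checking only, not exam work):

```python
from sympy import Matrix, Rational, diag

S = Matrix([[2,  2,  0],
            [1, -1, -1],
            [1,  1,  1]])          # columns: the three given directions
D = diag(2, -1, Rational(1, 2))    # stretch, reflect, compress factors
M = S * D * S.inv()

# Each given direction should be an eigenvector with its intended factor:
assert M * S.col(0) == 2 * S.col(0)
assert M * S.col(1) == -1 * S.col(1)
assert M * S.col(2) == Rational(1, 2) * S.col(2)
```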
- Let \(A\) denote the matrix \[
\begin{bmatrix}
3&0&0 \\ 1&2&0 \\ 1&1&1
\end{bmatrix}
\]
- Find the eigenvalues of \(A\).
- Find an eigenvector corresponding to the largest eigenvalue of \(A\).
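A useful check: since \(A\) is lower triangular, its eigenvalues are its diagonal entries, and a candidate eigenvector can be verified by multiplication. A quick SymPy sketch:

```python
from sympy import Matrix

A = Matrix([[3, 0, 0],
            [1, 2, 0],
            [1, 1, 1]])

# A is lower triangular, so its eigenvalues are the diagonal entries.
assert sorted(A.eigenvals().keys()) == [1, 2, 3]

# Solving (A - 3I)v = 0 gives v = (1, 1, 1); verify directly:
v = Matrix([1, 1, 1])
assert A * v == 3 * v
```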
- Suppose that \(A\) is similar to \(B\) with change of basis matrix \(S\) so that \[A = SBS^{-1}.\] Show that \(B\) is similar to \(A\). Be sure to clearly identify what the change of basis matrix is and why it works to establish the fact that \(B\) is similar to \(A\).
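A hint for the structure of this argument: multiply \(A = SBS^{-1}\) on the left by \(S^{-1}\) and on the right by \(S\), then match the result against the definition of similarity:
\[
S^{-1} A S = S^{-1}\left(S B S^{-1}\right) S = B,
\quad\text{so}\quad
B = \left(S^{-1}\right) A \left(S^{-1}\right)^{-1}.
\]
That is, \(B\) is similar to \(A\) with change of basis matrix \(S^{-1}\).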
Quiz 3
Write down the definition of each of the following:
- Orthogonal vectors
- Orthogonal matrix
Let \(\mathbf u\) and \(\mathbf v\) denote the 3D vectors \[ \mathbf{u} = \begin{bmatrix} 1 \\ 1 \\ 0 \end{bmatrix} \: \text{ and } \: \mathbf{v} = \begin{bmatrix} 2 \\ -2 \\ 1 \end{bmatrix}. \]
- Verify that \(\mathbf u\) and \(\mathbf v\) are orthogonal.
- Find the matrix \(P\) that projects orthogonally onto the subspace spanned by \(\mathbf u\) and \(\mathbf v\).
You may write your answer to the previous part as a product of two matrices without expanding out that product.
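If you'd like to check a candidate \(P\), NumPy makes the verification quick. This sketch uses the general formula \(P = A(A^{\mathsf T}A)^{-1}A^{\mathsf T}\) with \(A = [\,\mathbf u \; \mathbf v\,]\), which applies whether or not you expand out the product:

```python
import numpy as np

u = np.array([1.0, 1.0, 0.0])
v = np.array([2.0, -2.0, 1.0])
assert np.dot(u, v) == 0  # the orthogonality check from the first part

A = np.column_stack([u, v])
P = A @ np.linalg.inv(A.T @ A) @ A.T  # projection onto span{u, v}

# Sanity checks: P fixes u and v, and P is symmetric and idempotent.
assert np.allclose(P @ u, u)
assert np.allclose(P @ v, v)
assert np.allclose(P, P.T)
assert np.allclose(P @ P, P)
```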
Use the normal equations to find the least squares regression line for the points \[ \{(-1, 1), (0, 3), (2, 0)\}. \]
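To check your normal-equations work on this one, the following sketch builds the design matrix for the model \(y = mx + b\) and solves \(A^{\mathsf T}A\begin{bmatrix}m\\b\end{bmatrix} = A^{\mathsf T}\mathbf y\) numerically:

```python
import numpy as np

# Data points (x_i, y_i)
xs = np.array([-1.0, 0.0, 2.0])
ys = np.array([1.0, 3.0, 0.0])

# Design matrix for the line y = m*x + b
A = np.column_stack([xs, np.ones_like(xs)])

# Normal equations: (A^T A) [m, b]^T = A^T y
m, b = np.linalg.solve(A.T @ A, A.T @ ys)
print(m, b)  # -0.5 1.5, i.e. y = -x/2 + 3/2
```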
Inner products and orthogonal polynomials
Determine which of the following sets are vector spaces. For those that are not, specify exactly why not. To set notation, we write
- \(U\) to denote the set of all continuous functions mapping \([0,1]\to\mathbb R\) and
- \(\mathbb R^n\) to denote \[ \left\{ \begin{bmatrix}x_1 \\ x_2 \\ \vdots \\ x_n \end{bmatrix}: x_1,x_2,\ldots,x_n \in \mathbb R \right\}. \]
- \(\{\mathbf{x}\in\mathbb R^5: x_2 + x_4 = 0\}\)
- \(\{\mathbf{x}\in\mathbb R^5: \|\mathbf{x}\| \leq 10\}\)
- \(\{f\in U: f(0) = 1\}\)
- \(\{f\in U: \int_0^1 f(x) \, dx = 0\}\)
Write down the definition of an inner product on a vector space \(V\).
Find a real value of \(b\) such that \(f(x) = x^2\) and \(g(x) = x^2 + b\), both restricted to the unit interval \([0,1]\), are orthogonal.
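Assuming the inner product here is the integral inner product \(\langle f, g\rangle = \int_0^1 f(x)\,g(x)\,dx\), the value of \(b\) can be checked symbolically with SymPy:

```python
from sympy import symbols, integrate, solve, Rational

x, b = symbols('x b')
# <f, g> = ∫₀¹ x²(x² + b) dx = 1/5 + b/3; orthogonality means this is 0.
inner = integrate(x**2 * (x**2 + b), (x, 0, 1))
print(solve(inner, b))  # [-3/5]
```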
Your questions and solutions
If you’d like to ask a question about or reply to a question on this sheet, you can do so by pressing the “Reply on Discourse” button below.