MML - Review for Exam 4
Our fourth exam of the term will be next Friday, April 17. This review sheet is again meant to help you succeed on that exam.
Generally, I will expect solutions to the problems, as opposed to just answers. So, for example, if the answer to an optimization problem is \(y=5\), then the solution will consist of a clear explanation with correctly written supporting computations indicating why the answer is \(y=5\).
Note that there is a link at the bottom of this sheet labeled either “Start Discussion” or “Continue Discussion”, depending on whether a discussion has already started. Following that link will take you to my forum, where you can create an account with your UNCA email address. You can ask questions and/or post responses and answers there.
The problems
Write down a careful definition of each of the following.
- Similarity of matrices \(A\) and \(B\)
- Principal component of a data matrix \(X\)
- The sigmoid function
- The ReLU function
- The softmax function
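As a quick self-check on the last three definitions, here is one common way to write the sigmoid, ReLU, and softmax functions in NumPy (illustrative only; the exam asks for the mathematical definitions, not code):

```python
import numpy as np

def sigmoid(x):
    # sigmoid(x) = 1 / (1 + e^{-x}); squashes any real number into (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

def relu(x):
    # ReLU(x) = max(0, x), applied elementwise
    return np.maximum(0.0, x)

def softmax(z):
    # softmax(z)_i = e^{z_i} / sum_j e^{z_j}; subtracting max(z) avoids overflow
    e = np.exp(z - np.max(z))
    return e / e.sum()

print(sigmoid(0.0))                    # 0.5
print(relu(np.array([-2.0, 3.0])))     # [0. 3.]
print(softmax(np.array([1.0, 1.0])))   # [0.5 0.5]
```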
Diagonalize the matrix \[A = \left( \begin{array}{cc} 5 & 1 \\ 1 & 5 \\ \end{array} \right)\] That is, express the matrix as a factorization \(A=SDS^{-1}\) where \(D\) is diagonal. You do not need to compute the inverse of \(S\) explicitly.
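NumPy can check a hand computation here: `np.linalg.eig` returns the eigenvalues of \(A\) along with a matrix whose columns are the corresponding eigenvectors (a sketch for checking your work, not something the exam asks for):

```python
import numpy as np

A = np.array([[5.0, 1.0],
              [1.0, 5.0]])

# eig returns the eigenvalues and a matrix S whose columns are eigenvectors
eigvals, S = np.linalg.eig(A)
D = np.diag(eigvals)

print(eigvals)  # the eigenvalues of A (6 and 4, in some order)
# Verify the factorization A = S D S^{-1}
print(np.allclose(S @ D @ np.linalg.inv(S), A))  # True
```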
Diagonalize the matrix \[B = \left( \begin{array}{ccc} 1 & 1 & 4 \\ 0 & -2 & 0 \\ 0 & -1 & -3 \\ \end{array} \right)\] That is, express the matrix as a factorization \(B=SDS^{-1}\) where \(D\) is diagonal. You do not need to compute the inverse of \(S\) explicitly.
Comment: For this problem, I’d probably give you the eigenvalues and eigenvectors.

Compute \[A^{42}\begin{bmatrix}1 \\ 1\end{bmatrix}\] where \(A\) is the matrix in problem 2, and \[B^{42}\begin{bmatrix}1 \\ 1 \\ 1\end{bmatrix}\] where \(B\) is the matrix in problem 3.
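The point of diagonalization here is that once \(A = SDS^{-1}\), we have \(A^{42} = SD^{42}S^{-1}\), and \(D^{42}\) just raises each diagonal entry to the 42nd power. A quick numerical check of that identity for the matrix \(A\) from problem 2 (illustrative only):

```python
import numpy as np

A = np.array([[5.0, 1.0],
              [1.0, 5.0]])
v = np.array([1.0, 1.0])

eigvals, S = np.linalg.eig(A)
# A^42 = S D^42 S^{-1}, where D^42 is diagonal with entries lambda_i^42
A42 = S @ np.diag(eigvals**42) @ np.linalg.inv(S)

# Compare against brute-force repeated multiplication
print(np.allclose(A42 @ v, np.linalg.matrix_power(A, 42) @ v))  # True
```

In fact, for this particular \(v\) no factorization is needed at all: \(v\) is itself an eigenvector of \(A\), so \(A^{42}v\) is just the 42nd power of its eigenvalue times \(v\).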
Suppose that \(A\) is similar to \(B\) and that \(\lambda\) is an eigenvalue of \(A\) with corresponding eigenvector \(\vec{x}\). Show that \(\lambda\) is also an eigenvalue of \(B\). What is the corresponding eigenvector of \(B\)?
Bob says that every \(3\times3\) matrix is similar to its own inverse. Provide a counterexample showing that Bob is wrong.
Suppose that \(A\), \(B\) and \(C\) are \(n\times n\) matrices with \(A\) similar to \(B\) and \(B\) similar to \(C\). Use the definition of similarity to prove that \(A\) is similar to \(C\).
Consider the data matrix \(X\) given by
\[X = \left( \begin{array}{cc} 1 & 3 \\ 2 & 2 \\ 3 & 1 \\ \end{array} \right),\] whose columns are the variables \(\mathbf{x}_1\) and \(\mathbf{x}_2\).

- The principal components of \(X\) are the eigenvectors of what matrix?
- In what direction does the first principal component point?
It might help to draw a picture!
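A picture really does settle this one, but the computation can also be sketched in NumPy. The conventions below (centering the columns, dividing by \(n-1\)) are one common choice for forming the covariance matrix:

```python
import numpy as np

X = np.array([[1.0, 3.0],
              [2.0, 2.0],
              [3.0, 1.0]])

# Center each column, then form the covariance matrix
Xc = X - X.mean(axis=0)
C = Xc.T @ Xc / (len(X) - 1)

eigvals, vecs = np.linalg.eig(C)
# The first principal component is the eigenvector with the largest eigenvalue
pc1 = vecs[:, np.argmax(eigvals)]
print(C)    # [[ 1. -1.], [-1.  1.]]
print(pc1)  # lies along the direction (1, -1), up to sign
```

Note that the three data points fall on a line of slope \(-1\), which is exactly the direction the first principal component picks out.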
Let \(f(x) = \sin(x^3 + 2(x^3+1))\).
- Draw out an expression graph for \(f\) as it’s written. Do be sure to reuse any expressions that you can.
- Trace the evaluation of \(f(2)\) through the expression graph.
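One way to organize the trace is to evaluate each node of the graph exactly once, in order. The sketch below does this for \(f(2)\); the variable names `u`, `a`, `b`, `c` are hypothetical labels for the graph's nodes, with `u` being the reused \(x^3\) node:

```python
import math

def f_trace(x):
    # Each variable below is one node of the expression graph
    u = x**3       # shared node: x^3 (used twice)
    a = u + 1      # x^3 + 1
    b = 2 * a      # 2(x^3 + 1)
    c = u + b      # x^3 + 2(x^3 + 1)
    return math.sin(c)

print(f_trace(2))  # u=8, a=9, b=18, c=26, so f(2) = sin(26)
```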
Let \(f(x,y) = \sin(x^3 + 2(x^3+y^2))\).
- Draw out an expression graph for \(f\) as it’s written. Do be sure to reuse any expressions that you can.
- Trace the evaluation of \(f(2,-1)\) through the expression graph.
The neural network shown in Figure 1 below consists of three layers:
- the input layer,
- one hidden layer, and
- the output layer.
Let’s also suppose that the input layer has a ReLU activation and the output layer has a sigmoid activation.
Note that the inputs are given. Use those inputs together with forward propagation to compute the value produced by this neural network.
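The actual weights and inputs come from Figure 1; the numbers below are made up purely to illustrate the mechanics of forward propagation, with ReLU applied at the hidden layer's inputs and sigmoid at the output:

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Hypothetical numbers -- the real ones come from Figure 1
x  = np.array([1.0, 2.0])        # inputs
W1 = np.array([[0.5, -1.0],
               [1.0,  0.5]])     # input -> hidden weights
W2 = np.array([1.0, -2.0])       # hidden -> output weights

h = relu(W1 @ x)     # hidden layer: weighted sum, then ReLU
y = sigmoid(W2 @ h)  # output layer: weighted sum, then sigmoid
print(h)             # [0. 2.]
print(y)
```

On the exam you'd do the same arithmetic by hand: a weighted sum at each node, followed by that layer's activation.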
Consider the neural network shown in Figure 2. Probabilities for the three possible outputs (red, yellow, or blue, from top to bottom) are to be computed by applying the softmax to the values currently indicated.

- Which value (red, yellow, or blue) is most likely?
- What is the probability associated with that value?
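The computation itself looks like this, with made-up output-layer values standing in for the ones shown in Figure 2:

```python
import numpy as np

# Hypothetical output-layer values for red, yellow, blue (Figure 2 has the real ones)
z = np.array([2.0, 1.0, 0.5])
labels = ["red", "yellow", "blue"]

# Softmax turns the raw values into probabilities that sum to 1
p = np.exp(z) / np.exp(z).sum()
print(dict(zip(labels, p.round(3))))
print("most likely:", labels[int(np.argmax(p))])  # the largest raw value wins
```

Since softmax is increasing in each coordinate, the most likely output is always the one with the largest raw value; the softmax just tells you how likely.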
Figures
Your questions and answers
If you’d like to ask a question about or reply to a question on this sheet, you can do so by pressing the “Reply on Discourse” button below.