Some neural network HW

  1. Let \(f(x) = \sin(x^3 + 2(x^3+1))\).

    1. Draw an expression graph for \(f\) as it is written. Be sure to reuse any subexpressions that you can.
    2. Trace the evaluation of \(f(2)\) through the expression graph.
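
For the tracing step, here is a minimal sketch in Python of how the shared \(x^3\) node gets reused during evaluation. The variable names (cube, inner, scaled, total) are my own labels for the graph's nodes, not part of the problem.

```python
# Sketch only: evaluate f(x) = sin(x^3 + 2(x^3 + 1)) as an expression
# graph, computing the shared x^3 node once and reusing it.
import math

def trace_f(x):
    cube   = x ** 3           # shared node: x^3
    inner  = cube + 1         # x^3 + 1
    scaled = 2 * inner        # 2(x^3 + 1)
    total  = cube + scaled    # x^3 + 2(x^3 + 1); reuses the x^3 node
    result = math.sin(total)
    print(f"x^3={cube}, x^3+1={inner}, 2(x^3+1)={scaled}, "
          f"sum={total}, f({x})={result:.4f}")
    return result

trace_f(2)   # x^3 = 8, sum = 8 + 2*9 = 26, f(2) = sin(26) ≈ 0.7626
```
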
  2. Let \(f(x,y) = \sin(x^3 + 2(x^3+y^2))\).

    1. Draw an expression graph for \(f\) as it is written. Be sure to reuse any subexpressions that you can.
    2. Trace the evaluation of \(f(2,-1)\) through the expression graph.
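
For comparison with a hand trace (assuming the \(x^3\) node is shared between its two uses), the intermediate values at \((x, y) = (2, -1)\) work out as follows: \[ x^3 = 8, \qquad y^2 = 1, \qquad x^3 + y^2 = 9, \qquad 2(x^3 + y^2) = 18, \qquad x^3 + 18 = 26, \qquad f(2,-1) = \sin(26) \approx 0.763. \]
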
  3. Compute the convolution of the data \(D\) with the kernel \(K\) given by \[ D = \begin{bmatrix}1 & 2 & 3 & 4 & 5 & 6\end{bmatrix} \quad \text{ and } \quad K = \begin{bmatrix}1 & -2 & 1\end{bmatrix}. \]
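
As a quick check on the hand computation, the same one-dimensional convolution can be done with NumPy. This sketch assumes a "valid" convolution (no padding); since \(K\) is symmetric, flipping the kernel (true convolution) and sliding it directly (cross-correlation) give the same answer.

```python
# Sketch only: check the 1D convolution of D with K (assumes "valid"
# mode, i.e. no padding at the ends).
import numpy as np

D = np.array([1, 2, 3, 4, 5, 6])
K = np.array([1, -2, 1])

# np.convolve flips the kernel, but K is symmetric, so convolution and
# cross-correlation coincide here.
print(np.convolve(D, K, mode="valid"))   # [0 0 0 0]: the second-difference
                                         # kernel annihilates linear data
```
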
  4. Consider the two two-dimensional kernels \[ K_1 = \begin{bmatrix}1&1&1\\1&-8&1\\1&1&1\end{bmatrix} \quad \text{ and } \quad K_2 = \begin{bmatrix}1&1&1&1&1\\1&1&1&1&1\\1&1&-24&1&1\\1&1&1&1&1\\1&1&1&1&1\end{bmatrix}. \]
    1. Could these be appropriate for edge detection in image processing? Why or why not?
    2. What kind of difference might you expect in the behavior of these two kernels?
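
If you want to experiment with these, here is a minimal sketch that applies both kernels to a toy image containing a single vertical edge. The 8x8 test image, the "valid" boundary mode, and the use of SciPy are my own choices, not part of the problem.

```python
# Sketch only: apply K1 and K2 to a toy image with one vertical edge.
import numpy as np
from scipy.signal import convolve2d

K1 = np.array([[1,  1, 1],
               [1, -8, 1],
               [1,  1, 1]])

K2 = np.ones((5, 5))
K2[2, 2] = -24          # both kernels sum to zero

# Toy image: dark left half, bright right half (vertical edge at column 4).
img = np.zeros((8, 8))
img[:, 4:] = 1.0

# Both kernels are unchanged by a 180-degree rotation, so convolution
# and cross-correlation agree; large |response| marks the edge columns.
print(convolve2d(img, K1, mode="valid"))
print(convolve2d(img, K2, mode="valid"))
```
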
  5. The neural network shown in Figure 1 below consists of three layers.

Let’s also suppose that the input layer has a ReLU activation and the output layer has a sigmoid activation.

Note that the inputs are given. Use those inputs together with forward propagation to compute the value produced by this neural network.

Figure 1: A simple neural network
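
Since the actual inputs, weights, and biases live in Figure 1 (not reproduced here), the sketch below uses placeholder numbers and assumes, just for concreteness, a 2-3-1 layout with ReLU after the first weighted sum and a sigmoid at the output. Swap in the layer sizes and values from the figure.

```python
# Sketch only: forward propagation through a small dense network.
# ALL numbers below are placeholders; substitute the values in Figure 1.
import numpy as np

def relu(z):
    return np.maximum(0.0, z)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

x  = np.array([1.0, -2.0])             # placeholder inputs
W1 = np.array([[ 0.5, -1.0],
               [ 1.0,  0.5],
               [-0.5,  1.0]])          # placeholder weights into layer 1
b1 = np.zeros(3)                       # placeholder biases
W2 = np.array([[1.0, -1.0, 0.5]])      # placeholder weights into output
b2 = np.zeros(1)

h = relu(W1 @ x + b1)                  # first weighted sum, then ReLU
y = sigmoid(W2 @ h + b2)               # output weighted sum, then sigmoid
print("hidden values:", h, "network output:", y)
```
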