Here are my solutions to exercise 6.

### The Backpropagation Algorithm

### Part 1 - Backpropagation with a Single Modified Neuron

#### Question

Suppose we modify a single neuron in a feedforward network so that the output from the neuron is given by $f(∑_{j}w_{j}x_{j}+b)$, where $f$ is some function other than the sigmoid. How should we modify the backpropagation algorithm in this case?

#### Solution

We first need the derivative $f'$ of the new activation function, since $\sigma'$ appears in the backpropagation error equations $\delta^L = \nabla_a C \odot \sigma'(z^L)$ and $\delta^l = ((w^{l+1})^T \delta^{l+1}) \odot \sigma'(z^l)$: wherever the modified neuron's component appears, we replace $\sigma'(z_j)$ with $f'(z_j)$. Other than that, the algorithm is unchanged. You might think more tweaking is needed because $\delta_j$ depends on the activation function, but we defined $\delta^l_j = \frac{\partial C}{\partial z^l_j}$ rather than $\frac{\partial C}{\partial a^l_j}$ precisely because it makes our lives easier in cases like this!
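As a minimal sketch of this change, the backward step for one layer can swap in $f'$ for the single modified neuron. The helper and the choice $f = \tanh$ below are made up for illustration; only the one component of $\sigma'(z^l)$ changes:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_prime(z):
    return sigmoid(z) * (1.0 - sigmoid(z))

# Hypothetical replacement activation for the single modified neuron:
# here f(z) = tanh(z), so f'(z) = 1 - tanh(z)^2.
def f_prime(z):
    return 1.0 - np.tanh(z) ** 2

def layer_delta(w_next, delta_next, z, modified_index=None):
    """delta^l = ((w^{l+1})^T delta^{l+1}) * sigma'(z^l), except that the
    modified neuron's component uses f'(z_j) instead of sigma'(z_j)."""
    sp = sigmoid_prime(z)
    if modified_index is not None:
        sp[modified_index] = f_prime(z[modified_index])
    return (w_next.T @ delta_next) * sp
```

Only the error for the modified neuron itself differs; every other component of $\delta^l$, and all the other backpropagation equations, stay exactly as before.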

From Michael Nielsen:

You might wonder why the demon is changing the weighted input $z^l_j$. Surely it'd be more natural to imagine the demon changing the output activation $a^l_j$, with the result that we'd be using $\frac{\partial C}{\partial a^l_j}$ as our measure of error. In fact, if you do this things work out quite similarly to the discussion below. But it turns out to make the presentation of backpropagation a little more algebraically complicated. So we'll stick with $\delta^l_j = \frac{\partial C}{\partial z^l_j}$ as our measure of error.

### Part 2 - Backpropagation with Linear Neurons

#### Question

Suppose we replace the usual non-linear $σ$ function with $σ(z)=z$ throughout the network. Rewrite the backpropagation algorithm for this case.

#### Solution

Since $\sigma(z) = z$, we have $\sigma'(z) = 1$, so the output error becomes $\delta^L = \nabla_a C \odot \sigma'(z^L) = \nabla_a C$, and the backpropagated error simplifies to $\delta^l = ((w^{l+1})^T \delta^{l+1}) \odot \sigma'(z^l) = (w^{l+1})^T \delta^{l+1}$. The forward pass likewise simplifies to $a^l = w^l a^{l-1} + b^l$, while the gradient equations $\frac{\partial C}{\partial w^l_{jk}} = a^{l-1}_k \delta^l_j$ and $\frac{\partial C}{\partial b^l_j} = \delta^l_j$ are unchanged.
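The rewritten algorithm can be sketched end-to-end for the quadratic cost $C = \frac{1}{2}\|a^L - y\|^2$ (so $\nabla_a C = a^L - y$); the function name and layer shapes below are made up for illustration:

```python
import numpy as np

def backprop_linear(weights, biases, x, y):
    """Backpropagation for a network of linear neurons (sigma(z) = z),
    with quadratic cost C = 0.5 * ||a^L - y||^2."""
    # Forward pass: a^l = w^l a^{l-1} + b^l (no sigma to apply).
    activations = [x]
    for w, b in zip(weights, biases):
        activations.append(w @ activations[-1] + b)

    # Output error: delta^L = grad_a C, since the sigma'(z^L) factor is 1.
    delta = activations[-1] - y
    grads_w, grads_b = [], []
    for l in range(len(weights) - 1, -1, -1):
        # Gradient equations are unchanged: dC/dw^l = delta^l (a^{l-1})^T.
        grads_w.insert(0, np.outer(delta, activations[l]))
        grads_b.insert(0, delta.copy())
        if l > 0:
            # delta^l = (w^{l+1})^T delta^{l+1}, again with sigma' = 1.
            delta = weights[l].T @ delta
    return grads_w, grads_b
```

A quick sanity check is to compare these gradients against a finite-difference estimate of the cost; for a linear network the whole map $x \mapsto a^L$ collapses into a single affine function, which is why depth buys nothing here.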