Here are my solutions to exercise 2.
The Architecture of Neural Networks
Question
There is a way of determining the bitwise representation of a digit by adding an extra layer to the
three-layer network above. The extra layer converts the output from the previous layer into a binary
representation, as illustrated in the figure below. Find a set of weights and biases for the new
output layer. Assume that the first 3 layers of neurons are such that the correct output in the
third layer (i.e., the old output layer) has activation at least 0.99, and incorrect outputs have
activation less than 0.01.
Solution
This amounts to converting a base-ten value into a base-two value. I first listed out all the possible digits and their binary representations.
$$
\begin{aligned}
(9)_{10} &= (1001)_2 \\
(8)_{10} &= (1000)_2 \\
(7)_{10} &= (0111)_2 \\
(6)_{10} &= (0110)_2 \\
(5)_{10} &= (0101)_2 \\
(4)_{10} &= (0100)_2 \\
(3)_{10} &= (0011)_2 \\
(2)_{10} &= (0010)_2 \\
(1)_{10} &= (0001)_2 \\
(0)_{10} &= (0000)_2
\end{aligned}
$$
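As a quick sanity check (my own addition, not part of the exercise), a tiny Python snippet can print this same table using the built-in binary formatting:

```python
# Print each digit 0-9 next to its 4-bit binary representation.
for n in range(9, -1, -1):
    print(f"({n})_10 = ({n:04b})_2")
```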
Node One
I noticed that the first thing I could do was solve for the first binary digit. The first binary digit tells us whether the number is odd or even. I found that a bias is unnecessary, since the dot product already comes out close to either 1 or 0. The weights for the first node are as follows.
I will denote the weight from the neuron for digit $j$ in the old output layer by $w_j$:
$$
\begin{aligned}
w_9 &= 1, & w_8 &= 0, & w_7 &= 1, & w_6 &= 0, & w_5 &= 1, \\
w_4 &= 0, & w_3 &= 1, & w_2 &= 0, & w_1 &= 1, & w_0 &= 0
\end{aligned}
$$
So if the value is odd then $w \cdot x \approx 1$, and if it is even then $w \cdot x \approx 0$.
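To see why no bias is needed, here is a small sketch (my own illustration; the 0.99/0.01 activations are the idealized worst case taken from the exercise statement) that applies node one's weights to the old output layer:

```python
# Node one weights: 1 for the odd digits, 0 for the even digits.
w = [1 if j % 2 == 1 else 0 for j in range(10)]  # index j = digit j

def old_layer_activations(digit):
    """Idealized third-layer output: 0.99 for the correct digit, 0.01 otherwise."""
    return [0.99 if j == digit else 0.01 for j in range(10)]

for digit in range(10):
    x = old_layer_activations(digit)
    dot = sum(wj * xj for wj, xj in zip(w, x))
    # Roughly 1.03 for odd digits and 0.05 for even digits.
    print(digit, round(dot, 2))
```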
Node Two
Node two should only be activated when the number is one of $\{2, 3, 6, 7\}$, since those are the digits whose second binary digit is 1:
$$
\begin{aligned}
w_9 &= 0, & w_8 &= 0, & w_7 &= 1, & w_6 &= 1, & w_5 &= 0, \\
w_4 &= 0, & w_3 &= 1, & w_2 &= 1, & w_1 &= 0, & w_0 &= 0
\end{aligned}
$$
Node Three
Node three should only be activated when the number is one of $\{4, 5, 6, 7\}$, the digits whose third binary digit is 1:
$$
\begin{aligned}
w_9 &= 0, & w_8 &= 0, & w_7 &= 1, & w_6 &= 1, & w_5 &= 1, \\
w_4 &= 1, & w_3 &= 0, & w_2 &= 0, & w_1 &= 0, & w_0 &= 0
\end{aligned}
$$
Node Four
Node four should only be activated when the number is one of $\{8, 9\}$, the digits whose fourth binary digit is 1:
$$
\begin{aligned}
w_9 &= 1, & w_8 &= 1, & w_7 &= 0, & w_6 &= 0, & w_5 &= 0, \\
w_4 &= 0, & w_3 &= 0, & w_2 &= 0, & w_1 &= 0, & w_0 &= 0
\end{aligned}
$$
Again, there are no biases for any of the nodes.
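To tie the four nodes together, here is a short verification sketch (again my own addition) that builds the full $4 \times 10$ weight matrix, computes the weighted input $w \cdot x$ for each new node with no biases, and checks that every digit maps to its correct binary representation. The 0.5 cutoff is just my illustrative threshold for "activated"; the bit-shift construction of each row reproduces exactly the weights listed above, with node one corresponding to the least significant bit.

```python
# Weight matrix: row k holds the weights for new output node k (binary digit 2^k),
# column j is the weight from the old-output neuron for digit j.
# Row k has weight 1 exactly for the digits whose k-th binary digit is 1.
W = [[1 if (j >> k) & 1 else 0 for j in range(10)] for k in range(4)]

def old_layer_activations(digit):
    """Idealized third-layer output: 0.99 for the correct digit, 0.01 otherwise."""
    return [0.99 if j == digit else 0.01 for j in range(10)]

for digit in range(10):
    x = old_layer_activations(digit)
    # Weighted input to each of the four new output nodes (no biases).
    z = [sum(wj * xj for wj, xj in zip(row, x)) for row in W]
    bits = [1 if zk > 0.5 else 0 for zk in z]          # threshold each node
    expected = [(digit >> k) & 1 for k in range(4)]    # binary digits, LSB first
    assert bits == expected, (digit, bits, expected)

print("All ten digits map to the correct binary representation.")
```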