Good afternoon,
I am currently working on Neural Networks and I am reading an introduction by Jeff Heaton (Neural Networks in Java).
There are two tasks whose solutions interest me. The first task is about applying Hebb's rule. In the book it is given incorrectly because of a typo, but I googled Hebb's rule and found it in its "correct" form:
##\Delta w_{ij} = \mu \cdot a_i \cdot a_j##
##\Delta w_{ij}## weight change for the connection from neuron ##i## to neuron ##j##
##\mu## learning rate
##a_i, a_j## activation of each neuron
The first task says: use Hebb's rule to calculate the weight adjustment, given the following: two neurons N1 and N2, weight from N1 to N2: 3, N1 activation: 2, N2 activation: 6.
I applied the rule directly; additionally, I have to update the old weight, so ##w_{new} = w_{old} + \Delta w_{ij}##. Doing this, I get:
##w_{new} = 3 + 1\cdot 2\cdot 6 = 15## I have assumed here that the learning rate ##\mu## is 1!
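To check my arithmetic, here is a minimal Java sketch of how I understand the update (my own code and variable names, not from the book; again assuming ##\mu = 1##):

```java
public class HebbExample {
    public static void main(String[] args) {
        double learningRate = 1.0; // assumed: mu = 1
        double oldWeight = 3.0;    // weight N1 -> N2
        double a1 = 2.0;           // activation of N1
        double a2 = 6.0;           // activation of N2

        // Hebb's rule: delta w = mu * a_i * a_j
        double deltaW = learningRate * a1 * a2;
        double newWeight = oldWeight + deltaW;

        System.out.println("delta w    = " + deltaW);    // prints 12.0
        System.out.println("new weight = " + newWeight); // prints 15.0
    }
}
```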
The second task says: use the delta rule to calculate the weight adjustment, given the following: two neurons N1 and N2, weight from N1 to N2: 3, N1 activation: 2, N2 activation: 6, expected output: 5.
The delta rule is given in the book as follows:
##\Delta w_{ij} = 2\cdot\mu \cdot x_i \cdot (ideal-actual)_j##
The following is then added:
##\Delta w_{ij}## weight change for the connection from neuron ##i## to neuron ##j##
##\mu## learning rate
The variable ideal represents the desired output of neuron ##j##. The variable actual represents the actual output of neuron ##j##, so (ideal - actual) is the error. ##x_i## is the input for the neuron in question (from the video). Alternatively, I found a video by Jeff Heaton in which he explains this (from minute 5:00, see )
I'm not sure about this task, because overall the term "activation" confuses me a bit. But if I understand the formula correctly, the update is ##w_{new} = w_{old} + \Delta w_{ij}##, where ##\Delta w_{ij}= 2\cdot \mu\cdot x_i\cdot(ideal-actual)_j##. From this it follows (for me): ##\Delta w_{ij} = 2\cdot 1\cdot 2\cdot (5-6) = -4##, so ##w_{new} = 3 + (-4) = -1##. I have assumed here that the learning rate ##\mu## is 1!
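Again, a minimal Java sketch of the same calculation (my own code, not from the book); here I assume ##x_i## is the activation of N1 and "actual" is the activation of N2, which is exactly the interpretation I am unsure about:

```java
public class DeltaRuleExample {
    public static void main(String[] args) {
        double learningRate = 1.0; // assumed: mu = 1
        double oldWeight = 3.0;    // weight N1 -> N2
        double xI = 2.0;           // input x_i, taken to be the activation of N1
        double actual = 6.0;       // actual output, taken to be the activation of N2
        double ideal = 5.0;        // expected (desired) output of N2

        // Delta rule as given in the book: delta w = 2 * mu * x_i * (ideal - actual)
        double deltaW = 2.0 * learningRate * xI * (ideal - actual);
        double newWeight = oldWeight + deltaW;

        System.out.println("delta w    = " + deltaW);    // prints -4.0
        System.out.println("new weight = " + newWeight); // prints -1.0
    }
}
```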
I'm not sure that's right. I'd be curious to hear your opinions.
I also found the book on Google Books and have linked the corresponding page here: Google Books Link
Important: this is not homework! I bought the book out of interest, and I am just reading it and working through the tasks in it.