Could really use some help with the notation in this example

  • Context: Graduate
  • Thread starter: CraigH
  • Tags: Example Notation

Discussion Overview

The discussion revolves around understanding the notation and mathematical representation of artificial neural networks, specifically focusing on the summation formula used to calculate the values at hidden nodes based on input values and associated weights. Participants are exploring the implications of subscripts in the notation and how to interpret them in the context of the network's structure.

Discussion Character

  • Technical explanation
  • Conceptual clarification
  • Debate/contested

Main Points Raised

  • One participant describes the process of summing inputs at nodes and applying weights, expressing uncertainty about the meaning of subscripts in the weight notation.
  • Another participant interprets the subscripts, suggesting that the first subscript indexes the input and the second indexes the hidden node.
  • There is a question about whether the weight notation w12 represents a product of weights from different layers, which is challenged by another participant who clarifies that weights are associated with connections, not nodes.
  • Participants discuss the specific meaning of w12, with one concluding that it represents the weight of the connection between input i=1 and hidden node j=2.
  • One participant expresses uncertainty and plans to revisit the topic later for further clarification.

Areas of Agreement / Disagreement

Participants generally agree on the interpretation of the notation, but there is some confusion regarding the relationship between weights and nodes, leading to differing views on how to evaluate the weights in the context of the formula.

Contextual Notes

Some participants express uncertainty about the notation and its implications, indicating a need for further clarification on the mathematical relationships within the neural network structure.

Who May Find This Useful

Readers interested in artificial neural networks, mathematical notation in machine learning, or those seeking clarification on the workings of neural network architectures may find this discussion beneficial.

CraigH
http://img59.imageshack.us/img59/6414/puag.png

The picture above is used to help understand artificial neural networks; however, for now this doesn't matter, as I'm only concerned with the maths.

A number is input at the left-hand end of each horizontal line. The numbers travel along the lines from left to right, and at each node (circle) the incoming numbers are summed together and the result is sent out the other side.

Each line also has an associated "weight": this weight is multiplied by the number travelling along the line before it reaches the attached node. For example, if the numbers (2, 2, 2) are input and all of the lines have a weight of 2, then the final output will be 192.

This is all I understand so far. The rest I have guessed using the information given in the picture above.

I'm guessing that i is a variable that ranges from 1 to 3, j is a variable that ranges from 1 to 2, and k is a variable that is always 1.

So in the formula

$$x_{j}=\sum_{i}w_{ij}*I_{i}$$

x_{j} is the value of the 'j'th hidden node

and

I_{i} is the value of the 'i'th input

e.g. I_{1} will be the 1st input (going into the top horizontal line)

w_{ij} will be the "weight" of the line before the node; however, I do not understand what to do when there are two subscripts on it.

So, for example, the value of the 2nd hidden node will be

$$x_{2}=\sum_{i=1}^3 w_{i2}*I_{i}$$

I just do not understand how to evaluate the sum when a variable has two subscripts, as is the case with w_{ij}.

can someone please explain?

Or have I completely misunderstood the whole network and I'm getting everything wrong?

Thanks!
 
As I interpret this, i indicates which input (1, 2, or 3), and j indicates which hidden node (1 or 2).

$$\sum_{i = 1}^3 w_{i2} * I_i $$
means w12 * I1 + w22 * I2 + w32 * I3
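This expansion can be checked numerically. A minimal Python sketch, with w stored as a 2D list indexed [i][j] (the weight values here are invented purely for illustration):

```python
# Hypothetical weights: w[i][j] is the weight on the line from input i to hidden node j.
# Python indices are 0-based, so w[0][1] corresponds to w_12 in the thread's notation.
w = [[0.5, 1.0],
     [2.0, 0.25],
     [1.5, 3.0]]
I = [2, 2, 2]  # the three inputs

# x_2 = sum over i of w_i2 * I_i (second hidden node, column index 1 in 0-based terms)
x2 = sum(w[i][1] * I[i] for i in range(3))
print(x2)  # 1.0*2 + 0.25*2 + 3.0*2 = 8.5
```

The double subscript just means w is a table of numbers, one per line in the diagram; fixing j = 2 picks out one column of that table, and the sum runs down it.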
 
Mark44 said:
means w12 * I1 + w22 * I2 + w32 * I3

But what does w12 mean?

Say the 1st input line has a weight of 3 and the 2nd hidden-layer line has a weight of 4; would w12 then mean 3*4?

so I1 would be multiplied by 12?
 
CraigH said:
But what does w12 mean?

Say the 1st input line has a weight of 3 and the 2nd hidden-layer line has a weight of 4; would w12 then mean 3*4?

so I1 would be multiplied by 12?
I don't think so. There are six lines that join the three input nodes to the two hidden nodes. The weights are associated with the lines, not the nodes. That's my take, anyway, but I could be wrong.
 
The layers don't have weights; the individual connections do. w12 is the weight with which input i=1 influences the hidden node j=2.
 
Thanks for your answers, Mark44 and mfb. I'm still not fully sure about this question, but I'll come back to it in the morning when I'm less tired and have another go at it. Thanks!
 
mfb said:
w12 is the weight with which input i=1 influences the hidden node j=2.

Ah, so w12 is the weight of the line between i=1 and j=2. This seems obvious now. Thank you!
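Putting the thread's conclusion together: a short sketch of the full hidden-layer computation, assuming the 3-input, 2-hidden-node layout discussed above (the weight values are again invented for illustration):

```python
# w[i][j]: weight of the connection from input i to hidden node j (0-based indices)
w = [[1.0, 2.0],
     [3.0, 4.0],
     [5.0, 6.0]]
I = [2, 2, 2]

# x_j = sum_i w_ij * I_i, computed for every hidden node j
x = [sum(w[i][j] * I[i] for i in range(3)) for j in range(2)]
print(x)  # [(1+3+5)*2, (2+4+6)*2] = [18.0, 24.0]
```

Each hidden node j reads one column of the weight table, so there is one number per connection, never a product of "layer weights".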
 
