This may be a silly question, but if I have an 8x3 matrix X, for example, then the rows of this matrix will span R3 (and will be linearly dependent). When we find the solution to:
Xw=t
where t is an 8x1 vector whose entries are all the scalar t. Each row of this system can then be written as
w_{1}x_{i1}+w_{2}x_{i2}+w_{3}x_{i3} = t.
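Writing out row i of the product explicitly (this is just the same equation, restated to make the matrix-vector product visible):
$$\begin{pmatrix} x_{i1} & x_{i2} & x_{i3} \end{pmatrix}\begin{pmatrix} w_{1} \\ w_{2} \\ w_{3} \end{pmatrix} = w_{1}x_{i1} + w_{2}x_{i2} + w_{3}x_{i3} = t.$$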
Each row then forms a unique plane in R3, correct? Does the matrix Xw form a plane? I'm learning about Perceptrons, a form of Artificial Neural Network, in which each row of data is classified as either
y^{'} = +1 or -1, depending on whether w_{1}x_{i1}+w_{2}x_{i2}+w_{3}x_{i3} > t or w_{1}x_{i1}+w_{2}x_{i2}+w_{3}x_{i3} < t.
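To make sure I'm reading the rule correctly, here is a minimal NumPy sketch of that classification (the values of X, w, and t below are made up, not from the book):

```python
import numpy as np

# Made-up 8x3 data matrix: one row per data point, three features each.
X = np.array([
    [ 1.0,  2.0,  0.5],
    [-1.0,  0.0,  3.0],
    [ 2.0, -1.0,  1.0],
    [ 0.5,  0.5,  0.5],
    [-2.0,  1.0, -1.0],
    [ 3.0,  0.0,  0.0],
    [ 0.0, -1.0,  2.0],
    [ 1.0,  1.0,  1.0],
])
w = np.array([0.4, -0.2, 0.7])  # weight vector (arbitrary values)
t = 0.5                         # threshold (arbitrary value)

# Row i gets the label +1 if w_1*x_i1 + w_2*x_i2 + w_3*x_i3 > t, else -1.
scores = X @ w                  # all eight dot products at once
y_pred = np.where(scores > t, 1, -1)
print(y_pred)
```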
The book states that in the above situation, "The perceptron model [in the example above] is linear in its parameters w and x. Because of this, the decision boundary of a perceptron, which is obtained by setting y^{'}=0, is a linear hyperplane that separates the data into two classes, -1 and +1."
I'm having a really hard time understanding what this quote is trying to say, because I don't see how Xw=0 "forms a hyperplane" in x-space.