rayzhu52
Hi everyone,
My Linear Algebra Professor recently had a lecture on Orthogonal projections.
Say for example, we are given the vectors:
y = [3, -1, 1, 13], v1 = [1, -2, -1, 2] and v2 = [-4, 1, 0, 3]
To find the projection of y onto span{v1, v2}, we first check if the set {v1, v2} is orthogonal:
v1 • v2 = -4 -2 + 0 + 6 = 0
So we know the set is orthogonal and we can now find the projection of y, or \hat{y}:
\hat{y} = [(y • v1)/(v1 • v1)] v1 + [(y • v2)/(v2 • v2)] v2
= (30/10) v1 + (26/26) v2
= 3[1, -2, -1, 2] + [-4, 1, 0, 3]
= [-1, -5, -3, 9]
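For anyone who wants to sanity-check the arithmetic, here is a small NumPy sketch of the same orthogonal-basis projection formula (the vector values are the ones from the example above):

```python
import numpy as np

# Vectors from the example.
y = np.array([3, -1, 1, 13], dtype=float)
v1 = np.array([1, -2, -1, 2], dtype=float)
v2 = np.array([-4, 1, 0, 3], dtype=float)

# Confirm the basis is orthogonal: v1 . v2 should be 0.
assert np.dot(v1, v2) == 0

# Projection of y onto span{v1, v2} using the orthogonal-basis formula:
# y_hat = [(y.v1)/(v1.v1)] v1 + [(y.v2)/(v2.v2)] v2
y_hat = (np.dot(y, v1) / np.dot(v1, v1)) * v1 \
      + (np.dot(y, v2) / np.dot(v2, v2)) * v2

print(y_hat)  # [-1. -5. -3.  9.]
```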
Now, we covered what it means for a set to be non-orthogonal (vi • vj ≠ 0 for some pair i ≠ j),
but what if we are asked to find \hat{y} for such a set?
Any form of help would be greatly appreciated!