Projection Matrix: Expressing Operator with Vectors

In summary: The matrix for the projection operator is ##P=\vec{e}\vec{e}^T##, where ##\vec{e}## is a unit basis vector.
  • #1
Robin04
If we express the projection operator with vectors, we get ##\hat{P}\vec{v} = \vec{e}(\vec{e}\cdot\vec{v})##, which means that we project ##\vec{v}## onto ##\vec{e}##. In components, ##(\hat{P}\vec{v})_k = e_k \sum_{l} e_lv_l = \sum_l (e_ke_l)v_l##. In my class we said that the matrix for the projection operator is ##P_{kl}=e_ke_l##, so ##(\hat{P}\vec{v})_k=\sum_l P_{kl} v_l##. But isn't ##e_ke_l## equal to ##\delta_{kl}##?
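The formulas in the post can be checked numerically. A minimal sketch with NumPy (the vectors ##\vec{e}## and ##\vec{v}## below are illustrative values, not from the thread): the matrix with entries ##P_{kl}=e_ke_l## is the outer product, and applying it reproduces ##\vec{e}(\vec{e}\cdot\vec{v})##.

```python
import numpy as np

# Project v onto the unit vector e via the matrix P_kl = e_k * e_l.
e = np.array([0.6, 0.8])          # unit vector: 0.36 + 0.64 = 1
v = np.array([2.0, 1.0])

P = np.outer(e, e)                # P_kl = e_k * e_l

# P v should equal e * (e . v)
print(P @ v)                      # → [1.2 1.6]
print(e * np.dot(e, v))           # → [1.2 1.6]
```

Note that ##P_{kl}## is not ##\delta_{kl}## here: for this ##\vec{e}##, the matrix is ##\begin{pmatrix}0.36 & 0.48\\ 0.48 & 0.64\end{pmatrix}##.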
 
  • #2
Robin04 said:
If we express the projection operator with vectors, we get ##\hat{P}\vec{v} = \vec{e}(\vec{e}\cdot\vec{v})##, which means that we project ##\vec{v}## onto ##\vec{e}##. In components, ##(\hat{P}\vec{v})_k = e_k \sum_{l} e_lv_l = \sum_l (e_ke_l)v_l##. In my class we said that the matrix for the projection operator is ##P_{kl}=e_ke_l##, so ##(\hat{P}\vec{v})_k=\sum_l P_{kl} v_l##. But isn't ##e_ke_l## equal to ##\delta_{kl}##?
Only in the case that ##e_i=(\delta_{ij})_j##. If we project onto a basis vector, we will get its component ##v_k##, so? And what is ##\vec{e}##, i.e. which coordinates does it have?
 
  • #3
fresh_42 said:
Only in the case that ##e_i=(\delta_{ij})_j##. If we project onto a basis vector, we will get its component ##v_k##, so? And what is ##\vec{e}##, i.e. which coordinates does it have?
I realized what I misunderstood: I thought that ##e_ke_l## is the scalar product of the basis vectors, but they're components of a single basis vector onto which we project. But how can ##P_{kl}## be ##\delta_{kl}##? I can't imagine it. You said it is only in the case when ##e_i=(\delta_{ij})_j##, but what does the last ##j## index mean?
 
  • #4
Robin04 said:
I realized what I misunderstood: I thought that ##e_ke_l## is the scalar product of the basis vectors,...
... which I think it exactly is ...
... but they're components of a single basis vector onto which we project.
Not sure what you mean, as you haven't said what the ##e_k## are, or what ##\vec{e}## should be. You have a general formula plus a calculation with coordinates, without saying what your vectors are with respect to this basis. So there is necessarily guesswork going on.
But how can ##P_{kl}## be ##\delta_{kl}##? I can't imagine it. You said it is only in the case when ##e_i=(\delta_{ij})_j##, but what does the last ##j## index mean?
Say we have an ##n##-dimensional vector space with basis vectors ##(1,0,\ldots,0)\, , \,(0,1,0,\ldots, 0) \, , \,\ldots## Then ##(\delta_{ij})_j = (\delta_{ij})_{1\leq j\leq n}= (\delta_{i1},\ldots,\delta_{ij},\ldots,\delta_{in})## is another way to write them. I suspect that the ##e_k## are exactly those vectors: ##e_k=(\delta_{kl})_l##, where ##k## is fixed and ##l## runs from ##1## to ##n##. Of course this still doesn't explain what ##\vec{e}## is: one of them, or ##\vec{e}=\sum_j c_je_j\,?##
 
  • #5
I believe it should be ##P=\vec{e}\vec{e}^T##, where ##\vec{e}## is a basis vector, e.g.
$$\vec{e}=\begin{pmatrix}1 \\ 0 \\ 0 \end{pmatrix}, \quad \vec{e}^T=\begin{pmatrix}1 & 0 & 0 \end{pmatrix} \rightarrow P=\begin{pmatrix}1 & 0 & 0 \\ 0 & 0 & 0 \\ 0 & 0 & 0 \end{pmatrix}.$$
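The example above can be reproduced in a few lines of NumPy (a sketch, using the same standard basis vector as in the post): forming ##\vec{e}\vec{e}^T## as a column times a row gives exactly that matrix, which keeps only the first component of any vector.

```python
import numpy as np

# e is the first standard basis vector, written as a column vector.
e = np.array([[1.0], [0.0], [0.0]])
P = e @ e.T            # 3x3 matrix with a single 1 at position (0, 0)

v = np.array([[3.0], [4.0], [5.0]])
print(P @ v)           # keeps only the first component: (3, 0, 0)^T
```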
 
  • #6
Robin04 said:
If we express the projection operator with vectors, we get ##\hat{P}\vec{v} = \vec{e}(\vec{e}\cdot\vec{v})##, which means that we project ##\vec{v}## onto ##\vec{e}##. In components, ##(\hat{P}\vec{v})_k = e_k \sum_{l} e_lv_l = \sum_l (e_ke_l)v_l##. In my class we said that the matrix for the projection operator is ##P_{kl}=e_ke_l##, so ##(\hat{P}\vec{v})_k=\sum_l P_{kl} v_l##. But isn't ##e_ke_l## equal to ##\delta_{kl}##?
Are you projecting onto the axes? You may project onto lines, planes, etc., and not necessarily orthogonally.
 
  • #7
eys_physics said:
I believe it should be ##P=\vec{e}\vec{e}^T##, where ##\vec{e}## is a basis vector, e.g.
$$\vec{e}=\begin{pmatrix}1 \\ 0 \\ 0 \end{pmatrix}, \quad \vec{e}^T=\begin{pmatrix}1 & 0 & 0 \end{pmatrix} \rightarrow P=\begin{pmatrix}1 & 0 & 0 \\ 0 & 0 & 0 \\ 0 & 0 & 0 \end{pmatrix}.$$
Yes, that's what I missed, thank you! We also called this the dyadic product.

fresh_42 said:
... which I think it exactly is ...
Not sure what you mean, as you haven't said what the ##e_k## are, or what ##\vec{e}## should be. You have a general formula plus a calculation with coordinates, without saying what your vectors are with respect to this basis. So there is necessarily guesswork going on.

We project onto a line given by the vector ##\vec{e}## and ##e_k## are its components.

WWGD said:
Are you projecting onto the axes? You may project onto lines, planes, etc. and not necessarily orthogonally .

Yes, we learned those too, I just had some trouble understanding this particular case.
 
  • #8
Robin04 said:
We also called this the dyadic product.
A dyadic (outer) product is the tensor product of two vectors: ##u\otimes v##. In coordinates you obtain it as the matrix product of a column with a row: ##u \cdot v^\tau.## It is necessarily a matrix of rank one.
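The rank-one claim is easy to verify numerically (a sketch with made-up vectors): every column of the outer product is a scalar multiple of ##u##, so the matrix has rank one.

```python
import numpy as np

# Dyadic (outer) product of two example vectors: D_kl = u_k * w_l.
u = np.array([1.0, 2.0, 3.0])
w = np.array([4.0, 5.0, 6.0])
D = np.outer(u, w)

# Every column of D is a multiple of u, so the rank is one.
print(np.linalg.matrix_rank(D))   # → 1
```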
 

What is a projection matrix?

A projection matrix is a square matrix that, when multiplied with a vector, projects the vector onto a subspace. It essentially compresses the vector onto a lower-dimensional subspace.

How is a projection matrix expressed?

A projection matrix is typically expressed as ##P = A(A^TA)^{-1}A^T##, where ##A## is a matrix whose columns are the basis vectors of the subspace.
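This general formula can be sketched in NumPy as well (the matrix ##A## below is an arbitrary example whose columns span a plane in ##\mathbb{R}^3##): the residual ##\vec{v} - P\vec{v}## should be orthogonal to the columns of ##A##.

```python
import numpy as np

# Orthogonal projection onto the column space of A: P = A (A^T A)^{-1} A^T.
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
P = A @ np.linalg.inv(A.T @ A) @ A.T

v = np.array([1.0, 2.0, 3.0])
p = P @ v
# The residual v - Pv is orthogonal to every column of A.
print(A.T @ (v - p))   # ≈ [0, 0]
```

For a single unit basis vector ##\vec{e}##, the matrix ##A## has one column and the formula reduces to ##P=\vec{e}\vec{e}^T## as in the thread.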

What is the purpose of a projection matrix?

The purpose of a projection matrix is to express the projection operator concretely in terms of matrix entries, so that applying the operator reduces to a matrix-vector multiplication. This simplifies calculations and makes the operator easier to analyze and manipulate.

What are the properties of a projection matrix?

A projection matrix is idempotent, meaning that multiplying it with itself gives the same matrix again. An orthogonal projection matrix is also symmetric, and its eigenvalues are either 0 or 1.
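These properties can be checked directly on a small example (a sketch: the orthogonal projection onto the span of an illustrative unit vector ##\vec{e}##):

```python
import numpy as np

# Orthogonal projection onto span{e} for a unit vector e.
e = np.array([0.6, 0.8])
P = np.outer(e, e)

print(np.allclose(P @ P, P))           # idempotent: True
print(np.allclose(P, P.T))             # symmetric: True
print(np.sort(np.linalg.eigvalsh(P)))  # eigenvalues approximately [0., 1.]
```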

How is a projection matrix used in real-world applications?

Projection matrices are widely used in computer graphics, particularly in 3D rendering, to project objects onto a 2D screen. They are also used in machine learning for dimensionality reduction and data compression.
