Projections and direct sum

  • #1
iJake

Homework Statement



Let ##V = \mathbb{R}^4##. Consider the following subspaces:
##V_1 = \{(x,y,z,t)\ : x = y = z\}, V_2=[(2,1,1,1)], V_3 =[(2,2,1,1)]##

And let ##V = M_n(\mathbb{k})##. Consider the following subspaces:

##V_1 = \{(a_{ij}) \in V : a_{ij} = 0,\forall i < j\}##

##V_2 = \{(a_{ij}) \in V : a_{ij} = 0,\forall i > j\}##

##V_3 = \{(a_{ij}) \in V: a_{ij} = 0, \forall i \neq j\}##

Prove that in both cases ##V = V_1 \oplus V_2 \oplus V_3##.

Find the projections associated to both decompositions.

Homework Equations



Projections are linear transformations satisfying ##P^2 = P##.

The Attempt at a Solution



For both parts it was fairly straightforward to prove the direct sum. To be explicit about my method, in the first case I took the basis of ##V_1## to be ##\{(1,1,1,0),(0,0,0,1)\}## (given by the definition of the subspace) and for ##V_2, V_3## the given vectors. I used these as columns, performed row operations to reach row echelon form, and found a pivot in every column, so the columns are linearly independent.

From there, the sum of the three subspaces clearly gives all of ##V##, and since the combined basis vectors are linearly independent, every vector decomposes uniquely (in particular each subspace meets the sum of the others only in ##\{0\}##). Therefore we have a direct sum in the first case.
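(As a quick numerical sanity check, here is a small NumPy sketch of that linear independence computation, assuming the definitional basis ##\{(1,1,1,0),(0,0,0,1)\}## for ##V_1##:)

```python
import numpy as np

# Sanity check: the chosen basis vectors of V_1, V_2, V_3 together span R^4.
vectors = np.array([
    [1, 1, 1, 0],   # basis vector of V_1 (x = y = z)
    [0, 0, 0, 1],   # basis vector of V_1
    [2, 1, 1, 1],   # spans V_2
    [2, 2, 1, 1],   # spans V_3
])

# Rank 4 means the four vectors are linearly independent, so the sum is direct.
assert np.linalg.matrix_rank(vectors) == 4
```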

In the second case, a basis of ##V_1## is given by the matrices with a single ##1## below the diagonal and ##0## everywhere else, a basis of ##V_2## by the matrices with a single ##1## above the diagonal, and a basis of ##V_3## by the matrices with a single ##1## on the diagonal.

Again the direct sum is clear.

Where assistance is needed

However, I do not know how to find the projections associated to these decompositions. I do know the definition of a projection. Is there some intuitive way to see what the projections would look like here or in similar cases, or do I need to apply the formula ##P_x = A(A^TA)^{-1}A^T##? If so, could I see an example from the first case?
 
  • #2
iJake said:

Homework Statement



Let ##V = \mathbb{R}^4##. Consider the following subspaces:
##V_1 = \{(x,y,z,t)\ : x = y = z\}, V_2=[(2,1,1,1)], V_3 =[(2,2,1,1)]##
...

Where assistance is needed

However, I do not know how to find the projections associated to these decompositions. I do know the definition of a projection. Is there some intuitive way to see what the projections would look like here or in similar cases, or do I need to apply the formula ##P_x = A(A^TA)^{-1}A^T##? If so, could I see an example from the first case?

You seem to have put a good deal of work in here. For the first problem, it should remind you a lot of diagonalizing a matrix, except all eigenvalues are 1, and you are partitioning that diagonalized matrix into 3 different matrices.
- - - -
Hopefully I'm reading this right... you want 3 projectors to tripartition your vector space. Note that it's common to interpret projections as bi-partitioning a vector space -- but if you like, you can use associativity to split the vector space into one subspace you want and a residual, then bipartition the residual into the 2 that you want, and 1 + 2 = 3 of course.
- - - -
For case one, ##\mathbf P_1## has rank two while ##\mathbf P_2## and ##\mathbf P_3## each have rank one. Note that there would be an easier and more satisfying approach if ##\mathbf v_2 \perp \mathbf v_3## -- this would allow symmetric projectors and a simpler construction. Anyway, you need two more linearly independent vectors ##\mathbf v_0## and ##\mathbf v_1## (spanning ##V_1##) to form the matrix below

##\mathbf V := \bigg[\begin{array}{c|c|c|c}
\mathbf v_0 & \mathbf v_1 & \mathbf v_{2} & \mathbf v_3\end{array}\bigg]##

compute

##\mathbf S := \mathbf V^{-1} =
\begin{bmatrix}
\tilde{ \mathbf s_0}^T \\
\tilde{ \mathbf s_1}^T \\
\tilde{ \mathbf s_2}^T \\
\tilde{ \mathbf s_3}^T
\end{bmatrix}##

so

## \mathbf I = \mathbf {V S}=
\mathbf v_0 \tilde{ \mathbf s_0}^T + \mathbf v_1 \tilde{ \mathbf s_1}^T + \mathbf v_2 \tilde{ \mathbf s_2}^T + \mathbf v_3 \tilde{ \mathbf s_3}^T = \Big( \mathbf v_0 \tilde{ \mathbf s_0}^T + \mathbf v_1 \tilde{ \mathbf s_1}^T\Big) + \Big(\mathbf v_2 \tilde{ \mathbf s_2}^T \Big) + \Big(\mathbf v_3 \tilde{ \mathbf s_3}^T\Big) = \mathbf P_1 + \mathbf P_2 + \mathbf P_3##
which should look familiar...
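For what it's worth, here is a small NumPy sketch of that construction for case one; the particular choice of ##\mathbf v_0## and ##\mathbf v_1## below is just one convenient basis of ##V_1##, and any basis would do:

```python
import numpy as np

# Basis of V_1 (vectors with x = y = z) together with the given spanning
# vectors of V_2 and V_3 -- four linearly independent vectors in R^4.
v0 = np.array([1., 1., 1., 0.])
v1 = np.array([0., 0., 0., 1.])
v2 = np.array([2., 1., 1., 1.])
v3 = np.array([2., 2., 1., 1.])

V = np.column_stack([v0, v1, v2, v3])   # columns are the chosen basis
S = np.linalg.inv(V)                    # rows of S are the dual vectors s_i^T

# Group the rank-one pieces of I = V S by subspace.
P1 = np.outer(v0, S[0]) + np.outer(v1, S[1])   # projector onto V_1
P2 = np.outer(v2, S[2])                        # projector onto V_2
P3 = np.outer(v3, S[3])                        # projector onto V_3

# Checks: idempotence, mutual annihilation, and summing to the identity.
for P in (P1, P2, P3):
    assert np.allclose(P @ P, P)
assert np.allclose(P1 @ P2, 0) and np.allclose(P2 @ P3, 0) and np.allclose(P1 @ P3, 0)
assert np.allclose(P1 + P2 + P3, np.eye(4))
```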
- - - -
Now, your second problem seems to be treating matrices as a vector space and partitioning them into strictly upper triangular, strictly lower triangular, and diagonal parts. The fact that the first two are nilpotent holds some clues, I think, though I've never given a problem of this sort too much thought. It feels like Hadamard product territory to me, but that is not a linear transformation.

- - - -
edit: you have another recent thread on projections. I have some ideas there too, though basically I'd derive everything via the minimal polynomial implied by ##P^2 = P##.
 
  • #3
Hi, StoneTemplePython. Thanks so much for your help. (I think) I eventually figured these problems out with a little help.

In your post you bring artillery into play that I don't quite have yet. In my course, we've covered projections as a build-up to diagonalization, before inner product spaces or even the definition of an orthogonal complement, though I do have loose ideas of those concepts.

I eventually reached the conclusion that the projection onto each subspace maps the basis vectors of that subspace to themselves, and the other basis vectors to the zero vector. I can be more specific if necessary. I also was able to figure out the question posed in my other thread.

However, I have another (also very basic) question about projections. If you don't mind, rather than clutter this thread with another question, I could make another post for it or send you a PM (I'm not sure if it deserves its own thread). If you're alright with a PM, I'll send it after a response to this post.
 
  • #4
iJake said:
Hi, StoneTemplePython. Thanks so much for your help. (I think) I eventually figured these problems out with a little help.

In your post you bring artillery into play that I don't quite have yet. In my course, we've covered projections as a build-up to diagonalization, before inner product spaces or even the definition of an orthogonal complement, though I do have loose ideas of those concepts.

To be clear, nothing in my post actually uses diagonalization or inner products or orthogonal anything (complements or otherwise). I did mention that ##\mathbf v_2 \perp \mathbf v_3## could allow a nicer result (inner product related), but your problem doesn't have that, so it's a side point. (There are some other considerations too, but it starts with having ##\mathbf v_2 \perp \mathbf v_3##.)

Also, I only mentioned diagonalization because, if you already know it from other courses, it can give you an intuitive feel for what's going on mechanically. The suggested intuitive feel is optional!

The only things needed are (a) the outer product / rank-one update interpretation of matrix-matrix multiplication and (b) being able to figure out how to find linearly independent vectors to complete a basis, given that you already have ##\mathbf v_2## and ##\mathbf v_3##. This is important but not something that everyone knows. I did a post on it a while back:

https://www.physicsforums.com/threads/span-of-a-set-of-vectors.930301/#post-5874737
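For (a), here is a minimal NumPy check of that outer product view, with arbitrary matrices:

```python
import numpy as np

# Outer product / rank-one update view of matrix multiplication:
# A @ B equals the sum over k of (column k of A) times (row k of B).
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
B = rng.standard_normal((4, 4))

rank_one_sum = sum(np.outer(A[:, k], B[k, :]) for k in range(4))
assert np.allclose(A @ B, rank_one_sum)
```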
- - - -
edit: one thing that I couldn't quite tell from your post was the requirement to "find the projections associated...". Did that mean you could just write an abstract form for what is happening, or did you need to be a bit more concrete and, say, compute the actual matrix for part 1 -- i.e. actually find and compute a qualifying ##\mathbf v_0## and ##\mathbf v_1##?
- - - -
I suppose I did mention the minimal polynomial in my edit. That would typically be a more advanced topic, though it's very nice and easy in the special case of idempotence.
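(Concretely: ##P^2 = P## says ##P## is annihilated by ##x^2 - x = x(x-1)##, so the minimal polynomial of ##P## divides ##x(x-1)##; it splits into distinct linear factors, so ##P## is diagonalizable and its eigenvalues lie in ##\{0, 1\}##.)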
iJake said:
I eventually reached the conclusion that the projection onto each subspace maps the basis vectors of that subspace to themselves, and the other basis vectors to the zero vector. I can be more specific if necessary. I also was able to figure out the question posed in my other thread.

Excellent. I am curious about your answer to part two of your question here, as I haven't thought much about projections acting on matrices as a vector space...
iJake said:
However, I have another (also very basic) question about projections. If you don't mind, rather than clutter this thread with another question, I could make another post for it or send you a PM (I'm not sure if it deserves its own thread). If you're alright with a PM, I'll send it after a response to this post.

A PM is ok, but if there is any meat to the question, it may be better to post in the HW or Lin Algebra forum so that other people can benefit from it as well.
 
  • #5
Regarding your edit, I don't know how to calculate such a matrix. Could you explain it to me?

Regarding my answer to the second part, I just took for each subspace the basis of matrices with a single ##1## entry (i.e. in the strictly upper triangular, strictly lower triangular, and diagonal positions) and said the projection for each subspace maps those basis matrices to themselves and the basis matrices of the other two subspaces to the zero matrix. Maybe that was too abstract, in hindsight?
 
  • #6
iJake said:
Regarding your edit, I don't know how to calculate such a matrix. Could you explain it to me?
Which matrix are you not sure how to calculate -- this one?

##\mathbf V := \bigg[\begin{array}{c|c|c|c}
\mathbf v_0 & \mathbf v_1 & \mathbf v_{2} & \mathbf v_3\end{array}\bigg]
##

If so, then take a look at the blue link in post number 4, just above that edit.

##\mathbf S## is just inverting ##\mathbf V##, so it's conceptually easy, though I wouldn't bother doing 4x4 inverses by hand.

iJake said:
Regarding my answer to the second part, I just took for each subspace the basis of matrices with a single ##1## entry (i.e. in the strictly upper triangular, strictly lower triangular, and diagonal positions) and said the projection for each subspace maps those basis matrices to themselves and the basis matrices of the other two subspaces to the zero matrix. Maybe that was too abstract, in hindsight?

It really depends on what your teacher / text wants you to do. If you are interested in Matrix Theory / Matrix Analysis, optimization, computing / modelling problems elsewhere with matrices, etc., then it generally pays dividends to de-abstract this a bit and find (at least) one matrix / concrete mapping that does what you've said.

- - - -
Nitpick: note that strictly triangular is how I read the first two originally. But shouldn't the inequalities be non-strict in the setup of your problem? E.g. your original post says
##V_1 = \{(a_{ij}) \in V : a_{ij} = 0,\forall i < j\}##
but shouldn't it be
##V_1 = \{(a_{ij}) \in V : a_{ij} = 0,\forall i \leq j\}##

As written, it seems like all three subspaces contain the diagonal matrices, and hence the sum is not direct. Maybe I'm over-reading this though.

- - - -
For better or worse, the direct way that I read part 2 of your question is: (using ##n \times n## matrices, but setting ##n=4## for illustration)

##\mathbf A = \begin{bmatrix}
a_{1,1} & a_{1,2} & a_{1,3} & a_{1,4}\\
a_{2,1} & a_{2,2} & a_{2,3} & a_{2,4}\\
a_{3,1} & a_{3,2} & a_{3,3} & a_{3,4}\\
a_{4,1} & a_{4,2} & a_{4,3} & a_{4,4}
\end{bmatrix}##

##\mathbf P_1 =
\left[\begin{matrix}0 & 0 & 0 & 0\\1 & 0 & 0 & 0\\1 & 1 & 0 & 0\\1 & 1 & 1 & 0\end{matrix}\right]##

##\mathbf P_2 =
\left[\begin{matrix}0 & 1 & 1 & 1\\0 & 0 & 1 & 1\\0 & 0 & 0 & 1\\0 & 0 & 0 & 0\end{matrix}\right]##

##\mathbf P_3 =
\left[\begin{matrix}1 & 0 & 0 & 0\\0 & 1 & 0 & 0\\0 & 0 & 1 & 0\\0 & 0 & 0 & 1\end{matrix}\right]##

The operation being performed then is the Hadamard product (sometimes called the Schur product), which is element-wise multiplication, denoted by ##\circ##.

You can then verify that, where ##\mathbf J## is the matrix filled entirely with ones (the identity element with respect to the Hadamard product), you have

## \mathbf A = \mathbf J \circ \mathbf A = \big( \mathbf P_1 + \mathbf P_2 + \mathbf P_3\big) \circ \mathbf A = \mathbf P_1 \circ \mathbf A + \mathbf P_2 \circ \mathbf A + \mathbf P_3 \circ \mathbf A ##

You can also verify, for example, that ##\mathbf P_1 \circ \mathbf P_1 = \mathbf P_1##, ditto for the others, which is your idempotence relation.
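A quick NumPy sketch of this, with an arbitrary test matrix ##\mathbf A## (element-wise * in NumPy plays the role of ##\circ##):

```python
import numpy as np

n = 4
P1 = np.tril(np.ones((n, n)), k=-1)   # ones strictly below the diagonal
P2 = np.triu(np.ones((n, n)), k=1)    # ones strictly above the diagonal
P3 = np.eye(n)                        # ones on the diagonal

A = np.arange(1, n * n + 1, dtype=float).reshape(n, n)   # arbitrary test matrix

# P1 + P2 + P3 = J, the all-ones matrix, which is the Hadamard identity.
assert np.allclose(P1 + P2 + P3, np.ones((n, n)))
# The three Hadamard products recover the strictly lower, strictly upper,
# and diagonal parts of A, and they sum back to A.
assert np.allclose(A, P1 * A + P2 * A + P3 * A)
# Idempotence under the Hadamard product.
for P in (P1, P2, P3):
    assert np.allclose(P * P, P)
```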

- - -
Note that these do not behave the way you might expect, though. To give a nice example that does use some more advanced artillery, consider a real symmetric positive definite ##\mathbf A## that is not a diagonal matrix. Then you have

##\det\big(\mathbf A\big) = \det\big(\mathbf P_3\big)\det\big(\mathbf A\big) \lt \det\big(\mathbf P_3 \circ \mathbf A\big) = \prod_{k=1}^n a_{k,k}##

which is Hadamard's inequality (note ##\det(\mathbf P_3) = 1## and ##\mathbf P_3 \circ \mathbf A## is just the diagonal part of ##\mathbf A##).
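And a small numerical illustration of the inequality, with one arbitrarily chosen symmetric positive definite, non-diagonal ##\mathbf A##:

```python
import numpy as np

# An arbitrary real symmetric positive definite matrix that is not diagonal.
A = np.array([[2., 1., 0.],
              [1., 3., 1.],
              [0., 1., 2.]])
assert np.all(np.linalg.eigvalsh(A) > 0)        # positive definite

# Hadamard's inequality, strict since A is not diagonal:
# det(A) < product of the diagonal entries = det(P3 o A).
assert np.linalg.det(A) < np.prod(np.diag(A))   # 8 < 12 here
```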
- - - -
This is a curious little niche of matrix theory. Maybe it is better not to dwell too much here.
 

What is a projection?

A projection is a linear operation that maps vectors onto a subspace: it sends each vector to its component lying in that subspace. Applying a projection twice gives the same result as applying it once, i.e. ##P^2 = P##.

What is the purpose of a projection?

Projections are useful in many areas of mathematics and science, including linear algebra and geometry. They can be used to simplify complex problems and make calculations easier. In particular, they are important in understanding and analyzing vector spaces and their properties.

What is a direct sum?

In mathematics, a direct sum is a way of combining two or more vector spaces (or of decomposing a vector space into subspaces) so that every vector can be written in exactly one way as a sum of components, one from each piece. It is denoted by a plus sign inside a circle, such as V ⊕ W.

How is a projection related to a direct sum?

A projection is closely related to a direct sum. A decomposition V = V₁ ⊕ V₂ ⊕ ... determines a projection onto each summand: every vector splits uniquely into components from the subspaces, and the projection associated with a subspace picks out the component lying in that subspace and sends the remaining components to zero.

What are some real-world applications of projections and direct sums?

Projections and direct sums have many practical applications, including image and signal processing, data analysis, and computer graphics. For example, in image processing, projections can be used to remove noise or enhance specific features in an image. Direct sums are also used in physics and engineering to model complex systems and analyze the behavior of different components.
