How can an orthonormal basis for the span of four vectors be found?

soopo

Homework Statement


How can I find an orthonormal basis for the span of four vectors?

The vectors are:
(0, 3, 0, 4), (4, 0, 3, 0), (4, 3, 3, 4) and (4, -3, 3, -4).

The Attempt at a Solution


I am not sure whether I should use the Gram-Schmidt process or the process of finding
eigenvalues and eigenvectors and then normalizing.

I would like to use the latter method. However, I am not sure whether that
process works here.
 
What's the purpose of Gram-Schmidt again? Perhaps you should look it up on Wikipedia...
 
phreak said:
What's the purpose of Gram-Schmidt again? Perhaps you should look it up on Wikipedia...

It is a method for orthogonalizing a set of vectors in an inner product space.
However, the method involves many steps, which makes it error-prone, so I am not sure it is the best choice here.

Is it possible to solve the problem with eigenvectors, and so determine the orthonormal basis?
 
Put them into a matrix and row reduce it. No eigenvectors involved. It's simpler than that. Once you've found a spanning basis, then use Gram-Schmidt.
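For anyone who wants to check the row reduction by machine, here is a minimal sketch in Python with sympy (the library choice is just an illustration, not part of the exercise):

```python
from sympy import Matrix

# The four given vectors entered as ROWS of a matrix.
A = Matrix([
    [0, 3, 0, 4],
    [4, 0, 3, 0],
    [4, 3, 3, 4],
    [4, -3, 3, -4],
])

R, pivots = A.rref()   # reduced row echelon form and pivot columns
print(R)               # two nonzero rows -> the span is 2-dimensional
print(pivots)          # (0, 1)
```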
 
IMO, Gram-Schmidt is much simpler than what you propose with eigenvectors.
 
Dick said:
Put them into a matrix and row reduce it. No eigenvectors involved. It's simpler than that. Once you've found a spanning basis, then use Gram-Schmidt.

I have got:

(1 0 1 -1
0 1 1 1
0 0 0 0
0 0 0 0)

This seems to mean that I have to apply the Gram-Schmidt process to only two vectors:

Select $v_1 = (1, 0, 1, -1)$. Then

$$v_2 = w_2 - \frac{w_2 \cdot v_1}{v_1 \cdot v_1}\, v_1 = w_2 - 0 = w_2,$$

where $w_2 = (0, 1, 1, 1)$.

Then, normalizing:

$$u_1 = \frac{(1,\ 0,\ 1,\ -1)}{\sqrt{3}}, \qquad u_2 = \frac{(0,\ 1,\ 1,\ 1)}{\sqrt{3}}.$$

I can't believe that this is the answer. There may be some mistakes.
 
Well, there's an easy way to check it. Take the dot product of each presumed orthonormal vector with every other presumed orthonormal vector. All dot products should be zero. If this is true, then you're done. In your case, you only have two vectors, so just calculate $\langle u_1,u_2\rangle$.
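That check is quick to run numerically; a minimal sketch with numpy (the library is assumed here purely for illustration):

```python
import numpy as np

u1 = np.array([1, 0, 1, -1]) / np.sqrt(3)
u2 = np.array([0, 1, 1, 1]) / np.sqrt(3)

print(np.dot(u1, u2))                  # 0.0 -> orthogonal
print(np.dot(u1, u1), np.dot(u2, u2))  # 1.0 1.0 -> unit length
```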
 
phreak said:
Well, there's an easy way to check it. Take the dot product of each presumed orthonormal vector with every other presumed orthonormal vector. All dot products should be zero. If this is true, then you're done. In your case, you only have two vectors, so just calculate $\langle u_1,u_2\rangle$.

We're done!
All dot products are zero.

Thank you!
 
Yeah, fine. But can you find a linear combination of (1,0,1,-1) and (0,1,1,1) that makes (0,3,0,4)? I don't think so. I think you row reduced it wrong.
 
Dick said:
Yeah, fine. But can you find a linear combination of (1,0,1,-1) and (0,1,1,1) that makes (0,3,0,4)? I don't think so. I think you row reduced it wrong.

I cannot find that.
Hmm... I double checked the initial values in my calculator, and they are correct. I entered the initial vectors into my calculator with one column per vector. That could be the mistake. However, I didn't think each vector should go in as a row.
 
I can't speak to calculator problems, but I do know that if you claim a*(1,0,1,-1) + b*(0,1,1,1) spans the space, and (0,3,0,4) is in it, then b=3 and a=0. And that doesn't work.
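One way to test span membership numerically is a least-squares solve; the nonzero residual below confirms the point (numpy assumed, for illustration only):

```python
import numpy as np

# Candidate basis vectors as columns; solve B @ [a, b] ~= target in the
# least-squares sense and look at the residual.
B = np.column_stack(([1, 0, 1, -1], [0, 1, 1, 1]))
target = np.array([0, 3, 0, 4])

coeffs, residual, rank, _ = np.linalg.lstsq(B, target, rcond=None)
print(coeffs)    # best-fit (a, b)
print(residual)  # nonzero -> (0,3,0,4) is NOT in the span of these two vectors
```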
 
Dick said:
I can't speak to calculator problems, but I do know that if you claim a*(1,0,1,-1) + b*(0,1,1,1) spans the space, and (0,3,0,4) is in it, then b=3 and a=0. And that doesn't work.

I get the same row-reduced matrix by hand as well.

Do you mean that it is not possible to express the basis vectors as linear combinations of the initial vectors?

What does it mean if we cannot express them as linear combinations of the initial vectors?
 
soopo said:
I get the same row-reduced matrix by hand as well.

Do you mean that it is not possible to express the basis vectors as linear combinations of the initial vectors?

What does it mean if we cannot express them as linear combinations of the initial vectors?

I think what you are doing is putting the vectors in as columns, doing row operations, then pulling the vectors out as rows. Don't do that. Put them in as ROWS, do row operations, then pull them out as rows.
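The difference between the two conventions is easy to demonstrate; a short sympy sketch (the library is assumed, for illustration) reproduces both results from this thread:

```python
from sympy import Matrix

vectors = [(0, 3, 0, 4), (4, 0, 3, 0), (4, 3, 3, 4), (4, -3, 3, -4)]

# Vectors entered as ROWS: the nonzero RREF rows still span the same subspace.
print(Matrix(vectors).rref()[0])    # rows (1, 0, 3/4, 0) and (0, 1, 0, 4/3)

# Vectors entered as COLUMNS: the RREF rows are NOT in the original span,
# which reproduces the (1, 0, 1, -1), (0, 1, 1, 1) result from above.
print(Matrix(vectors).T.rref()[0])
```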
 
Dick said:
I think what you are doing is putting the vectors in as columns, doing row operations, then pulling the vectors out as rows. Don't do that. Put them in as ROWS, do row operations, then pull them out as rows.

I did it the way you say.

The answer changes:

The matrix reduces to:
(1 0 3/4 0
0 1 0 4/3
0 0 0 0
0 0 0 0)

Let's use the Gram-Schmidt process again:

Select $v_1 = (1, 0, 3/4, 0)$. Then

$$v_2 = w_2 - \frac{w_2 \cdot v_1}{v_1 \cdot v_1}\, v_1 = w_2 - 0 = w_2,$$

where $w_2 = (0, 1, 0, 4/3)$.

Then, normalizing:

$$u_1 = \tfrac{4}{5}\,(1,\ 0,\ 3/4,\ 0), \qquad u_2 = \tfrac{3}{5}\,(0,\ 1,\ 0,\ 4/3).$$

This problem raised a few questions.

Dick previously made an elegant error check. Can I compare these results to the initial vectors to make a similar check?

Does it matter how we put the initial vectors into the matrix? That is, can we enter the vectors either as rows or as columns?
 
I think you have a typo, u2=(3/5)[0,1,0,4/3], right? The purpose of the initial row reduction is to reduce the initial four vectors to the two vectors which are linearly independent and span the whole subspace. That gives you a smaller set to apply Gram-Schmidt to. In doing this always put the vectors in and take them out as rows.
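Both of the questions above can be checked numerically. A minimal numpy sketch (the library choice is an assumption) verifies orthonormality and reconstructs every initial vector from its projections onto $u_1$ and $u_2$:

```python
import numpy as np

u1 = (4/5) * np.array([1, 0, 3/4, 0])
u2 = (3/5) * np.array([0, 1, 0, 4/3])

# Orthonormality check.
print(np.dot(u1, u2), np.dot(u1, u1), np.dot(u2, u2))   # 0.0 1.0 1.0

# Every initial vector should be recovered from its projections onto u1, u2.
for w in [(0, 3, 0, 4), (4, 0, 3, 0), (4, 3, 3, 4), (4, -3, 3, -4)]:
    w = np.array(w, dtype=float)
    recon = np.dot(w, u1) * u1 + np.dot(w, u2) * u2
    print(np.allclose(recon, w))    # True four times
```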
 
Dick said:
The purpose of the initial row reduction is to reduce the initial four vectors to the two vectors which are linearly independent and span the whole subspace. That gives you a smaller set to apply Gram-Schmidt to. In doing this always put the vectors in and take them out as rows.

Can we always do the row reduction?

For example, today I had a similar problem with the following matrix:
(1 2 0 0
2 1 0 0
0 0 1 2
0 0 2 1)

The question asked me to orthogonally diagonalize the symmetric matrix $A$ above
and to calculate $P^T A P$.

I row reduced it to $I_4$. This made me very unsure.
I thought that it cannot be that easy: normalize, put the vectors in $P$, and
calculate $P^T A P$ by putting the eigenvalues in.

My answer for $P^T A P$ was $I_4$.

I started to think that I should have first subtracted $\lambda$ along the diagonal,
solved for the eigenvalues, and then solved for the eigenvectors.

At home, I got different answers from the two methods. Now I am not sure
which method is correct.

Please, let me know your opinion.
 
That's a completely different problem. No, you can't row reduce every matrix and expect to get the same answer to every problem. The row reduction in the previous problem had NOTHING to do with any matrix. It was just to get a minimal spanning set.
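To illustrate the eigenvector route for the second problem, here is a minimal numpy sketch (the library choice is an assumption); note that $P^T A P$ comes out as the diagonal matrix of eigenvalues, not the identity:

```python
import numpy as np

A = np.array([[1, 2, 0, 0],
              [2, 1, 0, 0],
              [0, 0, 1, 2],
              [0, 0, 2, 1]], dtype=float)

# eigh is for symmetric matrices and returns orthonormal eigenvectors,
# so P is already orthogonal and no extra normalization is needed.
eigvals, P = np.linalg.eigh(A)
D = P.T @ A @ P
print(eigvals)           # [-1. -1.  3.  3.]
print(np.round(D, 10))   # diagonal matrix of the eigenvalues, not the identity
```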
 
Dick said:
That's a completely different problem. No, you can't row reduce every matrix and expect to get the same answer to every problem. The row reduction in the previous problem had NOTHING to do with any matrix. It was just to get a minimal spanning set.

So your point is: when we need to find a basis, we want the minimum number of vectors by which we can express every other vector in the initial space. For instance, we expressed the four vectors at the start of this thread in terms of two vectors.

Your other point is: row reduction is only used at the start, to find that basis. For example, in problems about finding eigenvalues and eigenvectors, we put the lambdas in and do not row reduce.

Thank you! You really have put me on the right track :)
 
Your original set of vectors is not linearly independent. The first and second add up to the third vector, and the second minus the first makes the fourth. This is why the row-reduced matrix has only two nonzero rows. Gram-Schmidt only works on a linearly independent list. You can use it on the first two vectors (with or without row reducing; you'll get the same result). If you try to include the third and fourth vectors, the process produces the zero vector, which cannot be normalized (you would be dividing by zero).
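To see this concretely, a short numpy sketch (assumed tooling, for illustration only) runs Gram-Schmidt on the first three vectors:

```python
import numpy as np

v1 = np.array([0, 3, 0, 4], dtype=float)
v2 = np.array([4, 0, 3, 0], dtype=float)
v3 = np.array([4, 3, 3, 4], dtype=float)   # = v1 + v2, linearly dependent

u1 = v1 / np.linalg.norm(v1)
w2 = v2 - np.dot(v2, u1) * u1
u2 = w2 / np.linalg.norm(w2)

# Projecting the dependent vector away leaves nothing to normalize.
w3 = v3 - np.dot(v3, u1) * u1 - np.dot(v3, u2) * u2
print(w3)   # the zero vector: normalizing it would divide by zero
```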
 
