How can A-orthogonal vector sets be determined using positive definite matrices?

In summary, the conversation discusses how to find a set of A-orthogonal vectors given a positive definite matrix A. Two approaches come up: using a Cholesky decomposition of A, and projecting one vector onto another and subtracting off the projection so that the remainder is orthogonal. It is also noted that the book shows how to do this with the projection method.
  • #1
eckiller
This is a follow-up to a post I made a couple of days ago.

Basically, I needed to find a set of A-orthogonal vectors given that A is positive definite.

Is the following satisfactory?

Pick the standard basis B = {e1, ..., en}.

Then consider ei' A ej such that i != j.

Since A is positive definite, A can be factored as A = L'L.

Then ei' A ej = (ei' L')(L ej).

However, for all ei and ej s.t. i != j,

(ei' L')(L ej) = 0,

so ei' A ej = 0,

i.e. <ei, Aej> = 0.

So I have determined an A-orthogonal set.
 
  • #2
I made a mistake here:

(ei' L')(L ej) = 0


But I still need help.
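For reference, ei' A ej is just the (i, j) entry of A, so the standard basis is A-orthogonal only when A happens to be diagonal. A quick numerical check, sketched here in numpy with an arbitrary 2x2 positive definite matrix (not taken from the thread):

    import numpy as np

    # Arbitrary 2x2 positive definite matrix, chosen only for illustration.
    A = np.array([[4.0, 1.0],
                  [1.0, 3.0]])

    e1 = np.array([1.0, 0.0])
    e2 = np.array([0.0, 1.0])

    # e1' A e2 picks out the off-diagonal entry A[0, 1] = 1, which is not 0,
    # so the standard basis is not A-orthogonal for this A.
    print(e1 @ A @ e2)   # 1.0
    print(A[0, 1])       # 1.0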
 
  • #3
The idea is to project the first vector onto the second and subtract off the projection, so that what remains is orthogonal. So you have to review how to project using a dot product.
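In code, that projection step might look like the following sketch (numpy, with an arbitrary positive definite matrix and starting vectors chosen only for illustration, none of which appear in the thread):

    import numpy as np

    A = np.array([[4.0, 1.0],
                  [1.0, 3.0]])   # arbitrary positive definite example

    v1 = np.array([1.0, 0.0])
    v2 = np.array([0.0, 1.0])

    # A-inner product <x, y>_A = x' A y
    def a_inner(x, y):
        return x @ A @ y

    # Subtract from v2 its A-projection onto v1; what remains is A-orthogonal to v1.
    w2 = v2 - (a_inner(v1, v2) / a_inner(v1, v1)) * v1

    print(a_inner(v1, w2))   # ~0 up to rounding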
 
  • #4
Thank you for your reply. However, I am supposed to show this with a Cholesky decomposition. The book shows how to do it with the projection method.

I use ' for transpose.

ei is the ith vector of standard basis. i != j

ei' ej = 0

ei' (L inv(L)) ej = 0

... ?

ei' L L' ej = 0

ei' A ej = 0

<ei, Aej> = 0

I'm having trouble filling in "?"
 
  • #5
Never mind, I figured it out... I was trying to prove the wrong thing; in fact, I was trying to prove something that is not true.
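For anyone who lands on this thread later: one standard way a Cholesky factorization does yield an A-orthogonal set (a sketch only, not necessarily the argument the book has in mind) is to write A = L L' with L lower triangular and take the columns of inv(L'); then V' A V = inv(L) A inv(L') = I, so distinct columns are A-orthogonal. In numpy:

    import numpy as np

    A = np.array([[4.0, 1.0],
                  [1.0, 3.0]])     # arbitrary positive definite example

    L = np.linalg.cholesky(A)      # A = L @ L.T with L lower triangular
    V = np.linalg.inv(L.T)         # columns of V are the candidate vectors

    # V' A V = inv(L) (L L') inv(L') = I, so distinct columns of V are A-orthogonal.
    print(np.round(V.T @ A @ V, 10))   # identity matrix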
 

What is an A-orthogonal vector set?

An A-orthogonal vector set is a collection of vectors in a vector space in which each pair of distinct vectors is orthogonal with respect to a given matrix A. Concretely, for any two distinct vectors x and y in the set, x' A y = 0.

What is the significance of having an A-orthogonal vector set?

Having an A-orthogonal vector set simplifies computation and analysis in linear algebra: expanding a vector in an A-orthogonal basis decouples the expansion coefficients, which makes it easier to work with quadratic forms involving A and to solve systems of equations in which A appears.

How is an A-orthogonal vector set different from an orthogonal vector set?

An A-orthogonal vector set is similar to an orthogonal vector set; the difference is that the vectors are orthogonal with respect to the inner product defined by A, <x, y>_A = x' A y, rather than with respect to the standard dot product (which is the special case A = I).

Can an A-orthogonal vector set be linearly dependent?

Not if A is positive definite and the vectors are all nonzero: A-orthogonality is ordinary orthogonality with respect to the inner product <x, y>_A = x' A y, and nonzero mutually orthogonal vectors are always linearly independent. Dependence is only possible if the set contains the zero vector or if A is merely positive semidefinite rather than positive definite.

How do you find an A-orthogonal vector set?

To find an A-orthogonal vector set, one can use the Gram-Schmidt process with the standard dot product replaced by the A-inner product <x, y>_A = x' A y: take a set of linearly independent vectors and, at each step, subtract off the A-projections onto the vectors already produced. The resulting set is A-orthogonal, as sketched below.
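A minimal numpy sketch of this A-weighted Gram-Schmidt, using an arbitrary positive definite matrix and the standard basis as the starting set (both chosen only for illustration):

    import numpy as np

    def a_orthogonalize(A, basis):
        """Gram-Schmidt using the inner product <x, y>_A = x' A y."""
        out = []
        for v in basis:
            w = np.array(v, dtype=float)
            for u in out:
                # Subtract the A-projection of w onto each earlier vector u.
                w -= (u @ A @ w) / (u @ A @ u) * u
            out.append(w)
        return out

    A = np.array([[4.0, 1.0],
                  [1.0, 3.0]])                  # arbitrary positive definite example
    vectors = a_orthogonalize(A, [[1.0, 0.0], [0.0, 1.0]])

    # Off-diagonal entries of V' A V should be ~0.
    V = np.column_stack(vectors)
    print(np.round(V.T @ A @ V, 10))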
