Can linear algebra convert the minimum column vector in a matrix?

  • Context: Undergrad 
  • Thread starter: tobinare
  • Tags: Column, Minimum, Vector

Discussion Overview

The discussion revolves around whether linear algebra techniques can rescale a matrix's column vectors so that each has unit length (norm 1). Participants explore methods related to eigenvectors and matrix normalization, particularly in the context of regression analysis.

Discussion Character

  • Exploratory
  • Technical explanation
  • Mathematical reasoning

Main Points Raised

  • One participant inquires about methods to convert an eigenmatrix to ensure all column vectors are minimized, using examples of two matrices with differing column vector lengths.
  • Another participant suggests dividing the matrix by its determinant to achieve a determinant of 1, which may lead to an orthonormal basis if the matrix is orthogonal, but expresses uncertainty about achieving the desired column lengths if the matrix is not orthogonal.
  • A later reply proposes the idea of normalizing vectors by dividing each vector by its norm, providing calculations for the norms of the column vectors in Matrix B and demonstrating how this normalization could yield vectors closer to those in Matrix A.
  • There is a mention of the fact that there are infinitely many linearly dependent (parallel) eigenvectors, suggesting that normalizing could be a valid approach if the specific application allows for such flexibility.
  • One participant asks for clarification on the purpose of the transformation, indicating that the relationship between the vectors and the matrix may influence the approach taken.

Areas of Agreement / Disagreement

Participants express differing views on the methods to achieve the desired transformation of the matrix. While some propose normalization as a solution, others raise questions about the implications and requirements of the transformation, indicating that the discussion remains unresolved.

Contextual Notes

Participants note that the effectiveness of the proposed methods may depend on the properties of the matrices involved, such as orthogonality and the specific relationships between the vectors and the matrices.

tobinare
I'm doing a particular form of regression that gives an eigenvector and a matrix for the solution. I'm using two matrix libraries to code the solution. One library yields the minimum-length column vector in each column; the other does not. Is there a method in linear algebra to convert the eigenmatrix to one where all the column vectors are minimized?

As an example, all the column vectors in Matrix A have length 1. In Matrix B, the 2nd column vector has length roughly 2, and I think I require it to be 1. Is this change possible? TIA


Vectors | Matrix - A
---------|-----------------
0.0365 | 0.27217 0.22176
0.0001 | -0.96225 0.97510

Vectors | Matrix - B
---------|-----------------
0.0365 | 0.2715 0.5026
0.0001 | -0.9624 1.9018
 
tobinare said:
I'm doing a particular form of regression that gives an eigenvector and a matrix for the solution. I'm using two matrix libraries to code the solution. One library yields the minimum-length column vector in each column; the other does not. Is there a method in linear algebra to convert the eigenmatrix to one where all the column vectors are minimized?

As an example, all the column vectors in Matrix A have length 1. In Matrix B, the 2nd column vector has length roughly 2, and I think I require it to be 1. Is this change possible? TIA

Vectors | Matrix - A
---------|-----------------
0.0365 | 0.27217 0.22176
0.0001 | -0.96225 0.97510

Vectors | Matrix - B
---------|-----------------
0.0365 | 0.2715 0.5026
0.0001 | -0.9624 1.9018

Hey tobinare and welcome to the forums.

What I would do is take your matrix and divide it by its determinant so that the resulting matrix has determinant 1.

This will transform your matrix to a complete orthonormal basis if the matrix itself has an orthogonal basis, but if it's not orthogonal I don't know for sure if you will get your requirement that the length of each column vector is approximately 1.

Also, one idea: if you are able to do a transformation between two bases, you can use a transformation that takes your basis and orthogonalizes it; from that you can then get the determinant and divide by it so that each of the basis vectors has length 1. Since the result is orthonormal, you get the properties of a rotation matrix.

What you can do then is place a constraint on your decomposition that the non-orthogonal part has column vectors of length one, and then extract the rest of the constraint on your matrix (i.e. the equality) so that the remaining factor takes that into account.

So I'd look into determinants, matrix norms, and matrix decompositions, including the Gram-Schmidt process, as well as using those decompositions and a change of basis with respect to orthonormal basis vectors, where you can guarantee that dividing out the determinant (a scalar divide) will always give you vectors of length 1.
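As a concrete sketch of the orthonormalization idea (my own illustration, not code from the thread; the helper name and the use of plain Python lists are choices I've made here), classical Gram-Schmidt applied to the columns of Matrix B yields columns that have length 1 and are mutually orthogonal:

```python
import math

def gram_schmidt(cols):
    """Orthonormalize a list of vectors via classical Gram-Schmidt."""
    basis = []
    for v in cols:
        w = list(v)
        # subtract the projection of v onto each vector already in the basis
        for q in basis:
            dot = sum(wi * qi for wi, qi in zip(w, q))
            w = [wi - dot * qi for wi, qi in zip(w, q)]
        # rescale the remainder to unit length
        norm = math.sqrt(sum(wi * wi for wi in w))
        basis.append([wi / norm for wi in w])
    return basis

# columns of Matrix B from the thread
q1, q2 = gram_schmidt([[0.2715, -0.9624], [0.5026, 1.9018]])
```

Note that Gram-Schmidt changes the direction of the second column, not just its length, so it is a stronger operation than simply rescaling each column; that may matter if the column directions carry meaning in your regression.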
 
Is normalizing vectors the answer to my question? Do I simply divide each vector by its norm? So in my case with Matrix B

Vectors | Matrix - B
---------|-----------------
0.0365 | 0.2715 0.5026
0.0001 | -0.9624 1.9018

The norm of the first column vector is
sqrt(0.2715^2 + (-0.9624)^2) = 1.0000
The norm of the second column vector is
sqrt(0.5026^2 + 1.9018^2) = 1.9671

Dividing each column by its norm gives
Vectors | Matrix - B
---------|-----------------
0.0365 | 0.2715/1.00 0.5026/1.9671
0.0001 | -0.9624/1.00 1.9018/1.9671

Vectors | Matrix - B
---------|-----------------
0.0365 | 0.2715 0.2555
0.0001 | -0.9624 0.9668

Which is close to
Vectors | Matrix - A
---------|-----------------
0.0365 | 0.27217 0.22176
0.0001 | -0.96225 0.97510
 
tobinare said:
Is normalizing vectors the answer to my question? Do I simply divide each vector by its norm? So in my case with Matrix B

Vectors | Matrix - B
---------|-----------------
0.0365 | 0.2715 0.5026
0.0001 | -0.9624 1.9018

The norm of the first column vector is
sqrt(0.2715^2 + (-0.9624)^2) = 1.0000
The norm of the second column vector is
sqrt(0.5026^2 + 1.9018^2) = 1.9671

Dividing each column by its norm gives
Vectors | Matrix - B
---------|-----------------
0.0365 | 0.2715/1.00 0.5026/1.9671
0.0001 | -0.9624/1.00 1.9018/1.9671

Vectors | Matrix - B
---------|-----------------
0.0365 | 0.2715 0.2555
0.0001 | -0.9624 0.9668

Which is close to
Vectors | Matrix - A
---------|-----------------
0.0365 | 0.27217 0.22176
0.0001 | -0.96225 0.97510

If these are just eigenvectors, then you might as well do that, since there are infinitely many linearly dependent (i.e. parallel) eigenvectors and one such vector will have length 1.

But it would be a good idea if you tell us what you are using this for. Is there a specific relationship between the vectors and the matrix that needs to hold?
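The point about parallel eigenvectors can be checked directly: if A v = λ v, then any rescaling c·v satisfies the same relation, so dividing an eigenvector by its norm is always safe. A small sketch (the matrix and eigenpair here are made up for illustration, not from the thread):

```python
import math

def matvec(A, v):
    """Multiply a matrix (list of rows) by a vector."""
    return [sum(a * x for a, x in zip(row, v)) for row in A]

A = [[2.0, 1.0],
     [1.0, 2.0]]
v = [1.0, 1.0]      # eigenvector of A with eigenvalue 3
lam = 3.0

norm = math.sqrt(sum(x * x for x in v))
u = [x / norm for x in v]   # unit-length multiple of v
Av = matvec(A, u)           # still equals lam * u componentwise
```

This is why library eigenvector output can legitimately differ by column scaling: both answers satisfy the eigenvalue equation, and normalizing reconciles them.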
 
