Orthogonal matrix construction
[QUOTE="StoneTemplePython, post: 5940652, member: 613025"]
edit: I had some ideas on polar decomposition, but they ended up solving something similar to, though not quite, what you were looking for.

round 2: this is actually fairly straightforward.

##\bar{B}_2## is p x m where ##p \gt m## (i.e. a tall, skinny matrix) and its rank is ##m##. This means its left nullspace has dimension ##p-m## -- grab ##p-m## linearly independent vectors from it. Now generate ##m## more linearly independent vectors via a random number generator over [-1, 1], run Gram-Schmidt, and you have your answer.

- - - -

An equivalent, more thorough, and faster process would be to append ##p-m## randomly generated column vectors (and these will be linearly independent -- in abstraction: with probability 1; in practice, with floats: absurdly close to probability 1) such that you now have the augmented square matrix

##C := \bigg[\begin{array}{c|c|c|c|c} \bar{B}_2 & \mathbf v_{m+1} &\cdots & \mathbf v_{p-1} &\mathbf v_{p} \end{array}\bigg]##

Run QR factorization:

##C = Q_C R = \bigg[\begin{array}{c|c|c|c|c} Q_{\bar{B}_2} & \mathbf q_{m+1} &\cdots & \mathbf q_{p-1} &\mathbf q_{p} \end{array}\bigg]R##

Permute the columns of ##Q_C## such that the vectors associated with the ones you generated at random are now leftmost:

##Q_C P = \bigg[\begin{array}{c|c|c|c|c} \mathbf q_{m+1} &\cdots & \mathbf q_{p-1} &\mathbf q_{p} &Q_{\bar{B}_2} \end{array}\bigg]##

where ##P## is a permutation matrix. You then find that ##T^{-1} = T^T = \big(Q_C P\big)^T##. That should do it.

I.e. if you work it through, you'll see

##T^{-1} \bar{B}_2 = \big(Q_C P\big)^T \bar{B}_2 = \bigg[\begin{array}{c|c|c|c|c} \mathbf q_{m+1} &\cdots & \mathbf q_{p-1} &\mathbf q_{p} &Q_{\bar{B}_2} \end{array}\bigg]^T \bar{B}_2 = \begin{bmatrix} { \mathbf q_{m+1}}^T \\ \vdots\\ { \mathbf q}_{p}^T \\ Q_{\bar{B}_2}^T \end{bmatrix} \bar{B}_2 = \begin{bmatrix} \mathbf 0_{p-m} \mathbf 0_m^T\\ R_{m} \end{bmatrix}##

(the top block is the ##(p-m) \times m## zero matrix, since each ##\mathbf q_j## with ##j \gt m## is orthogonal to the column space of ##\bar{B}_2##) where ##R_m## is the top-left m x m principal submatrix of ##R##, i.e. ##\bar{B}_2 = Q_{\bar{B}_2} R_{m}##.

If for some reason you don't like having the upper triangular matrix on the right-hand side, you could further left multiply each side by ##\begin{bmatrix} S_{p-m}^T & \mathbf 0_{p-m} \mathbf 0_m^T\\ \mathbf 0_m \mathbf 0_{p-m}^T & U_m^T \end{bmatrix}##, where ##S_{p-m}## is a (p-m) x (p-m) orthogonal matrix and ##U_m## is an m x m orthogonal matrix, each of your choosing.
[/QUOTE]
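The QR-and-permute construction in the quoted post can be sketched numerically. This is a minimal NumPy illustration, not the poster's own code; the dimensions ##p = 6##, ##m = 2## and the random seed are assumptions chosen just to make the example concrete.

```python
import numpy as np

# Assumed example dimensions: B2 is p x m, tall and skinny (p > m).
rng = np.random.default_rng(0)
p, m = 6, 2
B2 = rng.uniform(-1, 1, size=(p, m))  # full column rank with probability 1

# Append p - m random columns to get a square, (almost surely) invertible C.
C = np.hstack([B2, rng.uniform(-1, 1, size=(p, p - m))])

# QR factorization: C = Q R with Q a p x p orthogonal matrix whose first
# m columns (Q_{B2}) span the column space of B2.
Q, R = np.linalg.qr(C)

# Permutation matrix P moving the randomly generated directions leftmost.
P = np.eye(p)[:, list(range(m, p)) + list(range(m))]
T_inv = (Q @ P).T  # orthogonal, so T^{-1} = T^T

result = T_inv @ B2
# Top p - m rows vanish; the bottom m x m block is upper triangular (R_m).
assert np.allclose(result[: p - m], 0)
assert np.allclose(np.tril(result[p - m :], -1), 0)
```

Generating fresh random columns and re-orthogonalizing via QR mirrors the post's argument that random vectors are linearly independent of the columns of ##\bar{B}_2## with probability 1; a numerically cautious implementation would additionally check the condition number of ##C## and redraw if it is poor.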