Linear Algebra: Augmented matrix echelon form y-space?
[QUOTE="Ray Vickson, post: 4651874, member: 330118"] Basically, he is doing a form of LU-decomposition. If you start with ##B = [A|I]## (##A## = your original matrix, ##I## = the identity matrix), then after some row-reduction steps you end up with ##B_{new} = [U|L]##. Here ##U## is the usual row-reduced matrix, and ##L## is what happens to the identity matrix you used to augment ##A##. The matrix ##L## will be lower-triangular (unless you performed row interchanges), and it satisfies ##L \cdot A = U##; in other words, ##L## encapsulates the row operations used to get from ##A## to ##U##: multiplying by ##L## on the left produces exactly the same result as row-reduction.

Why might this be useful? Suppose that for some reason you needed to solve several separate equations of the form ##Ax = b_1, \: Ax = b_2, \: \ldots##, all with the same left-hand side but different right-hand-side vectors ##b##. Using ##L##, each equation reduces to the simple triangular form ##Ux = Lb_1, \: Ux = Lb_2, \ldots##, which is easily solved by back-substitution. (It would, of course, be even simpler if you had the inverse ##C = A^{-1}##, but we often avoid computing the inverse for reasons of efficiency and numerical stability; the upper-triangular form is almost as fast to work with.) [/QUOTE]
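The procedure Ray describes can be sketched in NumPy. This is a minimal illustration, not his code: the function names (`reduce_augmented`, `solve_upper`) and the 2×2 example matrix are made up here, and the elimination assumes nonzero pivots so no row interchanges are needed, matching the lower-triangular case in the post.

```python
import numpy as np

def reduce_augmented(A):
    """Row-reduce [A | I] to [U | L] (no row interchanges),
    so that L @ A == U. Here L records the row operations,
    following the convention used in the post above."""
    n = A.shape[0]
    B = np.hstack([A.astype(float), np.eye(n)])  # B = [A | I]
    for k in range(n):
        for i in range(k + 1, n):
            m = B[i, k] / B[k, k]      # multiplier for row i
            B[i, :] -= m * B[k, :]     # eliminate the entry below the pivot
    return B[:, :n], B[:, n:]          # U, L

def solve_upper(U, y):
    """Back-substitution for the upper-triangular system U x = y."""
    n = len(y)
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):
        x[i] = (y[i] - U[i, i + 1:] @ x[i + 1:]) / U[i, i]
    return x

A = np.array([[2.0, 1.0],
              [4.0, 5.0]])
U, L = reduce_augmented(A)
assert np.allclose(L @ A, U)           # L encapsulates the row operations

# Reuse U and L for several right-hand sides, as in the post:
for b in (np.array([3.0, 9.0]), np.array([1.0, 2.0])):
    x = solve_upper(U, L @ b)          # solve U x = L b by back-substitution
    assert np.allclose(A @ x, b)
```

Each extra right-hand side costs only a matrix-vector product and a triangular solve; the elimination work on ##A## is done once and reused, which is exactly the efficiency argument in the quote.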