jdstokes
Hi all,
I learned this stuff years ago and wasn't brilliant at it even then, so I think a refresher is in order.
Suppose I have n distinct homogeneous equations in n unknowns. I want to find the solutions, so I write down the matrix of coefficients multiplying my vector of variables as follows:
A \mathbf{x} =\mathbf{0}.
Now, we don't want \det A \neq 0, because then the columns of A are linearly independent and the only solution to A \mathbf{x} = \mathbf{C}_1 x_1 + \cdots + \mathbf{C}_n x_n = \mathbf{0} is \mathbf{x} = \mathbf{0}.
Now how do we actually solve this for \mathbf{x}? Do we just do Gaussian elimination followed by back-substitution? Is the solution unique in this case?
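To make this concrete, here's a small sketch of what I have in mind using sympy (the 3x3 matrix is just a made-up singular example, so \det A = 0 and nonzero solutions exist):

```python
import sympy as sp

# Made-up singular example: the second row is twice the first, so det A = 0.
A = sp.Matrix([[1, 2, 3],
               [2, 4, 6],
               [1, 0, 1]])

print(A.det())        # 0, so A x = 0 has nontrivial solutions
print(A.rref())       # row-reduced echelon form, i.e. the result of Gaussian elimination
print(A.nullspace())  # basis vectors spanning every solution of A x = 0
```

Here the nullspace is one-dimensional, so the solution is not unique: any scalar multiple of the basis vector also solves A \mathbf{x} = \mathbf{0}.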
Now suppose the system is inhomogeneous:
A\mathbf{x} = \mathbf{b}, where \mathbf{b} \neq \mathbf{0}. In this case we actually want \det A \neq 0, because then we can instantly write down the unique solution
\mathbf{x} = A^{-1}\mathbf{b}.
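Again just a sketch, with a made-up invertible 2x2 example (numerically one solves the system directly rather than forming A^{-1}):

```python
import numpy as np

# Made-up invertible example: det A = 5 != 0, so the solution is unique.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([3.0, 5.0])

x = np.linalg.solve(A, b)       # solves A x = b without explicitly inverting A
print(x)                        # the unique solution
print(np.allclose(A @ x, b))    # True: check the solution satisfies the system
```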
Have I gotten the solution to square systems about right? If yes, I'll try to figure out the non-square case.