Subspace Orthogonality in Ax=b

In summary: Ax=b has a solution exactly when b lies in the column space of A. Row-reducing the augmented matrix [A | b] produces an all-zero row on the left, and the corresponding entry on the right gives the condition the components of b must satisfy. (The thread also detours into row-versus-column conventions and a mnemonic for matrix dimensions.)
  • #1
JohnPrior3
Let A be the matrix
$$A = \begin{bmatrix} 2 & 0 & 1 & 0 \\
1 & -1 & 4 & 3 \\
3 & -1 & 5 & 3 \end{bmatrix}$$

Let b= [b1 b2 b3] transpose

What equation must be satisfied by the components of b in order to guarantee that there exists a vector x = [x1 x2 x3 x4] transpose satisfying the equation Ax=b? Justify your answer.

I know C(A) is the orthogonal complement of the left nullspace. Also, I know that the rank of the matrix is 2 and that b must belong to the column space. Where do I go from here? This is not a homework question; I am simply trying to get a better understanding of the subspaces. I am a much better learner for things with physical applications, and these ideas have been far too abstract for my liking.
 
  • #2
I used my limited superpowers to move your post to a homework forum because the rules tell us to treat all textbook-style problems as homework, even when the problem wasn't given as homework.
 
  • #3
JohnPrior3 said:
Let A be the matrix
$$A = \begin{bmatrix} 2 & 0 & 1 & 0 \\
1 & -1 & 4 & 3 \\
3 & -1 & 5 & 3 \end{bmatrix}$$

Let b= [b1 b2 b3] transpose

What equation must be satisfied by the components of b in order to guarantee that there exists a vector x = [x1 x2 x3 x4] transpose satisfying the equation Ax=b? Justify your answer.

I know C(A) is the orthogonal complement of the left nullspace. Also, I know that the rank of the matrix is 2 and that b must belong to the column space. Where do I go from here? This is not a homework question; I am simply trying to get a better understanding of the subspaces. I am a much better learner for things with physical applications, and these ideas have been far too abstract for my liking.
You're making this much more complicated than it deserves.

Set up an augmented matrix whose first four columns are the columns of A, and whose last column is your vector b.
$$\left[\begin{array}{cccc|c} 2 & 0 & 1 & 0 & b_1 \\
1 & -1 & 4 & 3 & b_2 \\
3 & -1 & 5 & 3 & b_3 \end{array}\right]$$

Use row reduction to get the matrix on the left in reduced row-echelon form. As it turns out, one of the rows of the matrix turns into all zeroes. How does that affect the entry in the fifth column of this same row?
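
For anyone who wants to check the elimination mechanically, here is a minimal sketch using Python's sympy, with symbolic entries b1, b2, b3 standing for the components of b; the row operations simply mirror what you would do by hand:

```python
from sympy import Matrix, symbols

b1, b2, b3 = symbols("b1 b2 b3")

# Augmented matrix [A | b] with a symbolic right-hand side
M = Matrix([
    [2,  0, 1, 0, b1],
    [1, -1, 4, 3, b2],
    [3, -1, 5, 3, b3],
])

# Clear the first column below the pivot: R2 -> R2 - (1/2)R1, R3 -> R3 - (3/2)R1
M[1, :] = M[1, :] - M[0, :] / 2
M[2, :] = M[2, :] - 3 * M[0, :] / 2

# Clear the second column below the pivot: R3 -> R3 - R2
M[2, :] = M[2, :] - M[1, :]

# The last row comes out as [0, 0, 0, 0, -b1 - b2 + b3]: all zeros on
# the left, so consistency forces the fifth entry to vanish as well.
print(M[2, :])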
 
  • #4
I don't think you want the transpose on the b vector.
 
  • #5
BiGyElLoWhAt said:
I don't think you want the transpose on the b vector.
Not sure who your reply is directed to.

The OP wrote "Let b= [b1 b2 b3] transpose" to represent this
$$\vec{b} = \begin{bmatrix} b_1 \\ b_2 \\ b_3\end{bmatrix}$$
A is 3×4, x is 4×1, so the result b will be 3×1, a column vector with three components. One way to write this that doesn't use LaTeX is <b1, b2, b3>T.
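
A quick dimension check, as a sketch in numpy (the particular values in x are arbitrary placeholders, not anything from the thread):

```python
import numpy as np

A = np.array([[2, 0, 1, 0],
              [1, -1, 4, 3],
              [3, -1, 5, 3]])        # 3 x 4

x = np.array([[1], [0], [2], [-1]])  # 4 x 1, arbitrary values

b = A @ x
print(b.shape)  # (3, 1): a column vector with three components
```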
 
  • #6
Mark44 said:
One way to write this that doesn't use LaTeX is <b1, b2, b3>T.
A bit off topic, but I always think of (x,y,z) (written with commas between the components) as the transpose of (x y z). Now I just need to get the rest of the world to use my conventions. o0)
 
  • #7
I'm sorry all, I got carried away and mixed up columns/rows =/
 
  • #8
BiGyElLoWhAt said:
I'm sorry all, I got carried away and mixed up columns/rows =/
C
O
L
U
M
N
S
are vertical. Think of the columns at the front of a building

R O W S are horizontal.
 
  • #9
Fredrik said:
A bit off topic, but I always think of (x,y,z) (written with commas between the components) as the transpose of (x y z). Now I just need to get the rest of the world to use my conventions. o0)
I think that's going to be a hard sell...
 
  • #10
Mark44 said:
C
O
L
U
M
N
S
are vertical. Think of the columns at the front of a building

R O W S are horizontal.
I meant in terms of n×m vs. m×n, but whatever.
 
  • #11
BiGyElLoWhAt said:
I meant in terms of n×m vs. m×n, but whatever.
First number is the number of rows, next is the number of columns. Long ago I had trouble remembering this, but I came up with a mnemonic device - RC Cola (not sure they still make that soft drink). R comes first, then C.
 
  • #12
Mark44 said:
First number is the number of rows, next is the number of columns. Long ago I had trouble remembering this, but I came up with a mnemonic device - RC Cola (not sure they still make that soft drink). R comes first, then C.
That's wonderful! (The southern United States would not be the same without RC Cola! (And moon-pies.))
 
  • #13
Fredrik said:
I used my limited superpowers to move your post to a homework forum because the rules tell us to treat all textbook-style problems as homework, even when the problem wasn't given as homework.


[Attached image: the poster's row reduction of the augmented matrix]


I've reduced it to this augmented matrix. Now what?
 
  • #14
JohnPrior3 said:
[Attached image: the poster's row reduction of the augmented matrix]

I've reduced it to this augmented matrix. Now what?
Your work looks fine. Now, translate the augmented matrix into a system of equations. For example, the first row represents ##1x_1 + 0x_2 + \frac{1}{2}x_3 + 0x_4 = \frac{b_1}{2}##.
What do the other two rows represent?

BTW, when you're working with augmented matrices, it's a good idea to draw a vertical dashed line to separate the coefficients of your matrix from the column that represents the vector b.
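
For completeness, here is one way the reduced system can come out, computed here rather than read off the attachment (the first row matches the example just given):
$$\begin{aligned}
x_1 + \tfrac{1}{2}x_3 &= \tfrac{1}{2}b_1 \\
x_2 - \tfrac{7}{2}x_3 - 3x_4 &= \tfrac{1}{2}b_1 - b_2 \\
0 &= b_3 - b_1 - b_2
\end{aligned}$$
The all-zero third row on the left forces ##b_3 - b_1 - b_2 = 0##; that is, ##Ax=b## is consistent exactly when ##b_1 + b_2 = b_3##.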
 
  • #15
Mark44 said:
I think that's going to be a hard sell...
Yes, it's impossible to get the world to settle on a notational convention, even when there's a good reason to prefer it over the alternative.

3×1 matrices and 1×3 matrices are both elements of ##\mathbb R^3##. The only difference between them is the rule we use to multiply them with other matrices. ##(x,y,z)## (or if you prefer ##\langle x,y,z\rangle##) is a standard notation for an element of ##\mathbb R^3##. So ##(x,y,z)## is both a 3×1 matrix and a 1×3 matrix. This means that it's wrong to say that one of the equalities
$$(x,y,z)=\begin{pmatrix}x & y & z\end{pmatrix},\qquad (x,y,z)=\begin{pmatrix}x & y & z\end{pmatrix}^T$$ is right and the other wrong. If one of these is to be considered an acceptable abuse of notation, it's only a matter of figuring out which one irritates us the least.

The main reason to prefer the latter equality over the former is that if T is a linear operator on ##\mathbb R^3## and [T] is the matrix corresponding to T, then according to the former convention, the matrix equation that corresponds to ##T(x,y,z)=(a,b,c)## is
$$\begin{pmatrix}x & y & z\end{pmatrix}[T]=\begin{pmatrix}a & b & c\end{pmatrix},$$ and according to the latter convention, it's
$$[T]\begin{pmatrix}x\\ y\\ z\end{pmatrix}=\begin{pmatrix}a\\ b\\ c\end{pmatrix}.$$ (The [T] in the former equation is actually the transpose of the [T] in the latter. I didn't want to make the notation even uglier when we can just change the definition of [T].) So my abuse of notation preserves the order of the function and its argument, while yours reverses it.
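
The order-of-composition point is easy to see numerically. A sketch in numpy, where T is an arbitrary 3×3 example chosen purely for illustration:

```python
import numpy as np

T = np.array([[0, 1, 0],
              [0, 0, 1],
              [1, 0, 0]])          # an arbitrary matrix standing in for [T]

col = np.array([[1], [2], [3]])    # (x, y, z) read as a 3x1 column
row = col.T                        # (x y z) read as a 1x3 row

# Column convention: [T] acts on the left, matching the order in T(x, y, z)
print(T @ col)

# Row convention: the argument comes first, and [T] must be transposed
print(row @ T.T)  # the same numbers, laid out as a row
```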
 

1. What is subspace orthogonality in Ax=b?

In this context it refers to the orthogonality relations among the four fundamental subspaces of a matrix A: the column space C(A) is the orthogonal complement of the left nullspace N(A^T), and the row space C(A^T) is the orthogonal complement of the nullspace N(A). These relations hold for every matrix and control when Ax=b is solvable.
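
For the matrix in this thread the orthogonality can be checked directly. A sketch in numpy; the vector y follows from the observation that the third row of A is the sum of the first two:

```python
import numpy as np

A = np.array([[2, 0, 1, 0],
              [1, -1, 4, 3],
              [3, -1, 5, 3]])

# Since row3 = row1 + row2, y = (1, 1, -1) spans the left nullspace N(A^T)
y = np.array([1, 1, -1])

print(y @ A)  # [0 0 0 0]: y is orthogonal to every column of A
```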

2. How does subspace orthogonality affect the solution of Ax=b?

Ax=b has a solution exactly when b lies in the column space C(A), which, by the orthogonality above, is the same as b being orthogonal to every vector in the left nullspace. When b is not in C(A), the best one can do is orthogonally project b onto C(A), which is what least squares computes.
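
As an illustration of the projection, a sketch in numpy with an arbitrarily chosen b that violates the condition b1 + b2 = b3 (np.linalg.lstsq is numpy's standard least-squares routine):

```python
import numpy as np

A = np.array([[2, 0, 1, 0],
              [1, -1, 4, 3],
              [3, -1, 5, 3]], dtype=float)

b = np.array([1.0, 2.0, 0.0])  # b1 + b2 != b3, so b is not in C(A)

# Least squares finds an x whose image A @ x is the orthogonal
# projection of b onto the column space C(A)
x, _, rank, _ = np.linalg.lstsq(A, b, rcond=None)
print(A @ x)  # approx. [0. 1. 1.], the projection of b onto C(A)
print(rank)   # 2, matching the rank mentioned in the thread
```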

3. Can subspace orthogonality be applied to any matrix equation?

Yes. The orthogonality relations hold for any m×n matrix; A does not need to be square or have linearly independent columns. What the rank changes is how large each subspace is and whether a solution, when one exists, is unique.

4. How is subspace orthogonality related to the concept of linear independence?

They are linked through rank: the dimension of C(A) equals the number of linearly independent columns, and the left nullspace has dimension m minus the rank. If the columns of A are linearly independent, the nullspace is trivial and any solution of Ax=b is unique; otherwise, whenever one solution exists there are infinitely many.

5. Are there any real-world applications of subspace orthogonality in Ax=b?

Yes. The solvability condition (b orthogonal to the left nullspace) and the projection picture underlie least-squares fitting in engineering and physics, and they appear in signal processing, data compression, and in solving systems of linear equations in electrical circuit analysis and structural engineering.
