MHB Is $s$ the unique vector that spans the solution space $L(A,0)$?

Summary
The discussion centers on the uniqueness of the vector \( s \) that spans the solution space \( L(A,0) \) for a matrix \( A \) with rank \( n-1 \). Participants confirm that the determinant expansion leads to the conclusion that \( s \) is uniquely defined and spans the kernel of \( A \), which has a dimension of 1. They explore whether \( s \) could be the zero vector, concluding that while it satisfies the equations, it does not fulfill the requirement for spanning the solution space. The overall consensus is that \( s \) is non-zero and uniquely defines the solution space of the linear system \( A \cdot x = 0 \). The discussion emphasizes the importance of the rank-nullity theorem in establishing these properties.
mathmari
Hey! :o

For a field $K$ and $1<n\in \mathbb{N}$ let $A\in K^{(n-1)\times n}$ be a matrix with rank $n-1$. For a row vector $z\in K^{1\times n}$ let $\left (\frac{A}{z}\right )\in K^{n\times n}$ be the matrix that we get if we add the vector $z$ as the $n$-th row of the matrix $A$.

To show that there is a column vector $s=(s_1, \ldots , s_n)^T$ such that for each row vector $z=(z_1, \ldots , z_n)$ it holds that $$\det \left [\left (\frac{A}{z}\right )\right ]=\sum_{i=1}^nz_is_i=z\cdot s$$ we consider the Laplace formula for the calculation of the determinant. We expand along the last row. For $i\in \{1, \ldots , n\}$ let $A_i$ be the submatrix of $A$ that we get if we remove the $i$-th column, and let $s_i:=(-1)^{i+n}\det A_i$.

Then we get $$\det \left [\left (\frac{A}{z}\right )\right ]=\sum_{i=1}^nz_is_i$$

Is this correct? (Wondering) We could also expand along another row or column, right?

I want to show that the vector $s$ spans the solution space $L(A,0)$ of the linear system of equations $A\cdot x=0$ as a $K$-vector space.

How could we do that? Could you give me a hint? (Wondering)
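As a numerical sanity check of the cofactor construction (using a hypothetical rank-2 matrix $A$ with $n=3$; the specific entries are just an example, not from the problem):

```python
import numpy as np

# Hypothetical example: n = 3, A is a 2x3 matrix of rank n-1 = 2.
A = np.array([[1.0, 2.0, 3.0],
              [0.0, 1.0, 4.0]])
n = A.shape[1]

# s_i = (-1)^(i+n) * det(A_i), where A_i is A with the i-th column removed.
# The post indexes i = 1..n; Python's index i runs 0..n-1, so use (i+1)+n.
s = np.array([(-1) ** ((i + 1) + n) * np.linalg.det(np.delete(A, i, axis=1))
              for i in range(n)])

# Check the identity: for any row vector z, appending z as the last row of A
# gives a matrix whose determinant equals z . s (Laplace expansion, last row).
rng = np.random.default_rng(0)
z = rng.standard_normal(n)
M = np.vstack([A, z])
assert np.isclose(np.linalg.det(M), z @ s)
```

For this example $s=(5,-4,1)^T$, and the assertion holds for any choice of $z$.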
 
mathmari said:
To show that there is a column vector $s=(s_1, \ldots , s_n)^T$ such that for each row vector $z=(z_1, \ldots , z_n)$ it holds that $$\det \left [\left (\frac{A}{z}\right )\right ]=\sum_{i=1}^nz_is_i=z\cdot s$$ we consider the Laplace formula for the calculation of the determinant. We expand along the last row. For $i\in \{1, \ldots , n\}$ let $A_i$ be the submatrix of $A$ that we get if we remove the $i$-th column, and let $s_i:=(-1)^{i+n}\det A_i$.

Then we get $$\det \left [\left (\frac{A}{z}\right )\right ]=\sum_{i=1}^nz_is_i$$

Is this correct?

Hey mathmari!

Yep.

mathmari said:
We could also expand along another row or column, right?

Didn't we already do it for all rows and columns? (Thinking)

mathmari said:
I want to show that the vector $s$ spans the solution space $L(A,0)$ of the linear system of equations $A\cdot x=0$ as a $K$-vector space.

How could we do that? Could you give me a hint?

What is the rank of the solution space $L(A,0)$?
Is $\mathbf s$ unique and non-zero?
What is $A\cdot\mathbf s$? (Wondering)
 
I like Serena said:
Didn't we already do it for all rows and columns? (Thinking)

What do you mean? (Wondering)
I like Serena said:
What is the rank of the solution space $L(A,0)$?
Is $\mathbf s$ unique and non-zero?
What is $A\cdot\mathbf s$? (Wondering)

The solution space is equal to the kernel of $A$, right?

$s$ is uniquely defined.

We have that $\text{Rank}(A)=n-1$, so from the rank-nullity theorem we get that $\dim (\ker(A))=n-\text{Rank}(A)=1$. That means that a basis of the solution space contains only one element.

If $z$ is an arbitrary row $a_i$ of $A$, then $\left (\frac{A}{a_i}\right )$ has a repeated row, so $a_i\cdot s=\det \left [\left (\frac{A}{a_i}\right )\right ]=0$. That means that $A\cdot s=0$, so $s\in \ker (A)$, and so $s$ is contained in the solution space.

From that we get the desired result, right? (Wondering)
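This argument can be checked numerically. A minimal sketch, assuming a hypothetical $2\times 3$ example matrix of rank $n-1=2$ (the entries are illustrative, not from the thread):

```python
import numpy as np

# Hypothetical example matrix with rank n-1.
A = np.array([[1.0, 2.0, 3.0],
              [0.0, 1.0, 4.0]])
n = A.shape[1]

# Cofactor vector s_i = (-1)^(i+n) * det(A_i) as in the first post.
s = np.array([(-1) ** ((i + 1) + n) * np.linalg.det(np.delete(A, i, axis=1))
              for i in range(n)])

# Appending any row a_i of A to A duplicates a row, so the determinant
# vanishes; that determinant equals a_i . s, hence A . s = 0.
assert np.allclose(A @ s, 0.0)

# rank(A) = n-1, so by rank-nullity dim ker(A) = n - (n-1) = 1.
assert np.linalg.matrix_rank(A) == n - 1
```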
 
mathmari said:
What do you mean?

Didn't we find each of the $s_i$ by working through each of the columns?
And by using all rows to find the subdeterminants $\det(A_i)$? (Wondering)

mathmari said:
The solution space is equal to the kernel of $A$, right?

$s$ is uniquely defined.

We have that $\text{Rank}(A)=n-1$, so from the rank-nullity theorem we get that $\dim (\ker(A))=1$. That means that a basis of the solution space contains only one element.

If $z$ is an arbitrary row $a_i$ of $A$, then $a_i\cdot s=0$. That means that $A\cdot s=0$, so $s\in \ker (A)$, and so $s$ is contained in the solution space.

From that we get the desired result, right?

Suppose $\mathbf s$ is the zero vector. Then all these statements are true, but we still don't get the desired result do we? (Wondering)

How did you find that $\mathbf s$ is unique?
You did find that there is at least one $\mathbf s$, but there could still be more, couldn't there? (Wondering)
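One concrete way to see the non-zero part: since $\text{Rank}(A)=n-1$, some $(n-1)\times(n-1)$ submatrix $A_i$ is invertible, so $\det A_i\neq 0$ and hence $s_i\neq 0$. A numerical sketch of this, again with a hypothetical example matrix:

```python
import numpy as np

# Hypothetical rank-2 example with n = 3.
A = np.array([[1.0, 2.0, 3.0],
              [0.0, 1.0, 4.0]])
n = A.shape[1]
s = np.array([(-1) ** ((i + 1) + n) * np.linalg.det(np.delete(A, i, axis=1))
              for i in range(n)])

# rank(A) = n-1 means some (n-1)x(n-1) submatrix A_i is invertible,
# i.e. det(A_i) != 0, hence some s_i != 0 and s is not the zero vector.
assert np.any(np.abs(s) > 1e-12)

# Any nonzero scalar multiple of s spans the same 1-dimensional kernel,
# so the cofactor construction pins down s itself, but a spanning vector
# of L(A,0) is only unique up to a scalar factor.
assert np.allclose(A @ (2.0 * s), 0.0)
```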
 
