My first proof ever - Linear algebra


Homework Help Overview

The discussion revolves around a proof in linear algebra concerning homogeneous linear systems. The original poster, a chemical engineering student preparing for graduate school, is attempting to rigorously understand linear algebra concepts through Hefferon's textbook. They present a proof to show that if two vectors are solutions to a homogeneous system, then their sum, a scalar multiple, and a linear combination of these vectors are also solutions.

Discussion Character

  • Exploratory, Conceptual clarification, Mathematical reasoning

Approaches and Questions Raised

  • The original poster outlines their proof in three parts: demonstrating that the sum of two solutions, a scalar multiple of a solution, and a linear combination of two solutions each satisfy the homogeneous system. They express uncertainty about using the distributive property of scalar multiplication and matrix-vector multiplication, neither of which, they feel, has yet been introduced in their textbook.

Discussion Status

Participants are reviewing the proof and providing feedback. The original poster is seeking tips for improvement and clarification on the appropriateness of their approach, particularly regarding the use of concepts not yet covered in their studies.

Contextual Notes

The original poster notes that they are transitioning from computational methods of learning mathematics to a more rigorous understanding, which may influence their confidence in applying certain mathematical properties in their proof.

MexChemE
Homework Statement
Prove that if ##\mathbf{s}## and ##\mathbf{t}## satisfy a homogeneous system then so do these vectors.
a) ##\mathbf{s} +\mathbf{t}##
b) ##3 \mathbf{s}##
c) ##k \mathbf{s} + j \mathbf{t}## for ##k, j \in \mathbb{R}##
Relevant Equations
Not applicable
First, a little context. It's been a while since I last posted here. I am a chemical engineer who is currently preparing for grad school, and I've been reviewing linear algebra and multivariable calculus for the last couple of months. I have always been successful at math (at least in the computational way it is taught in engineering school), and I feel I have a nice understanding of calculus through physical/graphical intuition; however, I cannot say the same about linear algebra, so I've been trying to learn it in a more rigorous manner through Hefferon's textbook. This is the first proof I have ever written, and I know it is very basic, but I was hoping someone could help me review it and maybe give me a few tips (I intend to go through the whole book and do most exercises/proofs).

Now the actual proof:

Let ##\mathbf{s} , \mathbf{t} \in \mathbb{R}^{n}## so that ##\mathbf{s} = (s_1,...,s_n)## and ##\mathbf{t} = (t_1,...,t_n)##.
Let ##A \mathbf{x} = \mathbf{0}## be a homogeneous linear system, where ##\mathbf{x} \in \mathbb{R}^{n}## and ##A## is the ##m \times n## coefficient matrix of the system.
Let ##\mathbf{s}## and ##\mathbf{t}## be solutions to the aforementioned system, so that the following statements are true: ##A \mathbf{s} = \mathbf{0}## and ##A \mathbf{t} = \mathbf{0}##.
In other words, for each equation ##i## of the system (##1 \leq i \leq m##), the following hold:
$$a_{i,1} \ s_1 + ... + a_{i, n} \ s_n = 0$$
$$a_{i,1} \ t_1 + ... + a_{i, n} \ t_n = 0$$
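For concreteness (my own illustration, not an example from the book), with ##m = 2## and ##n = 3## such a system might look like:
$$\begin{pmatrix} 1 & 2 & -1 \\ 0 & 1 & 1 \end{pmatrix} \begin{pmatrix} x_1 \\ x_2 \\ x_3 \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix}, \quad \text{i.e.} \quad \begin{aligned} x_1 + 2x_2 - x_3 &= 0 \\ x_2 + x_3 &= 0 \end{aligned}$$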

a)
Then ##\mathbf{s} + \mathbf{t} \in \mathbb{R}^{n}## with ##\mathbf{s} + \mathbf{t} = (s_1+t_1,...,s_n+t_n)##.
We must show ##\mathbf{s} + \mathbf{t}## also satisfies the system, that is, that ##A (\mathbf{s} + \mathbf{t}) = \mathbf{0}##. Multiplying the matrix ##A## by the vector ##\mathbf{s} + \mathbf{t}##, we get ##m## equations; writing ##b_i## for the number produced by the ##i##-th row, the ##i##-th equation reads:
$$a_{i,1} \ (s_1+t_1) + ... + a_{i, n} \ (s_n+t_n) = b_i$$
Using the distributive property of the scalar multiplication of a vector, we can regroup terms:
$$(a_{i,1} \ s_1 + ... + a_{i, n} \ s_n) + (a_{i,1} \ t_1 + ... + a_{i, n} \ t_n) = b_i$$
But we know each of the expressions inside the parentheses is equal to zero, so ##b_i = 0## for every ##i##. So ##\mathbf{s} + \mathbf{t}## is a solution to the system and ##A (\mathbf{s} + \mathbf{t}) = \mathbf{0}## is true.

b)
Then ##3 \mathbf{s} \in \mathbb{R}^{n}## with ##3 \mathbf{s} = (3s_1,...,3s_n)##.
We must show ##3 \mathbf{s}## also satisfies the system, that is, that ##A (3 \mathbf{s}) = \mathbf{0}##. Multiplying the matrix ##A## by the vector ##3 \mathbf{s}##, we get ##m## equations; writing ##b_i## for the number produced by the ##i##-th row, the ##i##-th equation reads:
$$a_{i,1} \ (3 s_1) + ... + a_{i, n} \ (3 s_n) = b_i$$
Here we can see that the number three is a factor in every term, so we can extract it:
$$3 (a_{i,1} \ s_1 + ... + a_{i, n} \ s_n) = b_i$$
But we know the expression inside the parentheses is equal to zero, so ##b_i = 0## for every ##i##. So ##3 \mathbf{s}## is a solution to the system and ##A (3 \mathbf{s}) = \mathbf{0}## is true.

c)
Then ##k\mathbf{s} + j\mathbf{t} \in \mathbb{R}^{n}## with ##k\mathbf{s} + j\mathbf{t} = (ks_1+jt_1,...,ks_n+jt_n)##, where ##k, j \in \mathbb{R}##.
We must show ##k\mathbf{s} + j\mathbf{t}## also satisfies the system, that is, that ##A (k\mathbf{s} + j\mathbf{t}) = \mathbf{0}##. Multiplying the matrix ##A## by the vector ##k\mathbf{s} + j\mathbf{t}##, we get ##m## equations; writing ##b_i## for the number produced by the ##i##-th row, the ##i##-th equation reads:
$$a_{i,1} \ (ks_1+jt_1) + ... + a_{i, n} \ (ks_n+jt_n) = b_i$$
Using the distributive property of the scalar multiplication of a vector, we can regroup terms:
$$(a_{i,1} \ ks_1 + ... + a_{i, n} \ ks_n) + (a_{i,1} \ jt_1 + ... + a_{i, n} \ jt_n) = b_i$$
We notice ##k## is a factor in every term inside the first parentheses, and ##j## is a factor in every term inside the second parentheses, so we can extract them:
$$k(a_{i,1} \ s_1 + ... + a_{i, n} \ s_n) + j(a_{i,1} \ t_1 + ... + a_{i, n} \ t_n) = b_i$$
But we know each of the expressions inside the parentheses is equal to zero, so ##b_i = 0## for every ##i## and any values of ##k## and ##j##. So ##k\mathbf{s} + j\mathbf{t}## is a solution to the system and ##A (k\mathbf{s} + j\mathbf{t}) = \mathbf{0}## is true.

QED.
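As an aside (not part of the proof), here is a quick numerical sanity check, a sketch using NumPy with a made-up ##2 \times 4## example system and two hand-picked solutions:

```python
import numpy as np

# Made-up homogeneous system A x = 0 (illustrative only).
A = np.array([[1.0, 2.0, -1.0,  0.0],
              [0.0, 1.0,  1.0, -1.0]])

# Two hand-picked solutions: A s = 0 and A t = 0.
s = np.array([ 3.0, -1.0, 1.0, 0.0])
t = np.array([-2.0,  1.0, 0.0, 1.0])
assert np.allclose(A @ s, 0) and np.allclose(A @ t, 0)

k, j = 2.5, -4.0  # arbitrary scalars

# Parts a), b), c): each combination solves the system again.
assert np.allclose(A @ (s + t), 0)          # a)
assert np.allclose(A @ (3 * s), 0)          # b)
assert np.allclose(A @ (k * s + j * t), 0)  # c)
```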

One small comment: up to this point, the author had defined the scalar multiplication of a vector, but not its distributive property, so I felt a little cheap using it in the proof; I saw no other way, though. I also expressed linear systems in terms of matrix-vector multiplication, which hasn't been introduced in the book either. Is this considered fair game?

Thanks in advance for any input!
 
Mark44
MexChemE said:
(full post quoted above)
For part a), you can use the properties of matrix multiplication.
Given that ##\mathbf{s}## and ##\mathbf{t}## are solutions of the equation ##A\mathbf{x} = \mathbf{0}##,
$$A(\mathbf{s} + \mathbf{t}) = A\mathbf{s} + A\mathbf{t} = \mathbf{0} + \mathbf{0} = \mathbf{0},$$
and similarly for the other two parts.
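For completeness, the first equality is just distributivity applied row by row: for each row ##i##,
$$\big(A(\mathbf{s}+\mathbf{t})\big)_i = \sum_{j=1}^{n} a_{i,j}(s_j + t_j) = \sum_{j=1}^{n} a_{i,j} s_j + \sum_{j=1}^{n} a_{i,j} t_j = (A\mathbf{s})_i + (A\mathbf{t})_i,$$
so only distributivity in ##\mathbb{R}## is needed.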

However, based on your final comment, I'm not sure that the above would be allowed.
It seems strange to me, though, that you're working with vectors in ##\mathbb{R}^n## before the basic properties of a vector space, or even matrix operations, have been presented. I'm not familiar with the book you're using, so I'm at a loss here.
 
Mark44 said:
However, based on your final comment, I'm not sure that the above would be allowed.
It seems strange to me, though, that you're working with vectors in ##\mathbb{R}^n## before the basic properties of a vector space, or even matrix operations, have been presented.
Actually, the ##\mathbb{R}^{n}## part was mine too; it seems I used my previous knowledge from computational linear algebra as a crutch. Of course, I could have defined my vectors as having ##n## components, without necessarily saying they are in ##\mathbb{R}^{n}##.
Mark44 said:
I'm not familiar with the book you're using, so I'm at a loss here.
The book is an open-source textbook by American professor Jim Hefferon. He also has lectures on YouTube supporting the textbook, which I usually watch before reading through the corresponding section. You can find more info about it here. I decided to give it a try, as I've seen many people recommend it as an alternative to books on the level of Strang or Friedberg.
 
MexChemE said:
Prove that if ##\mathbf{s}## and ##\mathbf{t}## satisfy a homogeneous system then so do these vectors.
a) ##\mathbf{s} +\mathbf{t}##
b) ##3 \mathbf{s}##
c) ##k \mathbf{s} + j \mathbf{t}## for ##k, j \in \mathbb{R}##
I believe most linear algebra textbooks would consider the short proof I did in my previous post to be sufficient.
For part c, I think something like this would be fine:
$$A(k\mathbf{s} + j\mathbf{t}) = A(k\mathbf{s}) + A(j\mathbf{t}) = kA\mathbf{s} + jA\mathbf{t} = k\mathbf{0} + j\mathbf{0} = \mathbf{0}$$
Therefore ##k\mathbf{s} + j\mathbf{t}## is a solution to the homogeneous equation.
I have used some basic properties of multiplication of a vector by a matrix. See https://en.wikipedia.org/wiki/Matrix_multiplication, under General Properties.
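A quick numerical illustration of those general properties (a sketch only; random data, nothing specific to this problem):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 5))  # arbitrary 3x5 matrix
x, y = rng.standard_normal(5), rng.standard_normal(5)
k, j = 2.0, -3.0

# Linearity of matrix-vector multiplication: A(kx + jy) = kAx + jAy.
assert np.allclose(A @ (k * x + j * y), k * (A @ x) + j * (A @ y))
```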
 
Alright, thanks a lot!
 
