MexChemE
- Homework Statement
- Prove that if ##\mathbf{s}## and ##\mathbf{t}## satisfy a homogeneous system then so do these vectors.
a) ##\mathbf{s} +\mathbf{t}##
b) ##3 \mathbf{s}##
c) ##k \mathbf{s} + j \mathbf{t}## for ##k, j \in \mathbb{R}##
- Relevant Equations
- Not applicable
First, a little context. It's been a while since I last posted here. I am a chemical engineer currently preparing for grad school, and I've been reviewing linear algebra and multivariable calculus for the last couple of months. I have always done well at math (at least in the computational way it is taught in engineering school), and I feel I have a good understanding of calculus through physical and graphical intuition. However, I cannot say the same about linear algebra, so I've been trying to learn it in a more rigorous manner through Hefferon's textbook. This is the first proof I have ever written, and I know it is very basic, but I was hoping someone could help me review it and maybe give me a few tips (I intend to go through the whole book and do most of the exercises/proofs).
Now the actual proof:
Let ##\mathbf{s} , \mathbf{t} \in \mathbb{R}^{n}## so that ##\mathbf{s} = (s_1,...,s_n)## and ##\mathbf{t} = (t_1,...,t_n)##.
Let ##A \mathbf{x} = \mathbf{0}## be a homogeneous linear system, where ##\mathbf{x} \in \mathbb{R}^{n}## and ##A## is the ##m \times n## matrix of coefficients of the system.
Let ##\mathbf{s}## and ##\mathbf{t}## be solutions to the aforementioned system, so that the following statements are true: ##A \mathbf{s} = \mathbf{0}## and ##A \mathbf{t} = \mathbf{0}##.
In other words, for each ##i##-th equation of the system, with ##i = 1, \dots, m##, the following hold:
$$a_{i,1} \ s_1 + ... + a_{i, n} \ s_n = 0$$
$$a_{i,1} \ t_1 + ... + a_{i, n} \ t_n = 0$$
a)
Note that ##\mathbf{s} + \mathbf{t} = (s_1+t_1,\dots,s_n+t_n) \in \mathbb{R}^{n}##.
We must show that ##\mathbf{s} + \mathbf{t}## also satisfies the system, in other words that ##A (\mathbf{s} + \mathbf{t}) = \mathbf{0}##. Multiplying the matrix ##A## by the vector ##\mathbf{s} + \mathbf{t}## produces a vector with ##m## entries; calling its ##i##-th entry ##b_i##, we have
$$a_{i,1} \ (s_1+t_1) + ... + a_{i, n} \ (s_n+t_n) = b_i$$
Using the distributive property of multiplication over addition in ##\mathbb{R}##, we can expand the products and regroup terms:
$$(a_{i,1} \ s_1 + ... + a_{i, n} \ s_n) + (a_{i,1} \ t_1 + ... + a_{i, n} \ t_n) = b_i$$
But each of the expressions in parentheses equals zero, so ##b_i = 0## for every ##i##. Hence ##\mathbf{s} + \mathbf{t}## is a solution of the system and ##A (\mathbf{s} + \mathbf{t}) = \mathbf{0}##.
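As an aside, once matrix-vector algebra is available, the same argument compresses to one line, granting that the matrix-vector product distributes over vector addition; that property is exactly what the componentwise computation above verifies, so this is a repackaging rather than a separate proof:
$$A(\mathbf{s} + \mathbf{t}) = A\mathbf{s} + A\mathbf{t} = \mathbf{0} + \mathbf{0} = \mathbf{0}$$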
b)
Note that ##3 \mathbf{s} = (3s_1,\dots,3s_n) \in \mathbb{R}^{n}##.
We must show that ##3 \mathbf{s}## also satisfies the system, in other words that ##A (3 \mathbf{s}) = \mathbf{0}##. Multiplying the matrix ##A## by the vector ##3 \mathbf{s}## produces a vector with ##m## entries; calling its ##i##-th entry ##b_i##, we have
$$a_{i,1} \ (3 s_1) + ... + a_{i, n} \ (3 s_n) = b_i$$
Since three is a common factor of every term, we can factor it out:
$$3 (a_{i,1} \ s_1 + ... + a_{i, n} \ s_n) = b_i$$
But the expression in parentheses equals zero, so ##b_i = 0## for every ##i##. Hence ##3 \mathbf{s}## is a solution of the system and ##A (3 \mathbf{s}) = \mathbf{0}##.
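In the same compact form, granting that scalars pull out of a matrix-vector product (which is what the factoring step above shows componentwise):
$$A(3\mathbf{s}) = 3(A\mathbf{s}) = 3 \, \mathbf{0} = \mathbf{0}$$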
c)
Let ##k, j \in \mathbb{R}##, and note that ##k\mathbf{s} + j\mathbf{t} = (ks_1+jt_1,\dots,ks_n+jt_n) \in \mathbb{R}^{n}##.
We must show that ##k\mathbf{s} + j\mathbf{t}## also satisfies the system, in other words that ##A (k\mathbf{s} + j\mathbf{t}) = \mathbf{0}##. Multiplying the matrix ##A## by the vector ##k\mathbf{s} + j\mathbf{t}## produces a vector with ##m## entries; calling its ##i##-th entry ##b_i##, we have
$$a_{i,1} \ (ks_1+jt_1) + ... + a_{i, n} \ (ks_n+jt_n) = b_i$$
Using the distributive property of multiplication over addition in ##\mathbb{R}##, we can expand the products and regroup terms:
$$(a_{i,1} \ ks_1 + ... + a_{i, n} \ ks_n) + (a_{i,1} \ jt_1 + ... + a_{i, n} \ jt_n) = b_i$$
Since ##k## is a factor of every term in the first parentheses and ##j## is a factor of every term in the second, we can factor them out:
$$k(a_{i,1} \ s_1 + ... + a_{i, n} \ s_n) + j(a_{i,1} \ t_1 + ... + a_{i, n} \ t_n) = b_i$$
But each of the expressions in parentheses equals zero, so ##b_i = 0## for every ##i## and for any values of ##k## and ##j##. Hence ##k\mathbf{s} + j\mathbf{t}## is a solution of the system and ##A (k\mathbf{s} + j\mathbf{t}) = \mathbf{0}##.
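In compact form, combining the two properties used above:
$$A(k\mathbf{s} + j\mathbf{t}) = k(A\mathbf{s}) + j(A\mathbf{t}) = k \, \mathbf{0} + j \, \mathbf{0} = \mathbf{0}$$
Parts a) and b) are then the special cases ##k = j = 1## and ##k = 3, \ j = 0##.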
QED.
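Not part of the proof, but here is a quick numerical spot check of part c) in NumPy, in case anyone wants to play with it; the matrix size, the random seed, and the choice of ##k## and ##j## are arbitrary:

```python
import numpy as np

# Spot check of part (c): if A s = 0 and A t = 0, then A (k s + j t) = 0.
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 5))   # arbitrary 3x5 coefficient matrix

# Build two solutions of A x = 0 by sampling from the null space of A,
# using the last rows of V^T from the SVD as a null-space basis.
_, _, Vt = np.linalg.svd(A)
N = Vt[3:].T                      # 5x2; columns span {x : A x = 0}

s = N @ rng.standard_normal(2)
t = N @ rng.standard_normal(2)

k, j = 2.5, -4.0
print(np.allclose(A @ s, 0))               # True: s solves the system
print(np.allclose(A @ t, 0))               # True: t solves the system
print(np.allclose(A @ (k*s + j*t), 0))     # True: so does k s + j t
```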
One small comment: up to this point the author has defined scalar multiplication of a vector but not its distributive property, so I felt a little cheap leaning on distributivity in the proof, but I saw no other way. I also expressed the linear systems in terms of matrix-vector multiplication, which hasn't been introduced in the book yet either. Is this considered fair game?
Thanks in advance for any input!