Is the Set of Points (x, y, 3x + 2y) a Subspace of a Homogeneous System?

Ryker

Homework Statement


Suppose you have points of a specific form, say (x, y, 3x + 2y). Show that this set of points is the solution set of a homogeneous system of linear equations, and hence a subspace.

The Attempt at a Solution


I'm wondering how one should go about this. Here's my try, but I'm not sure I'm approaching it the right way. I assumed two vectors u = (x1, y1, 3x1 + 2y1) and v = (x2, y2, 3x2 + 2y2) are solutions. Next, I claimed that any linear combination of them, say tu + sv, is also a solution to Ax = 0. Then I expanded the terms and showed that any such linear combination of u and v indeed still has the form (x, y, 3x + 2y).

Is this the proper way of doing this, or is there another one? Should I have perhaps instead written down, for example, a1x + a2y + a3(3x + 2y) = 0, expanded it to (a1 + 3a3)x + (a2 + 2a3)y = 0, and then solved this equation to show that the solutions for the a's involve parameters?
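The closure step in the first approach can be sanity-checked numerically (this is an illustration, not a proof; the names `point` and `combine` are just made up for the check):

```python
# Numerical check of closure: a linear combination t*u + s*v of two points
# of the form (x, y, 3x + 2y) again has third coordinate
# 3*(first coordinate) + 2*(second coordinate).
def point(x, y):
    return (x, y, 3 * x + 2 * y)

def combine(t, u, s, v):
    # Component-wise linear combination t*u + s*v.
    return tuple(t * ui + s * vi for ui, vi in zip(u, v))

u, v = point(1.5, -2.0), point(0.25, 4.0)
w = combine(2.0, u, -3.0, v)
assert w[2] == 3 * w[0] + 2 * w[1]  # w still has the required form
```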
 


The points given by (x, y, 3x + 2y) form a plane through the origin: each such point satisfies 3x + 2y - z = 0, which is a homogeneous linear equation.
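A quick numerical sanity check of this observation (again only an illustration, not a proof; `plane_point` is a made-up helper name):

```python
# Every point of the form (x, y, 3x + 2y) satisfies the single homogeneous
# equation 3x + 2y - z = 0, i.e. it lies on the plane through the origin
# with normal vector (3, 2, -1).
def plane_point(x, y):
    return (x, y, 3 * x + 2 * y)

for x in range(-5, 6):
    for y in range(-5, 6):
        px, py, pz = plane_point(x, y)
        assert 3 * px + 2 * py - pz == 0
```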
 


Hmm, yeah, you're right, I didn't think of it that way. But are you sure this is the proper way of formally proving it?

edit: Looking at it again, your solution would then be similar to the second one I proposed, right? I mean, whereas I would get parameters for a's, they could correspond to 3, 2 and -1 in your solution, as well as to multiple other coefficient solutions, such as 6, 4, -2 etc. Could you perhaps elaborate on that a bit?

And could anyone comment on my proposed first solution? Is it wrong, or just another way of going about it?
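On the point about the coefficients: from (a1 + 3a3)x + (a2 + 2a3)y = 0 holding for all x and y, both coefficients must vanish, so a1 = -3a3 and a2 = -2a3. Every coefficient triple is therefore a scalar multiple of (3, 2, -1), and any such multiple, e.g. (6, 4, -2), defines the same plane. A small check of that claim (illustrative only; `annihilates` is a made-up helper name):

```python
# Each scalar multiple t*(3, 2, -1) of the base coefficient triple gives an
# equation a1*x + a2*y + a3*z = 0 satisfied by every point (x, y, 3x + 2y).
def annihilates(a, x, y):
    z = 3 * x + 2 * y
    return a[0] * x + a[1] * y + a[2] * z == 0

base = (3, 2, -1)
for t in (1, 2, -5):
    a = tuple(t * c for c in base)
    assert all(annihilates(a, x, y)
               for x in range(-3, 4) for y in range(-3, 4))
```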
 