Linear Algebra - Finding linearly independent vectors

In summary: the problem reduces A^T x = 0 to three equations in six unknowns, and the goal is to find linearly independent vectors that span the solution set. The answers below show how: solve for three of the variables in terms of the other three, then set each free variable to 1 in turn (and the others to 0) to get a basis; solving for other values would simply give a different set of solution vectors.
  • #1
Moomax
Hi guys,

I am solving a problem of the form A^T x = 0, where A is a matrix of known numbers and I am solving for x. After performing row reduction and multiplying out A^T x, I am left with the following equations:


-X1 + X4 - X5 = 0

-X2 + X4 = 0

-X3 + X4 - X5 + 28X6 = 0

From these equations, I am trying to find the linearly independent vectors; however, I am not sure how to do this. I tried pulling out a linear algebra book, but I couldn't make straightforward sense of the procedure. Can anyone help me out? Thanks!
 
  • #2
I may be missing the scampi for the bees, but those aren't vectors; they're parametric equations, which describe surfaces.

Solving these equations is equivalent to finding the intersection of the three surfaces. I count 6 unknowns and 3 equations, so the solution set will be a three-parameter (3D) family.
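That dimension count can be checked directly: collect the three reduced equations into a 3x6 coefficient matrix and compute its rank. A minimal sketch (using SymPy, which is my own choice and not part of the thread):

```python
from sympy import Matrix

# Coefficient matrix of the three reduced equations in X1..X6:
#   -X1           + X4 - X5         = 0
#        -X2      + X4              = 0
#             -X3 + X4 - X5 + 28*X6 = 0
M = Matrix([
    [-1,  0,  0, 1, -1,  0],
    [ 0, -1,  0, 1,  0,  0],
    [ 0,  0, -1, 1, -1, 28],
])

print(M.rank())      # 3: the three equations are independent
print(6 - M.rank())  # 3: the solution set is a three-parameter (3D) family
```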
 
  • #3
I'm not sure where linear independence comes into it, but if X1, ..., X6 are the components of your vector then, as christianjb says, you will only be able to pin down a 3D subspace, all of whose vectors satisfy your equations.

If you just need one specific solution, try fixing X1, X2 and X3. You'll then be able to calculate X4, X5 and X6 from your equations.
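To see that this works, treat X1, X2, and X3 as known and solve the three equations for X4, X5, and X6. A short symbolic sketch of that step (again using SymPy, as an assumption):

```python
from sympy import symbols, Eq, solve

X1, X2, X3, X4, X5, X6 = symbols('X1:7')

eqs = [
    Eq(-X1 + X4 - X5, 0),
    Eq(-X2 + X4, 0),
    Eq(-X3 + X4 - X5 + 28*X6, 0),
]

# Solve for X4, X5, X6, leaving X1, X2, X3 as free parameters.
print(solve(eqs, [X4, X5, X6]))
# Expected: X4 = X2, X5 = X2 - X1, X6 = (X3 - X1)/28
```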
 
  • #4
Moomax said:
Hi guys,

I am solving a problem of the form A^T x = 0, where A is a matrix of known numbers and I am solving for x. After performing row reduction and multiplying out A^T x, I am left with the following equations:


-X1 + X4 - X5 = 0
So X5 = X4 - X1

-X2 + X4 = 0
and X2 = X4

-X3 + X4 - X5 + 28X6 = 0
X3 = X4 - X5 + 28X6
   = X4 - (X4 - X1) + 28X6
   = X1 + 28X6

So X5, X2, and X3 can be written in terms of X1, X4, and X6.

Now take X1 = 1, X4 = X6 = 0. Then X5 = -1, X2 = 0, and X3 = 1. One solution is <1, 0, 1, 0, -1, 0>.

Take X4 = 1, X1 = X6 = 0. Then X5 = 1, X2 = 1, and X3 = 0. Another solution is <0, 1, 0, 1, 1, 0>.

Take X6 = 1, X1 = X4 = 0. Then X5 = 0, X2 = 0, and X3 = 28. A third solution is <0, 0, 28, 0, 0, 1>.

The three vectors <1, 0, 1, 0, -1, 0>, <0, 1, 0, 1, 1, 0>, and <0, 0, 28, 0, 0, 1> form a basis for the solution space.

From these equations, I am trying to find the linearly independent vectors; however, I am not sure how to do this. I tried pulling out a linear algebra book, but I couldn't make straightforward sense of the procedure. Can anyone help me out? Thanks!
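For anyone who wants to double-check the basis above, the following sketch (my own verification with SymPy, not part of the original post) confirms that each of the three vectors satisfies all three equations and that the vectors are linearly independent:

```python
from sympy import Matrix, zeros

# Coefficient matrix of the three reduced equations.
M = Matrix([
    [-1,  0,  0, 1, -1,  0],
    [ 0, -1,  0, 1,  0,  0],
    [ 0,  0, -1, 1, -1, 28],
])

# The three proposed basis vectors, as columns of a 6x3 matrix.
B = Matrix.hstack(
    Matrix([1, 0, 1, 0, -1, 0]),
    Matrix([0, 1, 0, 1, 1, 0]),
    Matrix([0, 0, 28, 0, 0, 1]),
)

print(M * B == zeros(3, 3))  # True: every basis vector solves the system
print(B.rank())              # 3: the three vectors are linearly independent
```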
 
  • #5
HallsofIvy is on the right track with what I mean, but I am still a little bit confused about the actual procedure.

For example, what made you simplify the equations the way you did? When I was playing around with them, trying to simplify, it never occurred to me to solve each one for a variable the way you did.

Also, after that was done, what made you pick the variables you did to set equal to 0 while the others were set to 1?
 
  • #6
I was just "solving" the equations. It would be nice if we could solve the equations for specific numbers (in which case the solution space would be the trivial subspace, containing only the zero vector), but here we can't, so we do the best we can. Since there are three (independent) equations, we can solve for three of the variables in terms of the other three. In this problem, ANY three can be solved for in terms of the other three; I just chose the easiest.

As for setting one variable equal to 1 and the others 0, that guarantees that the vectors you get will be independent.
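One way to see why: stack the three solution vectors as rows and look only at the columns belonging to the free variables X1, X4, and X6. Those columns form the 3x3 identity matrix, so no row can be a combination of the others. A small check of that pattern (SymPy again, my own illustration):

```python
from sympy import Matrix, eye

rows = Matrix([
    [1, 0,  1, 0, -1, 0],   # solution with X1 = 1, X4 = X6 = 0
    [0, 1,  0, 1,  1, 0],   # solution with X4 = 1, X1 = X6 = 0
    [0, 0, 28, 0,  0, 1],   # solution with X6 = 1, X1 = X4 = 0
])

# Keep only the columns for X1, X4, X6 (0-based indices 0, 3, 5).
sub = rows.extract([0, 1, 2], [0, 3, 5])
print(sub == eye(3))  # True: the identity pattern forces independence
```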
 

What is linear algebra and why is it important?

Linear algebra is a branch of mathematics that deals with linear equations and their representations in vector spaces. It is important because it provides a powerful framework for solving problems in many areas of science and engineering, including physics, computer graphics, and data analysis.

What are linearly independent vectors?

Linearly independent vectors are a set of vectors in which no vector can be written as a linear combination of the others. This means that each vector in the set contributes a new direction to the span, rather than duplicating what the other vectors already cover.
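For a small, made-up example: in R^2 the vectors (1, 0) and (0, 1) are independent, but adding (1, 1) makes the set dependent, since (1, 1) = (1, 0) + (0, 1). A rank check (using SymPy, as an assumption) makes this concrete:

```python
from sympy import Matrix

independent = Matrix([[1, 0], [0, 1]])          # 2 vectors, rank 2 -> independent
dependent   = Matrix([[1, 0], [0, 1], [1, 1]])  # 3 vectors, rank 2 -> dependent

print(independent.rank(), dependent.rank())  # 2 2
```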

Why is it important to find linearly independent vectors?

It is important to find linearly independent vectors because a linearly independent set that spans a vector space forms a basis for it, meaning that every vector in the space can be written in exactly one way in terms of the basis. This is useful in solving systems of linear equations, determining the rank of a matrix, and performing other operations in linear algebra.

How do you find linearly independent vectors?

To find linearly independent vectors, arrange the candidate vectors as the columns of a matrix and use Gaussian elimination to bring the matrix to reduced row-echelon form. The original vectors that correspond to the pivot columns of the reduced matrix are linearly independent.
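A sketch of that procedure (the vectors here are made up for illustration; SymPy is assumed): put the candidate vectors in as the columns of a matrix, row-reduce, and keep the vectors whose column indices come back as pivots.

```python
from sympy import Matrix

# Candidate vectors as columns: (1, 2, 3), (2, 4, 6), (0, 1, 0).
A = Matrix([
    [1, 2, 0],
    [2, 4, 1],
    [3, 6, 0],
])

rref_form, pivot_cols = A.rref()
print(pivot_cols)                            # (0, 2): the first and third columns are pivots
independent = [A[:, j] for j in pivot_cols]  # a maximal linearly independent subset
```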

Can a set of vectors be linearly independent in one vector space but not in another?

Yes, in the sense that linear independence is a property of the vectors together with the vector space they sit in. In particular, a space of dimension n can contain at most n linearly independent vectors, so three vectors that are linearly independent in a three-dimensional space cannot all remain independent once they are mapped (for example, projected) into a two-dimensional space.
