# Proving Linear Independence of u, v and w

In summary: By the definition of linear independence, a set of vectors is linearly independent if the only solution to au + bv + cw = 0 is a = b = c = 0. The "for some a, b, c in R" statement only asks for at least one solution, and a = b = c = 0 always works because 0 is in R, so the statement is true. The thread also discusses a related "for some" claim: if a is an eigenvalue of matrix A, then a is also an eigenvalue of matrix B for some elementary row operation taking A to B, which can be shown by choosing operations (such as two successive row swaps) that leave A unchanged, so that B = A.
annoymage

## Homework Statement

"If u, v, w are linearly independent then au + bv + cw = 0 for some a, b, c in R."

First of all, is this correct?

And how do I prove it if it is?

Should I show it like this:

I know 0u + 0v + 0w = 0,

so does that imply the statement is true,

because 0 is in R?

Is this the correct way to prove a "for some" statement?

annoymage said:

## Homework Statement

"If u, v, w are linearly independent then au + bv + cw = 0 for some a, b, c in R."
It's true. To see why, look at the definition of linear independence. A similar statement, "if u, v, w are linearly dependent then au + bv + cw = 0 for some a, b, c in R", is also true.
annoymage said:
First of all, is this correct?

And how do I prove it if it is?

Should I show it like this:

I know 0u + 0v + 0w = 0,
The equation above is true whether u, v, and w are linearly independent or linearly dependent.
annoymage said:
so does that imply the statement is true,

because 0 is in R?

Is this the correct way to prove a "for some" statement?

I see.

But (this is another question): what if I'm asked to show that a statement of the form "If ~~~~ then ~~~~ for some ~~~" is true?

I'm confused.

Example:

"If a is an eigenvalue of matrix A, then a is also an eigenvalue of matrix B for some elementary row operation taking A to B."

I don't know if my grammar is correct; anyway, here's an example.

Let A be a 2x2 matrix.

1. Reduce A by swapping row 1 and row 2.
2. Reduce the result by swapping row 1 and row 2 again.

Steps (1) and (2) are elementary row operations, and B has the same eigenvalues as A because B = A.

So does that imply that "if a is an eigenvalue of matrix A then a is also an eigenvalue of matrix B for some elementary row operation taking A to B" is true?

Is that the way to prove a "for some" statement? Help! T_T
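The two-swap example can be checked numerically. Here is a minimal sketch (the specific matrix and the use of NumPy are my own illustrative choices, not from the thread): swapping rows 1 and 2 twice returns the original matrix, so B = A and the two matrices trivially share eigenvalues.

```python
import numpy as np

# A hypothetical 2x2 matrix (not from the thread).
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])

# Elementary row operation: swap row 1 and row 2.
B = A[[1, 0], :]
# Apply the same swap again -- this undoes the first one.
B = B[[1, 0], :]

# B equals A, so B has exactly the same eigenvalues as A.
same = np.array_equal(A, B)
eigs_A = np.sort(np.linalg.eigvals(A))
eigs_B = np.sort(np.linalg.eigvals(B))
```

Note that this only proves the "for some" claim by exhibiting one particular pair of row operations; general elementary row operations do not preserve eigenvalues.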

Let's take another look at your original question: "if u, v, w are linearly independent then au + bv + cw = 0 for some a, b, c in R".

What this means is that, if u, v, and w are linearly independent, then the equation au + bv + cw = 0 has at least one solution for the constants a, b, and c.

It's also true, but not stated above, that for three linearly independent vectors, there is exactly one such solution; namely, a = b = c = 0.
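As a quick numerical illustration of both points (the specific vectors are my own hypothetical choice): the trivial coefficients a = b = c = 0 always satisfy the equation, and for linearly independent vectors the homogeneous system has only that solution.

```python
import numpy as np

# Hypothetical linearly independent vectors in R^3.
u = np.array([1.0, 0.0, 0.0])
v = np.array([1.0, 1.0, 0.0])
w = np.array([1.0, 1.0, 1.0])

# The trivial choice a = b = c = 0 always solves au + bv + cw = 0,
# which is all the "for some a, b, c" statement requires.
a = b = c = 0.0
zero_combo = a * u + b * v + c * w

# For independent vectors it is also the ONLY solution: the matrix with
# columns u, v, w has full rank, so its null space is just {0}.
M = np.column_stack([u, v, w])
rank = np.linalg.matrix_rank(M)
```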

## 1. What does it mean for u, v and w to be linearly independent?

Linear independence means that no vector in the set can be written as a linear combination of the other vectors. Each vector contributes a direction the others cannot produce, so every vector in the set is needed to span the subspace the set generates.

## 2. How do I prove that u, v and w are linearly independent?

To prove that u, v and w are linearly independent, you can use the linear dependence test. This involves setting up a linear combination of the three vectors, with coefficients a, b, and c, and setting it equal to the zero vector. If the only solution to this equation is a=0, b=0, and c=0, then the vectors are linearly independent.
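The test above can be sketched in code (the example vectors are my own, chosen for illustration): build the matrix whose columns are u, v, and w, and inspect the solutions of the homogeneous system. Here w = u - 2v, so a nontrivial solution exists and the set is linearly dependent.

```python
import numpy as np

# Hypothetical example: w = u - 2v, so {u, v, w} is linearly dependent.
u = np.array([1.0, 2.0, 0.0])
v = np.array([0.0, 1.0, 1.0])
w = np.array([1.0, 0.0, -2.0])

M = np.column_stack([u, v, w])

# Full rank (3) would mean only the trivial solution a = b = c = 0;
# here the rank is 2, so a nontrivial solution exists.
rank = np.linalg.matrix_rank(M)

# One nontrivial solution: a = 1, b = -2, c = -1.
combo = 1 * u - 2 * v - 1 * w
```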

## 3. Can u, v and w be linearly independent in a higher-dimensional space?

Yes, u, v and w can be linearly independent in any vector space, regardless of the dimension. The concept of linear independence applies to any number of vectors in any vector space. However, the process of proving linear independence may become more complex in higher-dimensional spaces.

## 4. Are there any other methods for proving linear independence?

Yes, there are other methods for proving linear independence, such as the determinant test and the reduced row echelon form test. These methods involve using matrices and their properties to determine if the vectors are linearly independent. However, the linear dependence test is the most commonly used method.
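For three vectors in R^3 both alternative tests are easy to run numerically (the example matrix is hypothetical): a nonzero determinant of the matrix whose columns are the vectors, or full rank after row reduction, each certify linear independence.

```python
import numpy as np

# Columns are three hypothetical vectors in R^3.
M = np.array([[1.0, 1.0, 1.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 1.0]])

# Determinant test: for n vectors in R^n, det != 0 <=> linearly independent.
det = np.linalg.det(M)  # nonzero here, so the columns are independent

# Rank test (the numerical analogue of reduced row echelon form):
# rank equal to the number of vectors <=> linearly independent.
rank = np.linalg.matrix_rank(M)
```

The determinant test only applies when the number of vectors equals the dimension of the space; the rank test works for any number of vectors.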

## 5. Why is proving linear independence important in mathematics and science?

Proving linear independence is important in mathematics and science because it allows us to understand the relationships between vectors and the structure of vector spaces. It is also a fundamental concept in linear algebra and has many practical applications, such as in solving systems of equations and determining the basis of a vector space.
