Define independence of vectors

Deimantas

Homework Statement



It roughly translates to this: "Define the linear dependence of the vectors." I hope that makes sense.

"1, x, sin x; x is defined on (-∞, +∞)."

Homework Equations


The Attempt at a Solution



I believe that to solve this problem, I have to turn these vectors into a quadratic polynomial, something like t^2 + t + 1. However, I have no idea what substitution I should use to turn it into a polynomial. I checked the common trig substitutions and didn't find anything helpful.
 
Linear independence of, say, 3 vectors is defined as follows: if the equation

c_1\vec{v}_1+c_2\vec{v}_2+c_3\vec{v}_3 = 0

has only the trivial solution c_1 = c_2 = c_3 = 0, then the vectors are linearly independent.
Dependence is the opposite. So if you are to prove dependence, find one nontrivial solution to the equation.
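As a quick numerical sanity check of this definition (a sketch using numpy; the three sample points below are arbitrary choices of mine): if the matrix of sampled function values has full column rank, no nontrivial combination can vanish at those points, let alone for all x.

```python
import numpy as np

# Sample each candidate function at a few points. If the sample matrix
# has full column rank, then no nontrivial (c1, c2, c3) makes
# c1*1 + c2*x + c3*sin(x) vanish even at these points, so the three
# functions are linearly independent. (A rank deficit would not by
# itself prove dependence -- it might just mean bad sample points.)
xs = np.array([0.5, 1.0, 2.0])           # arbitrary sample points
M = np.column_stack([np.ones_like(xs),   # f1(x) = 1
                     xs,                 # f2(x) = x
                     np.sin(xs)])        # f3(x) = sin x
print(np.linalg.matrix_rank(M))          # 3 -> independent
```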
 
But how can I solve such an equation

c_1(1)+c_2(x)+c_3(\sin x)=0

where the vector elements are variables (functions) instead of the usual numbers?
I'm pretty sure the vectors will turn out to be independent, but I don't know how to prove it in this case...
 
Deimantas said:
But how can I solve such an equation

c_1(1)+c_2(x)+c_3(\sin x)=0

where the vector elements are variables (functions) instead of the usual numbers?
I'm pretty sure the vectors will turn out to be independent, but I don't know how to prove it in this case...

You need to determine whether there are constants (c_1,c_2,c_3) \neq (0,0,0) that make the equation c_1 \cdot 1 + c_2 \cdot x + c_3 \cdot \sin x = 0 hold for all x. Just imagine you could find such c_i, and then try to work out a contradiction with some known facts.

RGV
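A different route from the contradiction argument, in case it helps: the Wronskian. If the Wronskian of the three functions is nonzero at even one point of the interval, the functions are linearly independent there. A sympy sketch:

```python
import sympy as sp

# Wronskian of 1, x, sin x: the determinant of the matrix whose k-th
# row holds the k-th derivatives of the three functions.
x = sp.symbols('x')
funcs = [sp.Integer(1), x, sp.sin(x)]
W = sp.Matrix([[f.diff(x, k) for f in funcs] for k in range(3)]).det()
print(sp.simplify(W))        # -sin(x)
print(W.subs(x, sp.pi / 2))  # -1: nonzero, so the functions are independent
```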
 
What is the exact statement of the problem? You say "Define linear dependence of vectors", but that has nothing to do with "1, x, sin(x)". Are you trying to show that 1, x, and sin(x) are independent as members of the space of continuous functions? Neither of those questions has anything to do with polynomials.
 
HallsofIvy said:
What is the exact statement of the problem? You say "Define linear dependence of vectors", but that has nothing to do with "1, x, sin(x)". Are you trying to show that 1, x, and sin(x) are independent as members of the space of continuous functions? Neither of those questions has anything to do with polynomials.

Yes, I may have stated the problem in the wrong way.
Look at this exercise:

Is the system of functions sin x, cos^2(x), 1 independent? The interval for x is (0, 2π).

λ_1(\sin x)+λ_2(\cos^2 x)+λ_3(1)=0 for all x in (0, 2π).

Since cos^2(x) = 1 - sin^2(x), the equation becomes

λ_1(\sin x)-λ_2(\sin^2 x)+(λ_2+λ_3)=0 for all x in (0, 2π).

Let sin x = t and λ_2+λ_3 = α; then the equation is

-λ_2 t^2+λ_1 t+α=0 for all t in (-1, 1).

That's a polynomial, and a polynomial that vanishes on a whole interval must have all coefficients zero, so λ_2 = 0, λ_1 = 0, and α = 0, hence λ_3 = 0; therefore the system of functions is independent.

I was hoping I could do the same with 1, x, sin x somehow; that's why I mentioned polynomials. But I don't see a way to apply the same technique in my case.
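The polynomial step in that exercise can also be checked symbolically. A sympy sketch (the symbol names are my own):

```python
import sympy as sp

# A polynomial that vanishes identically on an interval must have all
# coefficients equal to zero. Setting every coefficient of
# -l2*t**2 + l1*t + (l2 + l3) to zero forces the trivial solution.
t, l1, l2, l3 = sp.symbols('t lambda1 lambda2 lambda3')
p = -l2 * t**2 + l1 * t + (l2 + l3)
sol = sp.solve(sp.Poly(p, t).coeffs(), [l1, l2, l3])
print(sol)  # all three lambdas must be zero
```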
 
Deimantas said:
Yes, I may have stated the problem in the wrong way.
Look at this exercise:

Is the system of functions sin x, cos^2(x), 1 independent? The interval for x is (0, 2π).

λ_1(\sin x)+λ_2(\cos^2 x)+λ_3(1)=0 for all x in (0, 2π).

Since cos^2(x) = 1 - sin^2(x), the equation becomes

λ_1(\sin x)-λ_2(\sin^2 x)+(λ_2+λ_3)=0 for all x in (0, 2π).

Let sin x = t and λ_2+λ_3 = α; then the equation is

-λ_2 t^2+λ_1 t+α=0 for all t in (-1, 1).

That's a polynomial, and a polynomial that vanishes on a whole interval must have all coefficients zero, so λ_2 = 0, λ_1 = 0, and α = 0, hence λ_3 = 0; therefore the system of functions is independent.

I was hoping I could do the same with 1, x, sin x somehow; that's why I mentioned polynomials. But I don't see a way to apply the same technique in my case.

I already told you in my first response what you need to do. Looking for polynomials would not be helpful at all, and I don't know why you would want to do it.

RGV
 
Ray Vickson said:
I already told you in my first response what you need to do. Looking for polynomials would not be helpful at all, and I don't know why you would want to do it.

RGV

I somehow didn't manage to notice your first post, my apologies. I'll try to work it out, thanks.
 
Ray Vickson said:
You need to determine whether there are constants (c_1,c_2,c_3) \neq (0,0,0) that make the equation c_1 \cdot 1 + c_2 \cdot x + c_3 \cdot \sin x = 0 hold for all x. Just imagine you could find such c_i, and then try to work out a contradiction with some known facts.

RGV

Well, I couldn't find any c's, other than zeros, that would make the equation hold for all x; there are only isolated cases, like x=3π/2, c_1=1, c_2=0, c_3=1. That tells me the functions are linearly independent. But I don't know how to write it up so that it would satisfy the teacher. How should I have solved the equation c_1(1)+c_2(x)+c_3(\sin x)=0 in a reasonable way?
 
Deimantas said:
Well, I couldn't find any c's, other than zeros, that would make the equation hold for all x; there are only isolated cases, like x=3π/2, c_1=1, c_2=0, c_3=1. That tells me the functions are linearly independent. But I don't know how to write it up so that it would satisfy the teacher. How should I have solved the equation c_1(1)+c_2(x)+c_3(\sin x)=0 in a reasonable way?

IF there are constants c_i, not all zero, that make the above equation an identity in x, what must happen? Say c_3 ≠ 0; then we can divide by c_3 to get
\sin(x) = -\frac{c_1}{c_3} - \frac{c_2}{c_3} x \text{ for all } x \in \mathbb{R}, and what would that mean? Since we get a ridiculous conclusion, we cannot have c_3 nonzero. Thus we need c_3 = 0. Now look at c_1 + c_2 x = 0 for all x. What does that say about c_1 and c_2?

Note: the equation may hold for some x, even for a lot of different values of x, but that is not the point. The crucial issue is whether or not it holds for ALL x.

RGV
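One way to finish this argument, filling in the rhetorical questions (boundedness of sine is the "known fact" being contradicted):

```latex
% If c_3 \neq 0, divide through by c_3:
\[
  \sin x = -\frac{c_1}{c_3} - \frac{c_2}{c_3}\,x
  \quad \text{for all } x \in \mathbb{R}.
\]
% The left side is bounded and non-constant; the right side is linear,
% hence unbounded if c_2 \neq 0 and constant if c_2 = 0. Contradiction,
% so c_3 = 0. Then c_1 + c_2 x = 0 for all x, and
\[
  x = 0 \;\Rightarrow\; c_1 = 0, \qquad x = 1 \;\Rightarrow\; c_2 = 0,
\]
% so only the trivial solution exists: 1, x, \sin x are independent.
```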
 