Linear Algebra - Linearly Independent Functions

In summary: the projection approach fails because the vectors aren't orthogonal, so a vector need not be the sum of its projections onto the other vectors; evaluating the functions at several values of t does work.
  • #1
explorer58
Forgive me ahead of time, I don't really know how to use LaTeX, (it's on my to do list).

Homework Statement


Given the vector space C([0,pi]) of continuous, real-valued functions on that interval, with the inner product <f,g> = integral from 0 to pi of f(t)g(t) dt:

a) Prove the set S={cost, sint, 1, t} is linearly independent.

b) Apply the Gram-Schmidt process to S to get an orthonormal basis of Span(S)


Homework Equations


proj=(<u,v>/||v||²)*v


The Attempt at a Solution


I get that (b) is easy, just a plug and chug.

My question is about a. Of course, any set (and specifically this set for argument's sake) is linearly independent if and only if

a*cost+b*sint+c*1+d*t=0, only for (a,b,c,d)=(0,0,0,0)

Or equivalently, if any of the vectors can be expressed as a linear combination of the others. For example:

cost=a*sint+b*1+c*t

So, I need the best way to prove they are linearly independent. Two ideas passed through my mind. The first (which I'm not positive is correct) is that the vector would be the sum of its projections on the other vectors. So

a=<cost,sint>/<sint,sint> * sint
b=<cost,1>/<1,1> * 1

etc.

And the other idea was to evaluate the functions at different values of t, and prove that a, b, and c could not possibly be constants, and therefore the functions are not linear combinations of each other.

Are either of these the right way to go about this?

Thanks for your help ahead of time, and sorry for my lack of LaTeX knowledge.
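
The second idea above can be sanity-checked numerically. A minimal sketch (numpy assumed; the four sample points are my choice, not from the problem): evaluate the four functions at four points of [0, pi], and if the resulting 4x4 matrix is nonsingular, the only coefficients that give the zero function at those points are all zero.

```python
import numpy as np

# Sample cos(t), sin(t), 1, t at four points of [0, pi] (points chosen arbitrarily).
ts = np.array([0.0, np.pi / 4, np.pi / 2, np.pi])
M = np.column_stack([np.cos(ts), np.sin(ts), np.ones_like(ts), ts])

# Nonsingular M means a*cos(t) + b*sin(t) + c + d*t = 0 at these points
# forces (a, b, c, d) = (0, 0, 0, 0), so S is linearly independent.
print(abs(np.linalg.det(M)) > 1e-12)  # True
```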
 
  • #2
Your second method is good.
 
  • #3
So then the first one is bad?

Also, would I need to do it for all vectors? Or would it suffice to show it for just one?

Thanks.
 
  • #4
The first method is problematic because the vectors aren't orthogonal. You can't assume that a vector would be the sum of its projections onto the basis vectors.
 
  • #5
Ah, that makes sense. I thought it was right because I was doing examples in my head in R3; for example, (1,1,1) is the sum of its projections onto the standard basis. But then I realized that didn't work in reverse.

Thanks!
 
  • #6
explorer58 said:
Forgive me ahead of time, I don't really know how to use LaTeX, (it's on my to do list).

Homework Statement


Given the vector space C([0,pi]) of continuous, real-valued functions on that interval, with the inner product <f,g> = integral from 0 to pi of f(t)g(t) dt:

a) Prove the set S={cost, sint, 1, t} is linearly independent.

b) Apply the Gram-Schmidt process to S to get an orthonormal basis of Span(S)


Homework Equations


proj=(<u,v>/||v||²)*v


The Attempt at a Solution


I get that (b) is easy, just a plug and chug.

My question is about a. Of course, any set (and specifically this set for argument's sake) is linearly independent if and only if

a*cost+b*sint+c*1+d*t=0, only for (a,b,c,d)=(0,0,0,0)

Or equivalently, if any of the vectors can be expressed as a linear combination of the others.
The second part is NOT equivalent to the first. If any vector can be expressed as a linear combination of the others, the set is linearly dependent, not linearly independent.
explorer58 said:
For example:

cost=a*sint+b*1+c*t
If this equation holds for all values of t in [0, pi] for some constants a, b, and c, then the set {cos(t), sin(t), 1, t} is linearly dependent.
explorer58 said:
So, I need the best way to prove they are linearly independent. Two ideas passed through my mind. The first (which I'm not positive is correct) is that the vector would be the sum of its projections on the other vectors. So

a=<cost,sint>/<sint,sint> * sint
b=<cost,1>/<1,1> * 1

etc.

And the other idea was to evaluate the functions at different values of t, and prove that a, b, and c could not possibly be constants, and therefore the functions are not linear combinations of each other.

Are either of these the right way to go about this?

Thanks for your help ahead of time, and sorry for my lack of LaTeX knowledge.
 
  • #7
Gah, the first part was just a slip of the tongue. It was supposed to say cannot be expressed.
 
  • #8
explorer58 said:
Forgive me ahead of time, I don't really know how to use LaTeX, (it's on my to do list).

Homework Statement


Given the vector space C([0,pi]) of continuous, real-valued functions on that interval, with the inner product <f,g> = integral from 0 to pi of f(t)g(t) dt:

a) Prove the set S={cost, sint, 1, t} is linearly independent.

b) Apply the Gram-Schmidt process to S to get an orthonormal basis of Span(S)


Homework Equations


proj=(<u,v>/||v||²)*v


The Attempt at a Solution


I get that (b) is easy, just a plug and chug.

My question is about a. Of course, any set (and specifically this set for argument's sake) is linearly independent if and only if

a*cost+b*sint+c*1+d*t=0, only for (a,b,c,d)=(0,0,0,0)

Or equivalently, if any of the vectors can be expressed as a linear combination of the others.
No, not "equivalently". The first is, as you say "linearly independent" but the second characterizes "linearly dependent"!

For example:

cost=a*sint+b*1+c*t

So, I need the best way to prove they are linearly independent. Two ideas passed through my mind. The first (which I'm not positive is correct) is that the vector would be the sum of its projections on the other vectors. So

a=<cost,sint>/<sint,sint> * sint
b=<cost,1>/<1,1> * 1

etc.
That method is unreliable here: as noted above, the vectors aren't orthogonal, so a vector need not equal the sum of its projections onto the others. It is also an awful lot of unnecessary computation.

And the other idea was to evaluate the functions at different values of t, and prove that a, b, and c could not possibly be constants, and therefore the functions are not linear combinations of each other.
Yes, that would work. But note that the full equation a cos(t) + b sin(t) + c + d t = 0 has four unknowns, a, b, c, and d, so you need four equations and therefore four different values of t.
Taking t = 0, that becomes a + c = 0.
Taking [itex]t= \pi/2[/itex], that becomes [itex]b+ c+ d\pi/2= 0[/itex].
Taking [itex]t= \pi[/itex], that becomes [itex]-a+ c+ d\pi= 0[/itex].
Taking [itex]t= \pi/4[/itex], that becomes [itex](a+ b)\sqrt{2}/2+ c+ d\pi/4= 0[/itex].
The only solution of this system is a = b = c = d = 0.

Are either of these the right way to go about this?

Thanks for your help ahead of time, and sorry for my lack of LaTeX knowledge.
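
Plugging values of t into a*cos(t) + b*sin(t) + c + d*t = 0 gives a linear system in the coefficients. A sketch with sympy (my illustration, not from the thread; the sample points t = 0, pi/4, pi/2, pi are my choice) confirms the only solution is the trivial one:

```python
import sympy as sp

a, b, c, d, t = sp.symbols('a b c d t')
expr = a * sp.cos(t) + b * sp.sin(t) + c + d * t

# Impose the identity at four sample points in [0, pi].
eqs = [expr.subs(t, val) for val in (0, sp.pi / 4, sp.pi / 2, sp.pi)]
sol = sp.solve(eqs, [a, b, c, d])

# Only the trivial solution exists, so S is linearly independent.
print(all(v == 0 for v in sol.values()))  # True
```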
 

1. What is the definition of linearly independent functions?

Linearly independent functions are a set of functions in which none can be expressed as a linear combination of the others. This means that no function in the set can be written as a weighted sum of scalar multiples of the other functions.

2. How do you determine if a set of functions is linearly independent?

A set of functions is linearly independent if the only way to satisfy the equation a1f1(x) + a2f2(x) + ... + anfn(x) = 0 is for all the coefficients a1, a2, ..., an to be equal to 0.
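
One concrete way to apply this test, sketched under the assumption that the inner product is the thread's <f,g> = integral from 0 to pi of f(t)g(t) dt (numpy and scipy assumed): the functions are linearly independent exactly when their Gram matrix of pairwise inner products is nonsingular.

```python
import numpy as np
from scipy.integrate import quad

# Pairwise inner products <f, g> = integral of f(t)*g(t) dt from 0 to pi.
fs = [np.cos, np.sin, lambda t: 1.0, lambda t: t]
G = np.array([[quad(lambda t: f(t) * g(t), 0, np.pi)[0] for g in fs]
              for f in fs])

# The Gram determinant is nonzero iff the functions are linearly independent.
print(abs(np.linalg.det(G)) > 1e-9)  # True
```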

3. What is the significance of linearly independent functions in linear algebra?

In linear algebra, linearly independent functions play a crucial role in creating a basis for a vector space. A basis is a set of linearly independent vectors that can be used to represent any vector in the vector space. This allows for easier computation and analysis of vectors and their operations.
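
Relatedly, the Gram-Schmidt step in part (b) of the thread can be sketched numerically, using a discretized approximation of the thread's inner product (numpy assumed; the grid resolution is my choice):

```python
import numpy as np

# Discretize [0, pi] so that <f, g> = integral of f*g dt is approximated
# by a Riemann sum: dot(f, g) * dt.
t = np.linspace(0, np.pi, 10001)
dt = t[1] - t[0]
inner = lambda f, g: np.dot(f, g) * dt

# Gram-Schmidt on S = {cos t, sin t, 1, t}.
S = [np.cos(t), np.sin(t), np.ones_like(t), t.copy()]
basis = []
for f in S:
    for e in basis:
        f = f - inner(f, e) * e             # subtract projections onto earlier vectors
    basis.append(f / np.sqrt(inner(f, f)))  # normalize

# The result is orthonormal with respect to the discrete inner product.
G = np.array([[inner(e1, e2) for e2 in basis] for e1 in basis])
print(np.allclose(G, np.eye(4), atol=1e-3))  # True
```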

4. Can a set of functions be linearly independent in one vector space but dependent in another?

Yes, the answer can change with the setting. Linear independence is defined relative to a particular vector space and field of scalars: for example, functions that are linearly independent on one interval may become linearly dependent when restricted to a smaller domain, and changing the field of scalars can also change the answer.

5. How does the concept of linear independence relate to linear transformations?

Linear independence underlies how linear transformations are represented: a linear transformation is completely determined by its values on a basis, which is a linearly independent spanning set, and its matrix with respect to that basis encodes those values. This simplifies operations and calculations involving linear transformations.
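
To illustrate the point that a linear map is pinned down by where it sends basis vectors (a hypothetical numpy sketch, not part of the thread; the map T is made up for illustration):

```python
import numpy as np

# A linear map T on R^2 is determined by its values on a basis.
# Say T(e1) = (2, 1) and T(e2) = (0, 3); these images are the columns.
T = np.column_stack([(2.0, 1.0), (0.0, 3.0)])

# Applying T to any vector is then just matrix multiplication.
v = np.array([1.0, 1.0])
print(T @ v)  # [2. 4.]
```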
