On the definition of linear independence and dependence


Homework Help Overview

The discussion revolves around the definitions of linear independence and dependence within the context of vector spaces and polynomial spaces. The original poster questions whether the standard definitions apply similarly to polynomial spaces and explores the implications of these definitions.

Discussion Character

  • Conceptual clarification, Assumption checking

Approaches and Questions Raised

  • The original poster attempts to define linear independence and dependence for polynomial spaces, questioning if these definitions align with those in vector spaces. Some participants discuss the nuances of these definitions, particularly regarding the conditions under which linear dependence is determined.

Discussion Status

Participants are engaged in clarifying the definitions and implications of linear independence and dependence in polynomial spaces. Some guidance has been offered regarding the nature of solutions to the equations involved, but there is no explicit consensus on the definitions being correct or incorrect.

Contextual Notes

There is an ongoing discussion about the nature of coefficients in polynomial expressions and their relationship to the functions themselves, as well as the implications of defining linear dependence based on specific values versus all values.

cartonn30gel
This is not really a homework problem but it relates to a number of problems, so I thought this would be the most appropriate place to post it.

Homework Statement



The basic question is about how we define linear dependence in a vector space. For a vector space over some field [tex]\mathbb{F}[/tex], we know that the vectors [tex]v_1,v_2,...,v_n[/tex] are linearly independent if the only solution to [tex]a_1v_1+a_2v_2+...+a_nv_n= 0[/tex] is [tex]a_1=a_2=...=a_n=0[/tex]. And if some list of vectors are not linearly independent, they are linearly dependent. This means if we can find constants [tex]b_1,..., b_n[/tex] that are NOT ALL ZERO where [tex]b_1v_1+b_2v_2+...+b_nv_n=0[/tex], these vectors are linearly dependent.
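As a quick sanity check of this definition, here is a small numerical sketch (the vectors are my own example, not from any particular problem): three vectors in [tex]\mathbb{R}^3[/tex] are linearly independent exactly when the matrix having them as columns has full column rank, since full rank means [tex]a_1v_1+a_2v_2+a_3v_3=0[/tex] forces [tex]a_1=a_2=a_3=0[/tex].

```python
import numpy as np

# Sketch with example vectors of my own choosing:
# v1 = (1,0,0), v2 = (1,1,0), v3 = (1,1,1).
# They are linearly independent iff the matrix with these vectors as
# columns has full column rank, i.e. a1*v1 + a2*v2 + a3*v3 = 0 has
# only the trivial solution a1 = a2 = a3 = 0.
A = np.column_stack([[1, 0, 0], [1, 1, 0], [1, 1, 1]])
print(np.linalg.matrix_rank(A) == 3)  # True: only the trivial solution
```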

Now when we think about the polynomial space (let's say over some field) [tex]P_m(\mathbb{F})[/tex], we need to reconsider these definitions, right? I have never seen a different definition of linear independence for the polynomial space, but I'm assuming it would be like this:
The vectors [tex]p_1(z),...,p_m(z)[/tex] are linearly independent if the only solution to [tex]c_1p_1(z)+...+c_mp_m(z)=0[/tex] FOR ALL z is [tex]c_1=...=c_m=0[/tex].

And linear dependence would be this: If we can find constants [tex]c_1, ..., c_m[/tex] that are not all zero FOR SOME z where [tex]c_1p_1(z)+...+c_mp_m(z)=0[/tex], these vectors are linearly dependent.

Notice that if linear dependence on the polynomial space is defined this way, it is actually the exact negation of linear independence in the polynomial space. But if linear dependence is defined in the similar way but FOR ALL z, then it is NOT the exact negation of linear independence in the polynomial space. And then, we start encountering vectors that are neither linearly independent nor linearly dependent.
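To make the gap between the two candidate definitions concrete, here is a small numerical sketch (the polynomials [tex]p_1(z)=1[/tex] and [tex]p_2(z)=z[/tex] are my own example): with [tex]c_1=1, c_2=-1[/tex], not all zero, the combination vanishes at [tex]z=1[/tex], so a "for some z" definition would call these dependent, even though the combination is not the zero function.

```python
# Sketch: p1(z) = 1, p2(z) = z (my own example polynomials).
def combo(c1, c2, z):
    return c1 * 1 + c2 * z

# c1 = 1, c2 = -1 are not all zero, and the combination is 0 at z = 1:
print(combo(1, -1, 1.0))  # 0.0
# but it is nonzero at z = 2, so c1*p1 + c2*p2 is not the zero
# function, and p1, p2 are in fact linearly independent.
print(combo(1, -1, 2.0))  # -1.0
```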

Please try to shed some light on this. Are my definitions of linear independence and dependence for the polynomial space correct?

Homework Equations


Known definitions of linear independence and dependence are given above.

The Attempt at a Solution


There isn't really a solution. It is just a question about definitions.
 
When you see an equation like [tex]c_{1}f_{1}(z) + ... + c_{n}f_{n}(z) = c[/tex] where [tex]c[/tex] is some constant, then [tex]c[/tex] is taken to be the constant function [tex]c(z) = c[/tex]. This takes care of all of the details, because [tex]f = 0[/tex] means that [tex]f[/tex] is the zero function, i.e. [tex]f(z) = 0[/tex] for all [tex]z[/tex]; if [tex]f[/tex] is non-zero on at least one [tex]z[/tex], then it is not the zero function.
 
Looks fine to me. You seem to get the fine point that for a linearly independent set of vectors, this equation has only one solution.
[tex]a_1v_1+a_2v_2+...+a_nv_n= 0[/tex]

The same equation for an arbitrary set of vectors, whether linearly independent or linearly dependent, always has the solution [tex]a_1 = a_2 = ... = a_n = 0[/tex]. The key difference, and one that students have a hard time with, is whether this solution is the only one.

Your definition for linear independence in a function space is fine, too. A key point there is that the equation has to hold for all values of the variable.
 
aPhilosopher,
I can't see how what you said is relevant in this case. I am talking about the specific coefficients that multiply the polynomials. Are they functions of z as well? (I think they aren't because it is called "scalar" multiplication) Can you give some more details please?

And Mark44,
so linear dependence in the vector space of polynomials is still the exact negation of linear independence. And its definition is:

If we can find constants [tex]c_1, ..., c_m[/tex] that are not all zero FOR SOME z where [tex]c_1p_1(z)+...+c_mp_m(z)=0[/tex], these vectors are linearly dependent.

Correct?

If it is correct, I will post an example question and my solution to it making use of this fact.
 
cartonn30gel said:
aPhilosopher,
I can't see how what you said is relevant in this case. I am talking about the specific coefficients that multiply the polynomials. Are they functions of z as well? (I think they aren't because it is called "scalar" multiplication) Can you give some more details please?

And Mark44,
so linear dependence in the vector space of polynomials is still the exact negation of linear independence. And its definition is:

If we can find constants [tex]c_1, ..., c_m[/tex] that are not all zero FOR SOME z where [tex]c_1p_1(z)+...+c_mp_m(z)=0[/tex], these vectors are linearly dependent.

Correct?
"... these functions are linearly dependent." Minor point, as functions in a function space are the counterparts of vectors in a vector space.
cartonn30gel said:
If it is correct, I will post an example question and my solution to it making use of this fact.
 
cartonn30gel said:
aPhilosopher,
I can't see how what you said is relevant in this case. I am talking about the specific coefficients that multiply the polynomials. Are they functions of z as well? (I think they aren't because it is called "scalar" multiplication) Can you give some more details please?

I noticed that my notation was a bit confusing, so I edited my previous post. The coefficients [tex]c_{i}[/tex] are constant but the polynomials [tex]f_{i}[/tex] are not. So when you take the linear combination [tex]\sum_{i} c_{i}f_{i}[/tex], the result is a function. If you set this equal to something, that something had better be a function as well. So the functions [tex]f_{i}[/tex] are linearly dependent if there exist [tex]c_{i}[/tex], not all zero, such that [tex]\sum_{i} c_{i}f_{i}[/tex] is the zero function. A function is the zero function if and only if it is zero on every element of its domain. It is not the zero function if it is non-zero on at least one element of its domain.

The point is that all the business about "at least"/"for all" is already taken care of in the definition of the zero function and the realization that zero means the zero function in this context.
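This reduction can also be made computational. Over an infinite field like [tex]\mathbb{R}[/tex], a polynomial is the zero function if and only if all of its coefficients are zero, so checking linear independence of polynomials reduces to checking linear independence of their coefficient vectors. A small sketch (the example polynomials are my own choice, not from the thread):

```python
import numpy as np

# Sketch: rows are coefficient vectors (constant, z, z^2) of my own
# example polynomials. Over R, sum_i c_i*p_i is the zero FUNCTION iff
# every coefficient of the combined polynomial is zero, so the p_i are
# linearly independent iff the coefficient matrix has full row rank.
P = np.array([
    [1,  1, 0],  # p1(z) = 1 + z
    [1, -1, 0],  # p2(z) = 1 - z
    [0,  0, 1],  # p3(z) = z^2
])
print(np.linalg.matrix_rank(P) == 3)  # True: p1, p2, p3 independent
```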

That being said, it is a good observation.
 
