Linear independence


by sncum
Tags: independence, linear
sncum
#1
Mar14-12, 10:05 AM
P: 14
Can anyone please explain the term linear independence? And when do we say that a set of functions is linearly independent?
mathman
#2
Mar14-12, 03:25 PM
Sci Advisor
P: 5,941
You need to supply context. Usually linear independence refers to a set of vectors or a set of functions. These things are called linearly dependent if some nontrivial linear combination (one with at least one nonzero coefficient) equals 0. If not, they are linearly independent.
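For a concrete example in ##\mathbb{R}^2##: the vectors ##(1,0)## and ##(2,0)## are linearly dependent, because
$$2\,(1,0)-(2,0)=(0,0)$$
is a linear combination with nonzero coefficients that equals 0. The vectors ##(1,0)## and ##(0,1)## are linearly independent, because ##a_1(1,0)+a_2(0,1)=(a_1,a_2)=(0,0)## forces ##a_1=a_2=0##.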
sncum
#3
Mar15-12, 08:02 AM
P: 14
$$\mathbf A = c_1 x\,\hat i + c_2 y\,\hat j + c_3 z\,\hat k$$
When do we say it is linearly independent?
Also, my friend argues that orthonormality implies linear independence, but I was not convinced. Please help.

mathman
#4
Mar15-12, 03:22 PM
Sci Advisor
P: 5,941

Quote by sncum:
$$\mathbf A = c_1 x\,\hat i + c_2 y\,\hat j + c_3 z\,\hat k$$
When do we say it is linearly independent?
Also, my friend argues that orthonormality implies linear independence, but I was not convinced. Please help.
I don't understand your first line.

However, your friend is correct: if two vectors are orthogonal (and neither of them is 0), they are linearly independent. Note that the converse is not true.
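For example, ##(1,0)## and ##(1,1)## are linearly independent (neither is a multiple of the other), yet they are not orthogonal, since
$$\langle (1,0),(1,1)\rangle = 1\neq 0.$$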
Fredrik
#5
Mar15-12, 08:00 PM
Emeritus
Sci Advisor
PF Gold
P: 9,017
Quote by sncum:
My friend argues that orthonormality implies linear independence, but I was not convinced. Please help.
A set E is said to be linearly independent if for all finite subsets ##\{x_1,\dots,x_n\}\subset E## and all ##a_1,\dots,a_n\in\mathbb C##,
$$\sum_{k=1}^n a_k x_k=0\quad \Rightarrow\quad a_1=\dots=a_n=0.$$
Suppose that E is orthonormal. Let ##\{e_k\}_{k=1}^n## be an arbitrary finite subset of E, and suppose that ##\sum_{k=1}^n a_k e_k=0##. Then for all ##i\in\{1,\dots,n\}##,
$$0=\langle e_i,0\rangle=\langle e_i,\sum_k a_k e_k\rangle=\sum_k a_k\langle e_i,e_k\rangle=\sum_k a_k\delta_{ik}=a_i.$$
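To see the argument in a concrete case: the standard basis ##\{e_1,e_2,e_3\}## of ##\mathbb R^3## is orthonormal, and if ##a_1e_1+a_2e_2+a_3e_3=0##, then taking the inner product with ##e_2## picks out ##a_2##:
$$0=\langle e_2,\ a_1e_1+a_2e_2+a_3e_3\rangle=a_2,$$
and likewise each ##a_i=0##.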
homeomorphic
#6
Mar15-12, 11:41 PM
P: 1,049
Geometrically, linear independence means each vector contributes something to the span of the vectors.

So, if you have one vector, it spans a 1-dimensional subspace. If you add another vector to the set, you get a linearly dependent set if the span doesn't get any bigger. So, two vectors are linearly dependent if one is a multiple of the other. If instead you add a vector pointing in a different direction (and not the opposite direction), together they span a 2-dimensional subspace. In that case, they are said to be linearly independent.

So, in general, if you have n vectors, they are linearly independent if throwing any one of them out makes them span a smaller subspace. If you can throw some vector out without changing the span, they are linearly dependent.

This picture is less accurate, but still helpful, for more general vector spaces, in which the "vectors" aren't exactly "arrows" pointing in space anymore.

Functions are really a kind of vector because vectors are things that you can add together and multiply by scalars. So, it's the same idea for functions.
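For example, the functions ##1##, ##x##, and ##x^2## are linearly independent on ##\mathbb R##: if ##a_0+a_1x+a_2x^2=0## for all ##x##, then ##a_0=a_1=a_2=0##, since a nonzero polynomial of degree at most 2 has at most two roots. On the other hand, ##\sin^2 x##, ##\cos^2 x##, and ##1## are linearly dependent, because
$$\sin^2 x+\cos^2 x-1=0$$
for all ##x##.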
sncum
#7
Mar15-12, 11:52 PM
P: 14
Thanks to all of you.

