# I Linear independence of functions

1. Dec 8, 2016

### Mr Davis 97

Is there a difference between the linear independence of $\{x, e^x\}$ and $\{ex, e^x\}$? It can be shown that both sets admit only the trivial solution when a linear combination of them is set equal to zero. However, the definition of linear independence is: "The functions $f_1, \dots, f_n$ are linearly independent on the interval $I$ if the only solution to $c_1f_1 + c_2f_2 + \dots + c_nf_n = 0$ for all $x$ in $I$ is the trivial one." In the first case this seems obvious, since $x$ and $e^x$ never intersect and so cannot be multiples of each other. But doesn't the second case violate the definition, since $e \cdot 1$ is a multiple of $e^1$? I am just confused about the "for all $x$ in $I$" part of the definition.

2. Dec 8, 2016

### Stephen Tashi

Intersection or non-intersection is irrelevant. The graphs of $f_1(x) = x^2+1$ and $f_2(x) = 2x^2+2$ don't intersect, but $(-2) f_1(x) + (1)f_2(x) = 0$ at each value of $x$.
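As a quick sanity check of this dependent pair (a Python sketch; the sample points are chosen arbitrarily), the combination $(-2)f_1 + (1)f_2$ vanishes at every sampled $x$, not just at isolated points:

```python
def f1(x):
    return x**2 + 1

def f2(x):
    return 2 * x**2 + 2

# (-2)*f1(x) + (1)*f2(x) should be 0 for every x,
# which is exactly what linear dependence of {f1, f2} means.
samples = [-3.0, -0.5, 0.0, 1.7, 10.0]
values = [-2 * f1(x) + 1 * f2(x) for x in samples]
print(values)  # each entry is 0 (up to floating-point rounding)
```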

No. At x = 1, we have $c_1 (ex) + c_2(e^x) = 0$ for $c_1 = 1$ and $c_2= -1$, but those values $c_1, c_2$ are not solutions that apply to each value of $x$ in some interval. They only work at one particular value of $x$.
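The same kind of spot check illustrates this point (a sketch; the evaluation points are arbitrary): with $c_1 = 1$, $c_2 = -1$, the combination $c_1(ex) + c_2 e^x$ is zero at $x = 1$ but not elsewhere.

```python
import math

def g(x, c1=1.0, c2=-1.0):
    # c1*(e*x) + c2*e^x with the coefficients that happen to work at x = 1
    return c1 * (math.e * x) + c2 * math.exp(x)

print(g(1.0))  # 0 (up to rounding): the combination vanishes here
print(g(0.0))  # -1.0: the same coefficients fail away from x = 1
print(g(2.0))  # 2e - e^2, roughly -1.95, nonzero again
```

Since no single nontrivial pair $(c_1, c_2)$ kills the combination at every $x$, the set is linearly independent.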

3. Dec 8, 2016

### Staff: Mentor

For linear independence, multiplication by a nonzero element of your field does not matter at all: $x$ and $ex$ are equivalent, since you multiply them by an arbitrary constant $c_i$ anyway.
I don't understand that argument. In fact, if two functions intersect at a point where they are nonzero (but are not identical), they cannot be constant multiples of each other. If they never intersect, they can (but don't have to) be multiples of each other.

4. Dec 8, 2016

### lurflurf

It does not violate the definition, because $(-1)(ex) + (1)e^x = 0$ holds if $x = 1$ but not for all $x$.
What is confusing is that the "for all $x$" applies to
$c_1f_1 + c_2f_2 + \dots + c_nf_n = 0$,
not to
"there exists only the trivial solution".
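A standard shortcut worth mentioning (a supplementary sketch, not something the thread relies on): the Wronskian $W(x) = f_1 f_2' - f_1' f_2$. If $W$ is nonzero at even one point of $I$, the pair is linearly independent on $I$.

```python
import math

def wronskian_ex_expx(x):
    # f1(x) = e*x, f2(x) = e^x, and their derivatives
    f1, df1 = math.e * x, math.e
    f2, df2 = math.exp(x), math.exp(x)
    return f1 * df2 - df1 * f2

# W(0) = 0*1 - e*1 = -e, which is nonzero, so {ex, e^x} is
# linearly independent on any interval containing 0.
print(wronskian_ex_expx(0.0))  # roughly -2.718
```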

5. Dec 9, 2016

### Ssnow

You said it correctly at the end: "for every $x \in I$". The equation must hold at every point of $I$ simultaneously, not just at some particular $x$ ...