HW Help: Linear Algebra

1. Jan 30, 2005

bor0000

1a) Determine whether this subset of C(R) is linearly independent or not; find the dimension of the subspace it spans and a basis:
S = {e^(ax), x*e^(ax), x^2*e^(ax)}
This question seems too easy: they look independent to me for obvious reasons, and those 3 elements are the basis, so the dimension is then obviously 3. Or am I wrong somewhere?

And it asks to show that one can find arbitrarily large finite sets of independent functions in C(R). I take it x, x^2, x^3, etc. are independent; now how do I show/prove this?

thanks!

2. Jan 30, 2005

HallsofIvy

C(R) is the set of functions continuous on all real numbers.

Yes, S is a set of independent functions. I don't know what your "obvious reasons" are; can you show that the definition of "independent" holds? Yes, since they are three independent "vectors", the subspace they span has dimension 3 and S is itself a basis.

Given any n, show that the functions y = x^0 = 1, x^1, x^2, ..., x^n are independent. What IS the definition of "independent"?
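A concrete way to cash out this hint (my own sketch, not part of the thread): if a_0 + a_1 x + ... + a_n x^n = 0 for every x, then evaluating at n+1 distinct points gives a linear system whose matrix is Vandermonde; its determinant is the product of the pairwise differences of the points, hence nonzero, which forces every a_k to be 0. In pure Python:

```python
from itertools import combinations

def vandermonde_det(points):
    """Determinant of the Vandermonde matrix [points[i]**j], via the
    product formula: the product of (x_j - x_i) over all pairs i < j."""
    det = 1
    for (i, xi), (j, xj) in combinations(list(enumerate(points)), 2):
        det *= (xj - xi)
    return det

# Evaluating a0 + a1*x + ... + an*x^n at n+1 distinct points gives a
# Vandermonde system; a nonzero determinant forces every a_k to be 0.
pts = [0, 1, 2, 3, 4]          # n = 4, so n+1 = 5 distinct points
print(vandermonde_det(pts))    # nonzero, so 1, x, ..., x^4 are independent
```

Distinct points make every factor nonzero, so the determinant is nonzero no matter which points are chosen.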

3. Feb 4, 2005

bor0000

thanks! I'm just starting to do this hw today. I think I know how I'll show this: set it up as some kind of matrix, and try to get it so that each row has more zeros than the previous one.

4. Feb 4, 2005

HallsofIvy

Well yes, you could do that, but I think that's "overkill".

Suppose a_0 + a_1 x + a_2 x^2 + ... + a_n x^n = 0 (for all x). Take x = 0 and you get a_0 = 0! Now differentiate both sides of that:
a_1 + 2a_2 x + 3a_3 x^2 + ... + n a_n x^(n-1) = 0. Let x = 0 again: a_1 = 0. Continue!
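This differentiate-and-plug-in-zero loop can be mechanized on coefficient lists; a small pure-Python sketch (the helper names are mine), where the k-th pass of the loop reads off k!·a_k:

```python
def differentiate(coeffs):
    """d/dx of sum(coeffs[k] * x**k): coefficient k moves to slot k-1, scaled by k."""
    return [k * c for k, c in enumerate(coeffs)][1:]

def extract_coefficients(coeffs):
    """Recover a_0, a_1, ... by evaluating at x = 0, differentiating, repeating.
    After k differentiations the constant term is k! * a_k, so divide it out.
    (Integer division is fine here because k! * a_k is always a multiple of k!
    for integer a_k; use / for general coefficients.)"""
    recovered, factorial = [], 1
    for k in range(len(coeffs)):
        recovered.append(coeffs[0] // factorial)  # only the constant survives x = 0
        coeffs = differentiate(coeffs)
        factorial *= (k + 1)
    return recovered

# If the polynomial is the zero function, every extracted value is 0,
# which is exactly the independence argument above.
print(extract_coefficients([3, 1, 4, 1]))   # recovers [3, 1, 4, 1]
```

Running this on the all-zero coefficient list returns all zeros, mirroring the conclusion that a dependency relation forces every a_k = 0.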

5. Feb 5, 2005

matt grime

Or you could use the fact that a degree-n polynomial has at most n roots.

6. Feb 5, 2005

bor0000

thanks! I don't understand how I can use the 'derivatives' or 'roots' to get independence. Anyway, I did those problems by setting up a matrix, like 1*a0 + 0*a1 + 0*a2 for row 1, etc. It was simple.
http://www.math.mcgill.ca/schmidt/247w05/ass2.pdf
Sorry, I'm posting my whole assignment; I don't intend to copy all of it. I can also go to the professor's office in the morning or the help desk in the afternoon, but I'm busy studying chem hw now, so I don't know if I'll make it.
So in #4, I set up a matrix and labelled it horizontally with the standard basis, i.e. 1, x, x^2, ... (I don't think I need to prove that it's a basis for P(n)); call it V. Vertically I put the basis given in the question; call it U. Then I would need to prove that every element of U can be expressed uniquely as a combination of V's: column 1 will be 1, 0, ..., 0, column 2 will be -a, 1, ..., 0, and so I need to prove the matrix has this triangular form. I have no way to formally prove this, but I wonder if it's enough that it seems intuitively clear: the j-th element has leading power j, its other terms have powers less than j, so the entries below position j in that column will be 0, and whatever the entries above j are doesn't matter, since they will be something.
So please tell me if this is enough, or if not, suggest other ways to do this?
And for the second part, where they ask for the coefficient of p(x), do they mean the coefficient of the largest term, i.e. 2x^3 + 1 would have coefficient 2? I used the Taylor expansion and got D^k p(a)/k!.

And in #6a), what is a reflection?? I looked in my notes and I'm not sure about this, but I think it says: if X has 2 vectors, in this case v1 = (1,0,1) and v2 = (0,1,-2), then find the vector orthogonal to them, which is (1,-2,-1), and to get the reflection just write these 3 vectors as columns, except that the orthogonal one is multiplied by -1. I drew a parallelepiped to picture this, and I don't understand why this would mean it's a reflection.

And in 6b), to get P I also wrote the 3 vectors as columns but set the orthogonal vector to (0,0,0). But then for Q do I do the same but set one of the other vectors to 0? If so, which one and why?
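Not from the thread, but here is a small numerical sketch of the standard projection/reflection construction, assuming #6 concerns the plane spanned by v1 = (1,0,1) and v2 = (0,1,-2) with normal n = (1,-2,-1); all helper names are mine:

```python
def outer_scaled(n, scale):
    """scale * n n^T as a 3x3 matrix."""
    return [[scale * a * b for b in n] for a in n]

def mat_sub(A, B):
    return [[x - y for x, y in zip(ra, rb)] for ra, rb in zip(A, B)]

def mat_vec(A, v):
    return [sum(a * x for a, x in zip(row, v)) for row in A]

I3 = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
n = [1.0, -2.0, -1.0]                 # orthogonal to both v1 and v2
nn = sum(c * c for c in n)            # |n|^2 = 6

Q = outer_scaled(n, 1.0 / nn)                 # projection onto the normal line
P = mat_sub(I3, Q)                            # projection onto the plane
R = mat_sub(I3, outer_scaled(n, 2.0 / nn))    # reflection across the plane

print(mat_vec(R, [1.0, 0.0, 1.0]))    # v1 lies in the plane, so R fixes it
print(mat_vec(R, n))                  # R flips the normal's sign
print(mat_vec(P, n))                  # P kills the normal component
```

Note P + Q = I and R = P - Q = I - 2nn^T/|n|^2: the in-plane part of a vector is kept and the normal part is flipped, which is exactly what "reflection across the plane" means.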

thanks!

Last edited by a moderator: Apr 21, 2017
8. Feb 7, 2005

matt grime

OK, let me try to be more helpful.

You want to show that $$\sum_{r=0}^{n} a_r x^r$$ is the ZERO FUNCTION, that is, it is zero for all values of x. A degree-n polynomial, which that is, has at most n roots (I am assuming a_n is not zero, which we may do). So if it were the zero function (i.e. all values of x are roots) then you have a contradiction.
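The at-most-n-roots fact has a one-line mechanism behind it: each root r can be divided out by (x - r) via synthetic division, lowering the degree by one, so a nonzero degree-n polynomial runs out of degree after n roots. A pure-Python illustration (my own sketch, names are mine):

```python
def horner_eval(coeffs, x):
    """Evaluate a polynomial given highest-degree-first coefficients."""
    value = 0
    for c in coeffs:
        value = value * x + c
    return value

def divide_out_root(coeffs, r):
    """Synthetic division of coeffs (highest degree first) by (x - r).
    Assumes r really is a root, so the remainder (dropped here) is 0."""
    quotient = [coeffs[0]]
    for c in coeffs[1:-1]:
        quotient.append(quotient[-1] * r + c)
    return quotient

# p(x) = (x-1)(x-2)(x-3) = x^3 - 6x^2 + 11x - 6
p = [1, -6, 11, -6]
for root in (1, 2, 3):
    assert horner_eval(p, root) == 0
    p = divide_out_root(p, root)   # degree drops by one each time
print(p)   # [1]: a degree-0 remainder, so there is no room for a 4th root
```

Since the zero function has every real number as a root, it cannot be a nonzero polynomial of any finite degree, which is matt grime's contradiction.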

9. Feb 7, 2005

fourier jr

find the determinant of $$\left( \begin{array}{ccc}f & g & h \\ f' & g' & h' \\ f'' & g'' & h''\end{array} \right)$$ (the Wronskian), where f, g & h are the functions you want to show are linearly independent, & see if it's nonzero anywhere: if the Wronskian is nonzero at even one point, the functions are independent.
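For the functions in 1a, the derivatives can be computed by hand and this determinant simplifies to 2e^(3ax), which is never zero. A pure-Python numerical spot check of that claim (my own sketch, not from the thread):

```python
import math

def det3(m):
    """Determinant of a 3x3 matrix by cofactor expansion along the first row."""
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
          - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
          + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

def wronskian(a, x):
    """Wronskian of e^(ax), x e^(ax), x^2 e^(ax); derivatives computed by hand."""
    E = math.exp(a * x)
    rows = [
        [E,         x * E,                   x * x * E],
        [a * E,     (1 + a * x) * E,         (2 * x + a * x * x) * E],
        [a * a * E, (2 * a + a * a * x) * E, (2 + 4 * a * x + a * a * x * x) * E],
    ]
    return det3(rows)

# The determinant should equal 2*e^(3ax) for every a and x, hence never vanish.
for a in (0.5, 1.0, 2.0):
    for x in (-1.0, 0.0, 1.0):
        assert abs(wronskian(a, x) - 2 * math.exp(3 * a * x)) < 1e-6
```

A handy identity explains the closed form: W(f·u1, f·u2, f·u3) = f^3 · W(u1, u2, u3), and with f = e^(ax) and u's = 1, x, x^2, the polynomial Wronskian is the constant 2.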


Last edited: Feb 7, 2005
10. Feb 8, 2005

bor0000

ok thanks!

11. Feb 8, 2005

mathwonk

One standard trick to show functions are independent is to assume a dependency relation and then differentiate it to get more relations, eventually one which is impossible.

E.g. to show e^x and e^(2x) independent, assume that ae^x + be^(2x) = 0 (the zero function).

Then also, by differentiating, ae^x + 2be^(2x) = 0, so if x = 0, I get a + 2b = 0.

But in the original equation, x = 0 gives a + b = 0, so b = 0. But then I have ae^x = 0, so putting x = 0 again gives me a = 0.
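The two equations a + b = 0 and a + 2b = 0 form a 2x2 linear system with nonzero determinant, so only the trivial solution survives; a tiny sketch of that last step (the function name is mine):

```python
def solve_2x2(m11, m12, m21, m22, r1, r2):
    """Solve [[m11, m12], [m21, m22]] (a, b)^T = (r1, r2)^T by Cramer's rule."""
    d = m11 * m22 - m12 * m21   # nonzero determinant => unique solution
    return ((r1 * m22 - r2 * m12) / d, (m11 * r2 - m21 * r1) / d)

# a + b = 0 (original relation at x = 0), a + 2b = 0 (derivative at x = 0)
a, b = solve_2x2(1, 1, 1, 2, 0, 0)
print(a, b)   # both 0: the only dependency of e^x and e^(2x) is the trivial one
```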