\int e^{\sqrt[3]{x}} \, dx
Integration by parts, perhaps? But if that's the case, I have no idea which is the right choice for u and which for dv... Taking ln of both sides? Uh... hmm... I don't think that's how you work this question out...
Any ideas, guys? :|
Thanks!
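For what it's worth, substituting before integrating by parts seems to do the trick: with t = x^(1/3) we get x = t^3 and dx = 3t^2 dt, so the integral becomes 3∫t^2 e^t dt, which falls to integration by parts applied twice, giving 3e^t(t^2 - 2t + 2) + C. A quick sympy check of that antiderivative (my computation, worth double-checking):

```python
import sympy as sp

x = sp.symbols('x', positive=True)
t = sp.cbrt(x)  # the substitution t = x**(1/3)

# Antiderivative obtained by parts (twice), back-substituted into x
F = 3 * sp.exp(t) * (t**2 - 2*t + 2)

# Differentiating F should recover the original integrand e^(x^(1/3))
check = sp.simplify(sp.diff(F, x) - sp.exp(t))
print(check)  # 0 means the antiderivative is correct
```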
Oh, I forgot the important fact that the diagonal of a skew-symmetric matrix consists only of zeros. Yes, a basis for the 2x2 case would just be {[0 1, -1 0]} then.
So to prove that a set of vectors is a basis, I just need to show that its vectors (the columns of the matrix) are linearly independent, right?
Ah! So say for matrix:
[1 4]
[9 0]
(which is the same as [1 4, 9 0] to make representation easier)
It can be written in terms of the standard basis like this:
1[1 0, 0 0]+4[0 1, 0 0]+9[0 0, 1 0]+0[0 0, 0 1]
right?
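That expansion is easy to sanity-check numerically; here's a quick numpy sketch (the E matrices below are just the standard basis of the 2x2 matrices):

```python
import numpy as np

# Standard basis of the space of 2x2 matrices
E11 = np.array([[1, 0], [0, 0]])
E12 = np.array([[0, 1], [0, 0]])
E21 = np.array([[0, 0], [1, 0]])
E22 = np.array([[0, 0], [0, 1]])

# The coefficients are exactly the entries of the matrix
A = 1*E11 + 4*E12 + 9*E21 + 0*E22
print(A)  # [[1 4], [9 0]]
```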
Since this matrix reduces to the identity matrix, should the original matrix itself be the...
Prove: the set of 3x3 symmetric matrices is a vector space and find its dimension.
Well, my prof did this question in class, but I still don't quite get it...
Ok, first off, I need to prove that it's a vector space. The easy way is probably to show that it contains the zero matrix and...
Hmm...I just found this weird problem:
Let n and m be distinct non-zero vectors in R^3, and let b be an arbitrary vector in R^2. Is W = {b in R^2 | (n · x, m · x) = b for some x in R^3} a subspace of R^2? Of R^3?
First of all, it concerns both R^2 and R^3, and I am not sure what's the approach for...
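One way to look at it (just a hint, not a full solution): the map T(x) = (n · x, m · x) is linear from R^3 to R^2, and W is exactly its range, so the usual "the range of a linear map is a subspace" fact should apply. A numerical spot-check of closure, for one hypothetical choice of n and m:

```python
import numpy as np

# Hypothetical choice of the two distinct non-zero vectors n and m
n = np.array([1.0, 0.0, 2.0])
m = np.array([0.0, 1.0, -1.0])

def T(x):
    """The linear map R^3 -> R^2 whose range is W."""
    return np.array([n @ x, m @ x])

x1, x2 = np.array([1.0, 2.0, 3.0]), np.array([-1.0, 0.0, 4.0])
b1, b2 = T(x1), T(x2)

# Closure under addition: b1 + b2 is hit by x1 + x2
assert np.allclose(T(x1 + x2), b1 + b2)
# Closure under scalar multiplication: 5*b1 is hit by 5*x1
assert np.allclose(T(5 * x1), 5 * b1)
print("closure checks pass for these samples")
```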
Alright!
So for example, if I was asked to find the matrix of a linear mapping L: R^2 -> R^3 whose nullspace is Sp({(1,1)}) and range is Sp({(1,2,3)}), then the matrix turns out to be:
[1 -1]
[2 -2]
[3 -3]
right?
Oh, I see. I may need to confirm a few more examples to be sure that I am applying this theorem correctly.
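For confirming: here's a numpy spot-check that the matrix above really does have nullspace Sp({(1,1)}) and range Sp({(1,2,3)}):

```python
import numpy as np

A = np.array([[1, -1],
              [2, -2],
              [3, -3]])

# (1, 1) should be in the nullspace
assert np.allclose(A @ np.array([1, 1]), 0)

# Every column is a multiple of (1, 2, 3), so the range is Sp({(1,2,3)})
v = np.array([1, 2, 3])
assert np.allclose(A[:, 0], 1 * v)
assert np.allclose(A[:, 1], -1 * v)

# Rank 1 confirms the range (and hence the nullspace, by rank-nullity)
# is one-dimensional
print(np.linalg.matrix_rank(A))  # 1
```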
Now suppose {x in R^5 | ||x||^2 >= 0}. First of all, it concerns R^5. Since ||x|| is the norm of the vector, i.e. its distance from the origin, it will always be...
A safer approach than the product rule is to expand the product completely and then differentiate term by term. This method is reasonable here since the expression you have isn't too complicated to expand.
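To illustrate with a made-up expression (the original one isn't shown here, so these factors are purely hypothetical), sympy confirms that expanding first and then differentiating agrees with the product rule:

```python
import sympy as sp

x = sp.symbols('x')
f, g = x**2 + 1, x**3 - x  # hypothetical factors, not from the original problem

# Expand first, then differentiate term by term
expanded = sp.diff(sp.expand(f * g), x)

# Product rule: (fg)' = f'g + fg'
product_rule = sp.diff(f, x) * g + f * sp.diff(g, x)

assert sp.simplify(expanded - product_rule) == 0
print(expanded)  # 5*x**4 - 1
```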
I guess this kind of topic should belong here. :|
My understanding of the subspace still isn't solid enough, so I want to know what I know so far is at least correct.
By definition, a set of vectors S in R^n is called a subspace of R^n iff, for all vectors x, y in S and any scalar k:
1) (x + y) \in S and
2) kx \in S.
Also, the solution set of a...
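Those two closure conditions can at least be spot-checked numerically for a candidate set. Here's a small sketch, using a hypothetical membership predicate (the solution set of Ax = 0 for a made-up A, since the actual set isn't given here):

```python
import numpy as np

# Hypothetical candidate: S = solutions of Ax = 0 in R^3
A = np.array([[1.0, 2.0, -1.0]])

def in_S(x):
    """Membership predicate for the candidate subspace S."""
    return np.allclose(A @ x, 0)

# Two particular solutions of Ax = 0
x = np.array([2.0, 0.0, 2.0])
y = np.array([0.0, 1.0, 2.0])
assert in_S(x) and in_S(y)

# Condition 1: closed under addition
assert in_S(x + y)
# Condition 2: closed under scalar multiplication
assert in_S(3.5 * x)
print("both closure conditions hold for these samples")
```

Of course, passing on a few samples doesn't prove closure; the actual proof has to argue for all x, y, and k.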
It seems like a = -2 is giving a negative value under the first square root expression... Hmmm. 8|
I have to finish my own homework at the moment, but I'll come back with my calculation tomorrow morning, hopefully. (Hope your homework isn't due tomorrow as well... D:)