Tensor Differentiation

  • Thread starter Nickpga
  • Start date
  • #1
Nickpga
Summary: Help explaining notation with derivatives.

Mentor note: Thread moved from technical section, so no homework template is included
Sorry. I did not realize there was a dedicated homework problem section. Should I leave this post here?

Basically, it is the following (homework) problem. I haven't dealt with tensors before; well, not explicitly as tensors, I suppose.

Given that ##b_{ij}## are constants, show that

[tex](b_{ij}x_j)_{,k} = b_{ik}[/tex]

What I know is that I need to take a partial derivative:

[tex]b_{ij}\,\frac{dx_j}{dx^k} = b_{ik}[/tex]

How does the derivative simplify to the right-hand side?

For a little more context: I am taking an intro to multiferroics course (not my choice; my university requires it), and I am posting here because Google searches about tensors led me to questions in this sub-forum. Thanks for your time.
 
Last edited by a moderator:

Answers and Replies

  • #2
anuttarasammyak
Gold Member
You wrote
[tex](b_{ij}x_j)_{,k}=b_{ik}[/tex]
But considering the balance of the dummy index ##j##, it should be
[tex](b_{ij}x^j)_{,k}=b_{ik}[/tex]
Which is the right equation in the problem?
 
  • #3
Nickpga
The first one. Thanks.
 
  • #4
PeroK
Science Advisor
Homework Helper
Insights Author
Gold Member
2022 Award
Nickpga said:
The first one. Thanks.
What you want to calculate is: $$\frac \partial {\partial x_k}(\sum_j b_{ij}x_j)$$ We might as well be explicit about this for the sake of clarity.

Does it look clearer what to do now?
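If it helps to see that sum completely concretely, here is a minimal sympy sketch (the 3×3 size and the symbol names are just an illustrative choice, not anything specified in the problem) that carries out the differentiation term by term:

[code]
import sympy as sp

# coordinates x_1, x_2, x_3 and an arbitrary 3x3 array of constant symbols b_ij
x = sp.symbols('x1:4')
b = sp.Matrix(3, 3, lambda i, j: sp.Symbol(f'b{i + 1}{j + 1}'))

# the i-th component of the sum over j of b_ij x_j
expr = [sum(b[i, j] * x[j] for j in range(3)) for i in range(3)]

# differentiating with respect to x_k leaves exactly b_ik
for i in range(3):
    for k in range(3):
        assert sp.diff(expr[i], x[k]) == b[i, k]

print("d/dx_k of sum_j b_ij x_j equals b_ik for every i, k")
[/code]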
 
  • #5
Nickpga
Not really. Do you have any textbook recommendations? I have a lot of catching up to do.
 
  • #6
PeroK
Science Advisor
Homework Helper
Insights Author
Gold Member
2022 Award
Nickpga said:
Not really. Do you have any textbook recommendations? I have a lot of catching up to do.
Have you ever taken a calculus course?
 
  • #7
Nickpga
Yes. I have taken lots of math all the way to multi-variable calculus. I am just terrible at it.
I am taking my last two courses to complete my BSEE...
 
  • #8
PeroK
Science Advisor
Homework Helper
Insights Author
Gold Member
2022 Award
Nickpga said:
Yes. I have taken lots of math all the way to multi-variable calculus. I am just terrible at it.
So, you have two issues:

1) The basic calculus of (partial) differentiation.

2) Getting accustomed to the hyper-concise (my term) notation used in your course, with the summation convention and derivatives denoted by commas.

What if we simplify the problem further and take ##k = 1##:$$\frac \partial {\partial x_1}(\sum_j b_{ij}x_j)$$
 
  • #9
Nickpga
OK, I see now that you just rewrote the left-hand side.

Oh, OK: so when you take the partial and ##k = j##, you get 1, but when ##k \neq j## you get zero (because the variables are orthogonal/independent?). So ##j## is, in a sense, replaced by ##k##.

That explains why ##x_j## goes away (its derivative equals one) and why ##k## takes its place, since the terms with ##j \neq k## went to zero.

Does that sound correct?
 
  • #10
anuttarasammyak
Gold Member
Then, regarding your line
Nickpga said:
##b_{ij}\,\frac{dx_j}{dx^k} = b_{ik}##
Do you mean
[tex]b_{ij}\frac{\partial x_j}{\partial x^k}[/tex]
with only the index ##k## written upstairs?
 
  • #11
Nickpga
Well, I do not know what the difference is between up and down indices. I am basically learning tensors for the first time. It's odd that everyone else in the course already knows them really well.
 
  • #12
anuttarasammyak
Gold Member
You introduced ##x^k## yourself. I am just asking how you, or the problem, write the indices, up or down. All down? OK, that makes sense. Some up and some down? That is also fine, as long as they follow the balance rule. So which is it?
 
  • #13
Nickpga
OK, I figure it's all down. Nothing in the homework actually has an index up.
 
  • #14
PeroK
Science Advisor
Homework Helper
Insights Author
Gold Member
2022 Award
Nickpga said:
OK, I see now that you just rewrote the left-hand side.

Oh, OK: so when you take the partial and ##k = j##, you get 1, but when ##k \neq j## you get zero (because the variables are orthogonal/independent?). So ##j## is, in a sense, replaced by ##k##.

That explains why ##x_j## goes away (its derivative equals one) and why ##k## takes its place, since the terms with ##j \neq k## went to zero.

Does that sound correct?
Sort of. It's really when ##j = k##, as ##j## is the index that is varying and being summed over. If we assume we have three dimensions, then we can expand the whole thing:
$$\frac \partial {\partial x_1}(\sum_j b_{ij}x_j) = \frac \partial {\partial x_1}(b_{i1}x_1 +b_{i2}x_2+b_{i3}x_3) = b_{i1}$$ For the reasons you gave - that ##x_1, x_2, x_3## are assumed to be independent variables.

Although we did this for ##k = 1##, we can see that for any ##k## we have:
$$\frac \partial {\partial x_k}(\sum_j b_{ij}x_j) = b_{ik}$$

The next trick is to be able to do that calculation using the compact notation ...
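Numerically, this is just the statement that the Jacobian of the linear map ##x \mapsto bx## is the constant matrix ##b## itself. A quick finite-difference sanity check (the numerical ##b##, the point ##x##, and the step size are arbitrary illustrative choices, not part of the problem):

[code]
import numpy as np

rng = np.random.default_rng(0)
b = rng.standard_normal((3, 3))   # arbitrary constant matrix b_ij
x = rng.standard_normal(3)        # arbitrary point x_j
h = 1e-6

# central-difference estimate of the Jacobian of f(x) = b @ x
jac = np.empty((3, 3))
for k in range(3):
    dx = np.zeros(3)
    dx[k] = h
    jac[:, k] = (b @ (x + dx) - b @ (x - dx)) / (2 * h)

print(np.allclose(jac, b))        # True: (b_ij x_j)_,k = b_ik
[/code]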
 
  • #15
PeroK
Science Advisor
Homework Helper
Insights Author
Gold Member
2022 Award
anuttarasammyak said:
You introduced ##x^k## yourself. I am just asking how you, or the problem, write the indices, up or down. All down? OK, that makes sense. Some up and some down? That is also fine, as long as they follow the balance rule. So which is it?
Not all tensor analysis uses upstairs indices.
 
  • #16
Nickpga
PeroK said:
Sort of. It's really when ##j = k##, as ##j## is the index that is varying and being summed over. If we assume we have three dimensions, then we can expand the whole thing:
$$\frac \partial {\partial x_1}(\sum_j b_{ij}x_j) = \frac \partial {\partial x_1}(b_{i1}x_1 +b_{i2}x_2+b_{i3}x_3) = b_{i1}$$ For the reasons you gave - that ##x_1, x_2, x_3## are assumed to be independent variables.

Although we did this for ##k = 1##, we can see that for any ##k## we have:
$$\frac \partial {\partial x_k}(\sum_j b_{ij}x_j) = b_{ik}$$

The next trick is to be able to do that calculation using the compact notation ...
Alright. I suppose I could just write it out like this and then "simplify" it into the hyper-concise form my course uses. Wouldn't that be a much better approach than doing it the "tricky" way?
 
  • #17
anuttarasammyak
Gold Member
I see. So
[tex](b_{ij}x_j)_{,k}=b_{ij}x_{j,k}= b_{ij}\delta_{jk}=b_{ik}[/tex]
with the convention that repeated indices are summed over. Here ##\delta_{jk}## is the Kronecker delta: 1 for ##j=k##, 0 for ##j\neq k##.
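The final contraction ##b_{ij}\delta_{jk}=b_{ik}## just says that summing against the Kronecker delta picks out the ##j=k## term. A tiny numpy illustration (the particular matrix is an arbitrary stand-in for the constants ##b_{ij}##):

[code]
import numpy as np

b = np.arange(9.0).reshape(3, 3)   # arbitrary stand-in for the constants b_ij
delta = np.eye(3)                  # Kronecker delta, delta_jk

# sum over the repeated index j in b_ij delta_jk
result = np.einsum('ij,jk->ik', b, delta)

print(np.array_equal(result, b))   # True: b_ij delta_jk = b_ik
[/code]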
 
  • #18
Nickpga
anuttarasammyak said:
I see. So
[tex](b_{ij}x_j)_{,k}=b_{ij}x_{j,k}= b_{ij}\delta_{jk}=b_{ik}[/tex]
Alright, that makes sense, if only because the previous problem dealt with the Kronecker delta as well. Thanks!
 
  • #19
PeroK
Science Advisor
Homework Helper
Insights Author
Gold Member
2022 Award
Nickpga said:
Alright. I suppose I could just write it out like this and then "simplify" it into the hyper-concise form my course uses. Wouldn't that be a much better approach than doing it the "tricky" way?
Personally, I think it makes your course tougher, as the human brain takes time to get used to new notation like this, i.e. omitting the summation and partial derivative symbols.

Half the battle, perhaps, is to separate your difficulties into these two categories: do you understand mathematically what you are doing, and can you interpret and work with the new notation?
 
  • #20
Nickpga
PeroK said:
Personally, I think it makes your course tougher, as the human brain takes time to get used to new notation like this, i.e. omitting the summation and partial derivative symbols.

Half the battle, perhaps, is to separate your difficulties into these two categories: do you understand mathematically what you are doing, and can you interpret and work with the new notation?
Thanks for your time and help!
I will probably have to work in regular notation and then see how to condense into the new notation.
 
  • #21
wrobel
Science Advisor
Insights Author
Just some remarks.
1) ##b_{ij}x_j## is not a tensor; at least, this expression does not keep its shape under changes of variables.
2) The operation ##\partial/\partial x_i## takes tensors to non-tensors.
3) If only linear changes of variables ##x_i=c_{ij}x'_j## are considered, then everything is fine.
 
Last edited:
  • #22
WWGD
Science Advisor
Gold Member
Nickpga said:
Not really. Do you have any textbook recommendations? I have a lot of catching up to do.
Try Schaum's Tensor Calculus for an overview.
 