How Does Tensor Differentiation Simplify in Multiferroics Homework?

Nickpga
Summary: Help explaining notation with derivatives.

Mentor note: Thread moved from technical section, so no homework template is included
Sorry. I did not realize there was a dedicated homework problem section. Should I leave this post here?

Basically, it's the following (homework) problem. I haven't dealt with tensors before. Well, not explicitly with tensors, I suppose.

##b_{ij}## are constants; show that

$$(b_{ij}x_j)_{,k} = b_{ik}$$

What I know is that I take a partial derivative:

$$b_{ij}\frac{\partial x_j}{\partial x^k} = b_{ik}$$

How does the derivative simplify to the right-hand side?

For a little bit more context, I am taking an intro to multiferroics course (not my choice; my university requires it), and I am posting it in this sub-forum since Google search results about tensors led me to questions posted here. Thanks for your time.

You wrote
$$(b_{ij}x_j)_{,k}=b_{ik}$$
But considering the balance of the dummy index ##j##, it should be
$$(b_{ij}x^j)_{,k}=b_{ik}$$
Which is the right equation in the problem?

Nickpga
The first one. Thanks.

Nickpga said:
The first one. Thanks.
What you want to calculate is: $$\frac \partial {\partial x_k}(\sum_j b_{ij}x_j)$$ We might as well be explicit about this for the sake of clarity.

Does it look clearer what to do now?

Nickpga
Not really. Do you have any textbook recommendations? I have a lot of catching up to do.

Nickpga said:
Not really. Do you have any textbook recommendations? I have a lot of catching up to do.
Have you ever taken a calculus course?

Nickpga
Yes. I have taken lots of math all the way to multi-variable calculus. I am just terrible at it.
I am taking my last two courses to complete my BSEE...

Nickpga said:
Yes. I have taken lots of math all the way to multi-variable calculus. I am just terrible at it.
So, you have two issues:

1) The basic calculus of (partial) differentiation.

2) Getting accustomed to the hyper-concise (my term) convention used in your course: with the summation convention and derivatives represented using commas.

What if we simplify the problem further and take ##k = 1##:$$\frac \partial {\partial x_1}(\sum_j b_{ij}x_j)$$

Nickpga
OK. I see now that you just rewrote the left-hand side.

Oh, OK. So for ##k = j## (taking the partial), you get 1, but when ##k \neq j## you get zero. Orthogonal?
So ##j## is, in a sense, replaced by ##k##.

That explains why ##x_j## goes away (equals one), and then ##k## replaces it, because the terms with ##j \neq k## went to zero.

Does that sound correct?

Then
Nickpga said:
$$b_{ij}\frac{\partial x_j}{\partial x^k} = b_{ik}$$
Do you mean
$$b_{ij}\frac{\partial x_j}{\partial x^k}$$
with the index ##k## the only one written upstairs?

Nickpga
Well, I do not know what the difference is between up and down. I am basically learning tensors (specifically) for the first time. Weird how everyone else in the course already knew them really well.

You introduced ##x^k## yourself. I am just asking how you or the problem writes the indices. All downstairs? OK, that makes sense. Some up and some down? That is also OK, if they follow the balance rule. So which is it?

Nickpga
OK. I figure it's all down. Nothing in the homework is actually up.

Nickpga said:
OK. I see now that you just rewrote the left-hand side.

Oh, OK. So for ##k = j## (taking the partial), you get 1, but when ##k \neq j## you get zero. Orthogonal?
So ##j## is, in a sense, replaced by ##k##.

That explains why ##x_j## goes away (equals one), and then ##k## replaces it, because the terms with ##j \neq k## went to zero.

Does that sound correct?
Sort of. It's really when ##j = k##, as ##j## is the index that is varying and being summed over. If we assume we have three dimensions, then we can expand the whole thing:
$$\frac \partial {\partial x_1}(\sum_j b_{ij}x_j) = \frac \partial {\partial x_1}(b_{i1}x_1 +b_{i2}x_2+b_{i3}x_3) = b_{i1}$$ For the reasons you gave - that ##x_1, x_2, x_3## are assumed to be independent variables.

Although we did this for ##k = 1##, we can see that for any ##k## we have:
$$\frac \partial {\partial x_k}(\sum_j b_{ij}x_j) = b_{ik}$$

The next trick is to be able to do that calculation using the compact notation ...
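As a quick numerical sanity check of this result (just a sketch in plain Python with an arbitrary made-up matrix, not part of the course material), a central finite-difference approximation of the partial derivative should reproduce ##b_{ik}##:

```python
# Numerically verify that d/dx_k (sum_j b_ij * x_j) = b_ik
# using a central finite-difference approximation.

b = [[2.0, -1.0, 0.5],
     [0.0,  3.0, 1.0],
     [4.0, -2.0, 6.0]]  # arbitrary constant matrix b_ij (made up for the check)

def f(i, x):
    """f_i(x) = sum_j b_ij * x_j"""
    return sum(b[i][j] * x[j] for j in range(3))

def partial(i, k, x, h=1e-6):
    """Central-difference approximation of d f_i / d x_k at the point x."""
    xp = list(x); xp[k] += h
    xm = list(x); xm[k] -= h
    return (f(i, xp) - f(i, xm)) / (2 * h)

x = [1.3, -0.7, 2.1]  # arbitrary point; since f is linear, the derivative is constant
for i in range(3):
    for k in range(3):
        assert abs(partial(i, k, x) - b[i][k]) < 1e-6
print("d(b_ij x_j)/dx_k == b_ik verified")
```

Because the function is linear in the ##x_j##, the central difference is exact up to floating-point rounding, so the tolerance can be tight.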

Nickpga
anuttarasammyak said:
You introduced ##x^k## yourself. I am just asking how you or the problem writes the indices. All downstairs? OK, that makes sense. Some up and some down? That is also OK, if they follow the balance rule. So which is it?
Not all tensor analysis uses upstairs indices.

Nickpga
PeroK said:
Sort of. It's really when ##j = k##, as ##j## is the index that is varying and being summed over. If we assume we have three dimensions, then we can expand the whole thing:
$$\frac \partial {\partial x_1}(\sum_j b_{ij}x_j) = \frac \partial {\partial x_1}(b_{i1}x_1 +b_{i2}x_2+b_{i3}x_3) = b_{i1}$$ For the reasons you gave - that ##x_1, x_2, x_3## are assumed to be independent variables.

Although we did this for ##k = 1##, we can see that for any ##k## we have:
$$\frac \partial {\partial x_k}(\sum_j b_{ij}x_j) = b_{ik}$$

The next trick is to be able to do that calculation using the compact notation ...
Alright. I suppose I could just write it like this and then "simplify" into the hyper-concise form my course uses. Wouldn't this be a much better approach than doing it the "tricky" way?

I see. So
$$(b_{ij}x_j)_{,k}=b_{ij}x_{j,k}= b_{ij}\delta_{jk}=b_{ik}$$
with the convention that repeated indices are summed. ##\delta_{jk}## is the Kronecker delta: 1 for ##j=k##, 0 for ##j\neq k##.
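The contraction step ##b_{ij}\delta_{jk}=b_{ik}## can also be spelled out explicitly (a plain-Python sketch with an arbitrary made-up matrix, just to illustrate the summation):

```python
# Show that contracting b_ij with the Kronecker delta over j gives b_ik:
# sum_j b_ij * delta_jk = b_ik

def delta(j, k):
    """Kronecker delta: 1 if j == k, else 0."""
    return 1 if j == k else 0

b = [[2, -1, 0],
     [5,  3, 1],
     [4, -2, 6]]  # arbitrary constant matrix b_ij (made up for the check)

for i in range(3):
    for k in range(3):
        contracted = sum(b[i][j] * delta(j, k) for j in range(3))
        assert contracted == b[i][k]
print("b_ij * delta_jk == b_ik")
```

Only the ##j = k## term of the sum survives, which is exactly why the dummy index ##j## gets "replaced" by ##k##.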

Nickpga
anuttarasammyak said:
I see. So
$$(b_{ij}x_j)_{,k}=b_{ij}x_{j,k}= b_{ij}\delta_{jk}=b_{ik}$$
Alright. That makes sense. Only because the previous problem dealt with the Kronecker delta as well. Thanks!

Nickpga said:
Alright. I suppose I could just write it like this and then "simplifiy" into the hyper-concise form my course uses. Wouldn't this be much better approach than doing it the "tricky" way?
Personally, I think it makes your course tougher, as the human brain takes time to get used to new notation like this, i.e. leaving out the summation and partial derivative symbols.

Half the battle, perhaps, is to rationalise your difficulties into these two categories: do you understand mathematically what you are doing, and can you interpret and work with the new notation?

Nickpga
PeroK said:
Personally, I think it makes your course tougher, as the human brain takes time to get used to new notation like this, i.e. leaving out the summation and partial derivative symbols.

Half the battle, perhaps, is to rationalise your difficulties into these two categories: do you understand mathematically what you are doing, and can you interpret and work with the new notation?
Thanks for your time and help!
I will probably have to work in regular notation and then see how to condense into the new notation.

PeroK
Just some remarks.
1) ##b_{ij}x_j## is not a tensor; at least, this expression does not keep its shape under changes of variables.
2) The operation ##\partial/\partial x_i## takes tensors to non-tensors.
3) If only linear changes ##x_i=c_{ij}x'_j## are considered, then everything is OK.

Nickpga said:
Not really. Do you have any textbook recommendations? I have a lot of catching up to do.
Try Schaum's Outline of Tensor Calculus for an overview.


1. What is tensor differentiation?

Tensor differentiation is a mathematical concept used in the field of differential geometry to calculate the rate of change of a tensor field with respect to a given set of variables. It is an extension of traditional calculus, which deals with functions of real numbers, to functions of tensors.

2. What are tensors?

Tensors are mathematical objects that generalize scalars, vectors, and matrices: multidimensional arrays whose components transform in a definite way under changes of coordinate system. They are commonly used in physics, engineering, and data analysis.

3. How is tensor differentiation different from traditional differentiation?

Traditional differentiation deals with functions of real numbers, while tensor differentiation deals with functions of tensors. This means that the objects involved in tensor differentiation are multidimensional and can have multiple components, whereas in traditional differentiation the variables are one-dimensional.

4. What are some applications of tensor differentiation?

Tensor differentiation has a wide range of applications in fields such as physics, engineering, and machine learning. It is used to study the curvature of space-time in general relativity, to model the behavior of materials under stress in engineering, and to optimize neural networks in machine learning.

5. What are some common techniques used in tensor differentiation?

Some common techniques used in tensor differentiation include the chain rule, product rule, and quotient rule. Other techniques include the use of index notation, which simplifies the calculation of derivatives of tensors, and the use of metric tensors to transform between different coordinate systems.
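For instance, the product rule carries over to index notation as ##(fg)_{,k} = f_{,k}\,g + f\,g_{,k}##, which can be checked numerically (a plain-Python sketch with hypothetical example functions, not taken from the thread):

```python
import math

# Verify the product rule d/dx_k (f*g) = (df/dx_k)*g + f*(dg/dx_k)
# for two scalar functions of x = (x_1, x_2), using central differences.

def f(x): return x[0] ** 2 + x[1]        # hypothetical example function
def g(x): return math.sin(x[0]) * x[1]   # hypothetical example function

def d(fun, k, x, h=1e-6):
    """Central-difference approximation of d(fun)/dx_k at x."""
    xp = list(x); xp[k] += h
    xm = list(x); xm[k] -= h
    return (fun(xp) - fun(xm)) / (2 * h)

x = [0.8, 1.5]  # arbitrary evaluation point
for k in range(2):
    lhs = d(lambda y: f(y) * g(y), k, x)
    rhs = d(f, k, x) * g(x) + f(x) * d(g, k, x)
    assert abs(lhs - rhs) < 1e-5
print("product rule verified")
```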
