Taylor expansion of a vector field

In summary, the conversation discusses whether approximating vector fields with Taylor expansions is possible and plausible. The first-order term involves the Jacobian, while the second-order term requires an operator that is bilinear in the displacement. The conversation also explores the use of the identity matrix (written as a resolution of the identity) to construct a tensor that plays the role of the second derivative for vector fields. The final approximation is written in terms of the Jacobian and the newly defined operator, H.
  • #1
Trifis
I was wondering if such an approximation is possible and plausible...

The first term would have to look something like this: [itex]\vec{f}(\vec{x}_{0}) + \textbf{J}_{\vec{f}}(\vec{x}_{0})\cdot(\vec{x}-\vec{x}_{0})[/itex]

No clue about the second term though...
We would have to calculate the Jacobian of the Jacobian (just as we take the Jacobian of the gradient to get the Hessian for the second-order term in the usual case of scalar fields f: ℝ[itex]^{n}[/itex]→ℝ), or something along those lines...
 
  • #2
The analogy for Taylor expansions of vector fields is most easily seen through directional derivatives.

[tex]f(r) = f(r_0) + (r - r_0) \cdot \nabla' f(r') |_{r' = r_0} + \frac{1}{2!} ([r - r_0] \cdot \nabla')^2 f(r') |_{r' = r_0} + \ldots[/tex]

But yes, the first-order term is the Jacobian and can be interpreted as a matrix operation. The second term is more complicated, though, because it's obviously quadratic in [itex]r - r_0[/itex]. So you would need some sort of operator that is linear in each of two arguments.
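
To see where that quadratic structure comes from, expand the squared directional derivative component-wise:

[tex]\left([r - r_0] \cdot \nabla'\right)^2 f(r') = \sum_{j,k} (r - r_0)_j (r - r_0)_k \, \frac{\partial^2 f(r')}{\partial x_j' \partial x_k'}[/tex]

so the second-order term is bilinear in the displacement [itex]r - r_0[/itex].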
 
  • #3
Yes indeed, a (1,2) tensor would do the job...

The question is of course whether such an operator can be defined to play the role of the second derivative for vector fields. I've had no luck finding one, either in my calculus books or on the internet, so far...
 
  • #4
You just do it component by component.
[tex]f^i(x) = f^i(x_0) + \left. \frac{\partial f^i}{\partial x^j}\right|_{x=x_0}(x^j - x_0^j)
+ \left. \frac{1}{2!} \frac{\partial^2 f^i}{\partial x^j \partial x^k}\right|_{x=x_0}(x^j - x_0^j)(x^k - x_0^k)
+ \cdots [/tex]

Repeated indices are summed.
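
A minimal numerical sketch of this component-wise formula (Python with NumPy; the example field, expansion point, and finite-difference step are my own illustrative choices, not from the thread):

[code]
import numpy as np

def taylor2(f, x0, x, h=1e-4):
    # Second-order Taylor approximation of a vector field f about x0, evaluated at x.
    # Derivatives are estimated with central finite differences of step h.
    n = len(x0)
    f0 = f(x0)
    m = len(f0)
    dx = x - x0
    e = np.eye(n)

    # Jacobian J[i, j] = df^i/dx^j
    J = np.zeros((m, n))
    for j in range(n):
        J[:, j] = (f(x0 + h * e[j]) - f(x0 - h * e[j])) / (2 * h)

    # Second-derivative tensor T[i, j, k] = d^2 f^i / (dx^j dx^k)
    T = np.zeros((m, n, n))
    for j in range(n):
        for k in range(n):
            T[:, j, k] = (f(x0 + h * (e[j] + e[k])) - f(x0 + h * (e[j] - e[k]))
                          - f(x0 - h * (e[j] - e[k])) + f(x0 - h * (e[j] + e[k]))) / (4 * h**2)

    # f(x) ~ f(x0) + J_ij dx^j + (1/2) T_ijk dx^j dx^k, repeated indices summed
    return f0 + J @ dx + 0.5 * np.einsum('ijk,j,k->i', T, dx, dx)

# Illustrative field f: R^2 -> R^2
f = lambda v: np.array([np.sin(v[0]) * v[1], v[0]**2 + np.exp(v[1])])
x0 = np.array([0.3, -0.2])
x = np.array([0.35, -0.15])
print(taylor2(f, x0, x))   # agrees with f(x) to ~|x - x0|^3
print(f(x))
[/code]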
 
  • #5
Sorry for the late reply...

Yes, this seems to be the right way to do this approximation. Thank you!
 
  • #6
Well, you could take that term from your first post:

[itex]F = \vec{\eta}\cdot\vec{\nabla}[/itex]

where I let

[itex]\vec{\eta} = \vec{x}-\vec{x}_0[/itex]

and multiply it by the Identity Matrix, [itex]I[/itex],

[itex]I = \sum_k{\left|k\right\rangle\left\langle k\right|}[/itex]

from the left and from the right:

[itex]D \equiv IFI = \sum_{i,j}{\left|i\right\rangle\left\langle i\right| \vec{\eta}\cdot\vec{\nabla} \left|j\right\rangle\left\langle j\right|}[/itex]
[itex]= \sum_{i,j}{\left|i\right\rangle\left\langle i\right| \eta_j\frac{\partial}{\partial x_j}}[/itex]

Notice that, then, the vector function [itex]\vec{f}\left(\vec{x}\right)[/itex] is written as:

[itex]\vec{f}\left(\vec{x}\right) = \sum_k{\left|k\right\rangle f_k\left(\vec{x}\right)}[/itex]

so that

[itex]D\vec{f} = \sum_{i,j,k}{\left|i\right\rangle\left\langle i\right| \eta_j\frac{\partial f_k}{\partial x_j} \left|k\right\rangle} = \sum_{i,j}{\eta_j\frac{\partial f_i}{\partial x_j} \left|i\right\rangle}[/itex]

Notice also that [itex]D^2[/itex] is, explicitly:

[itex]D^2 = \sum_i{\left(\vec{\eta}\cdot\vec{\nabla}\right)^2 \left|i\right\rangle\left\langle i\right|}[/itex]
[itex]= \sum_i{\left(\vec{\eta}\cdot\vec{\nabla}\right) \left(\vec{\eta}\cdot\vec{\nabla}\right) \left|i\right\rangle\left\langle i\right|}[/itex]
[itex]= \sum_{i,j,k}{\eta_j\partial _j\left(\eta_k\partial _k\right) \left|i\right\rangle\left\langle i\right|}[/itex]

Here [itex]\partial _i=\frac{\partial}{\partial x_i}[/itex], so that [itex]\partial _j\eta_i=\delta_{ij}[/itex] (Kronecker delta); thus:

[itex]D^2 = \sum_{i,j,k}{\eta_j\partial _j\left(\eta_k\partial _k\right) \left|i\right\rangle\left\langle i\right|}[/itex]
[itex]= \sum_{i,j,k}{\eta_j\left(\partial _j\eta_k\partial _k + \eta_k\partial _{jk}\right) \left|i\right\rangle\left\langle i\right|}[/itex]
[itex]= \sum_{i,j,k}{\eta_j\partial _j\eta_k\partial _k \left|i\right\rangle\left\langle i\right|} + \sum_{i,j,k}{\eta_j\eta_k\partial _{jk} \left|i\right\rangle\left\langle i\right|}[/itex]
[itex]= \sum_{i,j}{\eta_j\partial _j \left|i\right\rangle\left\langle i\right|} + \sum_{i,j,k}{\eta_j\eta_k\partial _{jk} \left|i\right\rangle\left\langle i\right|}[/itex]
[itex]= D + \sum_{i,j,k}{\eta_j\eta_k\partial _{jk} \left|i\right\rangle\left\langle i\right|}[/itex]

And finally, rearranging:

[itex]\sum_{i,j,k}{\eta_j\eta_k\partial _{jk} \left|i\right\rangle\left\langle i\right|} = \sum_{i,j,k}{\eta_j \eta_k \frac{\partial^2}{\partial x_j \partial x_k} \left|i\right\rangle\left\langle i\right|} = D^2-D = D(D-1)[/itex]

So you may define the operator [itex]H[/itex] (which I'll call [itex]H[/itex] because of its similarity to the Hessian matrix):

[itex]H \equiv D(D-1) = \sum_i{\left|i\right\rangle \vec{\eta}\cdot\vec{\nabla} \left(\vec{\eta}\cdot\vec{\nabla}-1\right)\left\langle i\right|} = \sum_{i,j,k}{\eta_j \eta_k \frac{\partial^2}{\partial x_j \partial x_k} \left|i\right\rangle\left\langle i\right|}[/itex]

so that:

[itex]H\vec{f} = \sum_{i,j,k,l}{\eta_j \eta_k \frac{\partial^2 f_l}{\partial x_j \partial x_k} \left| i \right\rangle \left\langle i \right|\left. l \right\rangle} = \sum_{i,j,k}{\eta_j \eta_k \frac{\partial^2 f_i}{\partial x_j \partial x_k} \left| i \right\rangle}[/itex]

because [itex]\left\langle i \right|\left. l \right\rangle=\delta_{il}[/itex].

Now, your expansion may be written as:

[itex]\vec{f}\left(\vec{x}_0+\vec{\eta}\right) = \vec{f}\left(\vec{x}_0\right) + D\left.\vec{f}\right|_{\vec{x}=\vec{x}_0} + \frac{1}{2}H\left.\vec{f}\right|_{\vec{x}=\vec{x}_0} + O\left(\left|\vec{\eta}\right|^3\right)[/itex]
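
As a sanity check on the operator algebra above, here is a short symbolic sketch in Python (SymPy) verifying the identity [itex]D^2 = D + H[/itex]; the example field and expansion point are my own illustrative choices:

[code]
import sympy as sp

# Verify D^2 f = D f + H f, where D = (eta . grad) with eta = x - x0,
# and (H f)_i = sum_{j,k} eta_j eta_k d^2 f_i / (dx_j dx_k).
x1, x2, c1, c2 = sp.symbols('x1 x2 c1 c2')   # (c1, c2) is the expansion point x0
xs = [x1, x2]
eta = [x1 - c1, x2 - c2]
f = sp.Matrix([sp.sin(x1) * x2, x1**2 + sp.exp(x2)])   # illustrative field

def D(g):
    # Apply the directional-derivative operator (eta . grad) to each component of g
    return sp.Matrix([sum(eta[j] * sp.diff(gi, xs[j]) for j in range(2)) for gi in g])

Hf = sp.Matrix([sum(eta[j] * eta[k] * sp.diff(fi, xs[j], xs[k])
                    for j in range(2) for k in range(2)) for fi in f])

print(sp.simplify(D(D(f)) - D(f) - Hf))   # prints Matrix([[0], [0]])
[/code]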
 

1. What is the purpose of the Taylor expansion of a vector field?

The Taylor expansion of a vector field is a mathematical tool for approximating a vector field near a specific point by a polynomial series built from the field's derivatives at that point. This gives a local representation of the field that can be used to make predictions or solve problems in areas such as physics, engineering, and economics.

2. How is the Taylor expansion of a vector field calculated?

The Taylor expansion of a vector field is calculated by evaluating the derivatives of the vector field at a specific point and using them as the coefficients of a polynomial series. The resulting infinite series can be truncated at whatever order gives the desired degree of accuracy.
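
As a concrete illustration (my own example): for [itex]\vec{f}(x,y) = (xy,\; x + y^2)[/itex] expanded about the origin, [itex]\vec{f}(0,0) = (0,0)[/itex] and the Jacobian at the origin is [itex]\begin{pmatrix} 0 & 0 \\ 1 & 0 \end{pmatrix}[/itex], so the first-order approximation is [itex]\vec{f}(x,y) \approx (0,\; x)[/itex].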

3. What are some applications of the Taylor expansion of a vector field?

The Taylor expansion of a vector field has many applications in various fields of science and engineering. It can be used to approximate solutions to differential equations, predict the behavior of systems, and analyze the behavior of physical phenomena.

4. Can the Taylor expansion of a vector field be used for non-linear vector fields?

Yes, the Taylor expansion of a vector field can be used for non-linear vector fields. However, the higher-order terms in the expansion may become complicated and difficult to calculate, so in practice the series is often truncated early, which makes the approximation less accurate for strongly non-linear fields.

5. Are there any limitations to the Taylor expansion of a vector field?

Yes, there are some limitations to the Taylor expansion of a vector field. The accuracy of the approximation depends on the smoothness of the vector field and its derivatives. Additionally, the expansion may fail to converge, or converge only in a limited region, if the vector field has a singularity or discontinuity at or near the point of expansion.
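
A standard one-dimensional illustration: the scalar expansion [itex]f(x) = \frac{1}{1-x} = 1 + x + x^2 + \cdots[/itex] about [itex]x = 0[/itex] converges only for [itex]|x| < 1[/itex], because of the singularity at [itex]x = 1[/itex].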
