Symbolic computation of gradient

Summary
The discussion focuses on finding efficient symbolic methods to compute gradients with respect to vectors without breaking down the computation into individual components. A specific example is provided, demonstrating that using symbolic manipulation can simplify the gradient calculation of functions involving vector norms. The conversation highlights that expressing the norm in terms of its components can lead to quicker results. Participants confirm that these symbolic approaches can yield correct and efficient outcomes. Overall, the thread emphasizes the value of symbolic computation shortcuts in gradient evaluation.
v0id
I'm wondering if there are any convenient symbolic "shortcuts" (i.e. abuse of notation) that enable one to compute the gradient with respect to a certain vector, without decomposing the computation into the vector's individual elements and differentiating with respect to each element. For example:
\nabla_x \left( \frac{1}{|{\bf x}' - {\bf x}|} \right) = \frac{{\bf x}' - {\bf x}}{|{\bf x}' - {\bf x}|^3}
Besides the obvious method of evaluating \frac{\partial}{\partial x_1}, \frac{\partial}{\partial x_2}, and so on separately, is there a faster method of symbolic computation?
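As a quick numerical sanity check of the stated identity, one can compare a central-difference gradient against the closed form at an arbitrary point. This is a minimal sketch using NumPy; the sample points are chosen purely for illustration.

```python
import numpy as np

# Arbitrary sample points for the check (not from the original post)
x = np.array([1.0, 2.0, 3.0])
xp = np.array([4.0, 0.0, -1.0])

f = lambda x: 1.0 / np.linalg.norm(xp - x)  # f(x) = 1/|x' - x|

# Central-difference gradient with respect to x
h = 1e-6
num_grad = np.array([
    (f(x + h * e) - f(x - h * e)) / (2 * h)
    for e in np.eye(3)
])

# Claimed closed form: (x' - x)/|x' - x|^3
closed_form = (xp - x) / np.linalg.norm(xp - x) ** 3

assert np.allclose(num_grad, closed_form, atol=1e-6)
```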
 
If you write the norm as
|x| = \sqrt{x^T x},
then symbolic manipulation (the chain rule applied to the quadratic form) takes you straight to the final expression:

\nabla_x \left( (x'-x)^{T}(x'-x) \right)^{-1/2} = \left( (x'-x)^{T}(x'-x) \right)^{-3/2} (x'-x)
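This component-free derivation can be verified symbolically. Below is a sketch with SymPy in three dimensions (the symbol names are my own choices): the gradient is computed component-wise and compared against the closed form (x' - x)/|x' - x|^3.

```python
import sympy as sp

# Components of x and x' (names chosen for illustration)
x1, x2, x3, xp1, xp2, xp3 = sp.symbols("x1 x2 x3 xp1 xp2 xp3", real=True)

d = sp.Matrix([xp1 - x1, xp2 - x2, xp3 - x3])  # x' - x
r = sp.sqrt(d.dot(d))                          # |x' - x| = sqrt((x'-x)^T (x'-x))
f = 1 / r                                      # 1/|x' - x|

# Component-wise gradient with respect to x
grad = sp.Matrix([sp.diff(f, v) for v in (x1, x2, x3)])

# Closed form from the chain-rule derivation: (x' - x)/|x' - x|^3
closed_form = d / r**3

assert sp.simplify(grad - closed_form) == sp.zeros(3, 1)
```

The chain-rule shortcut and the brute-force component computation agree, which is exactly the point of the thread.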
 
Yes, that works very well. Thanks a lot.
 
