Understanding Divergence and Gradient in Vector Fields

Divergence measures how much a vector field spreads out from a point, resulting in a scalar field, while the gradient transforms a scalar field into a vector field. For the vector field F = x^2i + y^2j + z^2k, the correct divergence is ∇·F = 2x + 2y + 2z, not a vector. The gradient points in the direction of the fastest increase of a scalar function and its length represents the rate of increase. The Laplacian, denoted as ∇²f, is the result of applying divergence to the gradient of a scalar function, representing a key second-order differential operator. Understanding these concepts is essential for analyzing vector fields in physics and engineering.
asi123
What is the divergence? Is it just the partial derivatives?

Let's say I have a vector field: F = x^2 i + y^2 j + z^2 k. Is the divergence F = 2x i + 2y j + 2z k?

And if it is, then what is the gradient?
 


The divergence is taken of a vector field, while the gradient (assuming you mean grad) is taken of a scalar field. A related operation, the curl, is performed on a vector field.

So we have:
curl: vector field -> vector field
div: vector field -> scalar field
grad: scalar field -> vector field
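These type signatures can be checked symbolically. A minimal sketch using SymPy (my choice of tool, not mentioned in the thread), with a hypothetical rotating field F = (-y, x, 0) to show that curl maps a vector field to a vector field:

```python
import sympy as sp

x, y, z = sp.symbols('x y z')

# Hypothetical rotating field F = (-y, x, 0), chosen so the curl is nonzero
Fx, Fy, Fz = -y, x, sp.Integer(0)

# curl F = (dFz/dy - dFy/dz, dFx/dz - dFz/dx, dFy/dx - dFx/dy)
curl_F = (sp.diff(Fz, y) - sp.diff(Fy, z),
          sp.diff(Fx, z) - sp.diff(Fz, x),
          sp.diff(Fy, x) - sp.diff(Fx, y))
print(curl_F)  # three components in, three components out: vector field -> vector field
```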

I'm wondering: is there any defined operation that takes a scalar field to a scalar field?
 


asi123 said:
What is the divergence? Is it just the partial derivatives?

Let's say I have a vector field: F = x^2 i + y^2 j + z^2 k. Is the divergence F = 2x i + 2y j + 2z k?
No. The divergence of this vector field is the scalar function ##\nabla\cdot F = 2x + 2y + 2z##. The "##\cdot##" in that notation is to remind you of a dot product: the result is a scalar.
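To make the correction concrete, here is a quick symbolic check (using SymPy, which is my assumption for illustration; the thread itself names no software):

```python
import sympy as sp

x, y, z = sp.symbols('x y z')
F = (x**2, y**2, z**2)  # the field from the question

# Divergence: sum the partial of each component with respect to its own variable
div_F = sum(sp.diff(Fi, v) for Fi, v in zip(F, (x, y, z)))
print(div_F)  # 2*x + 2*y + 2*z -- a single scalar expression, not a vector
```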

And if it is, then what is the gradient?

The gradient is, in effect, the "opposite" of the divergence: it turns a scalar function into a vector field. At each point, ##\nabla f## points in the direction of fastest increase of f, and its length is the directional derivative in that direction.
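A sketch of the scalar-to-vector direction, again in SymPy (my assumption), using a hypothetical scalar field f = x² + y² + z² for illustration:

```python
import sympy as sp

x, y, z = sp.symbols('x y z')
f = x**2 + y**2 + z**2  # hypothetical scalar field, not from the thread

# Gradient: the vector of partial derivatives of f
grad_f = [sp.diff(f, v) for v in (x, y, z)]
print(grad_f)  # [2*x, 2*y, 2*z] -- one scalar in, a vector field out
```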

Notice that if you start with a scalar function, the gradient gives a vector field, and you can then apply the divergence to get back to a scalar function:
$$\nabla\cdot(\nabla f) = \nabla^2 f,$$
called the "Laplacian" of f. That is a very important operator: it is the simplest second-order differential operator that is invariant under rigid motions.
 


Got it, thanks.
 
