Orthogonal complement of gradient field?

  • #1
liguolong
I am doing research in probability. I have found a probability distribution of a random variable X on the n-dimensional unit sphere. Let b be a smooth, Lipschitz vector field mapping X to [itex]R^n[/itex]. I have also found that for every continuously differentiable function f mapping X to [itex]R[/itex], the expectation of [itex]\nabla f\cdot b[/itex] is zero. I have a strong feeling that this implies b(X)=0 with probability 1, but I am not sure how to prove it.
 
  • #2
liguolong said:
I am doing research in probability. I have found a probability distribution of a random variable X on the n-dimensional unit sphere. Let b be a smooth, Lipschitz vector field mapping X to [itex]R^n[/itex]. I have also found that for every continuously differentiable function f mapping X to [itex]R[/itex], the expectation of [itex]\nabla f\cdot b[/itex] is zero. I have a strong feeling that this implies b(X)=0 with probability 1, but I am not sure how to prove it.

Hey liguolong and welcome to the forums.

The fact that X is a random variable defined over the n-dimensional unit sphere means that X has a pdf in n variables, and the CDF is an integral over the surface of the sphere, with the limits of integration encoding the constraint of lying on that surface. This implies a continuous distribution, which is important to note later on.

Now you also have the constraint that the expectation of grad(f) . b is zero.

Now if you want to prove that b(X) = 0 with probability 1, you will need to prove two things: E[b(X)] = 0 and VAR[b(X)] = 0. If you prove these two things, then you will have shown that the only value b(X) can take is 0.

Now your grad(f) . b = 0 implies that b is perpendicular to the gradient of f for all f. I haven't done vector calculus in a long time, but I think that for this to happen we'd need grad(b) . b = 0, since f ranges over functions that include b. This would imply that f = 0. (I could be wrong, so please correct me if I am.)

So if f is the zero function, then b(X) maps every value to 0, hence E[b(X)] = 0 and VAR[b(X)] = 0, which means that P(b(X) = 0) should be one, as you suggested.

I'm interested if any of my reasoning is wrong and again I haven't taken vector calculus in a long time.
 
  • #3
Showing E[b(X)] = 0 is easy: just take f = x_i for each i. However, I am not sure how to prove Var[b(X)] = 0.

I am not sure what is meant by grad(b) . b = 0, since b is a vector field and grad is only defined for scalar functions.
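The f = x_i step can be written out explicitly (a sketch of the reasoning, not from the thread):

```latex
f(x) = x_i \;\implies\; \nabla f = e_i
\;\implies\; \mathbb{E}\left[\nabla f \cdot b(X)\right]
  = \mathbb{E}\left[e_i \cdot b(X)\right]
  = \mathbb{E}\left[b_i(X)\right] = 0
\quad \text{for each } i = 1, \dots, n,
```

so E[b(X)] = 0 componentwise.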
 
  • #4
liguolong said:
Showing E[b(X)] = 0 is easy: just take f = x_i for each i. However, I am not sure how to prove Var[b(X)] = 0.

I am not sure what is meant by grad(b) . b = 0, since b is a vector field and grad is only defined for scalar functions.

This is what I mean by grad(f)

http://en.wikipedia.org/wiki/Del#Gradient
 
  • #5
Ok, what I meant was that f doesn't include b, because b is a vector field: grad maps scalar functions to vector fields, but it doesn't map vector fields to anything. Or do you mean E[grad(b_i) . b] = 0? In that case I don't see why b = 0 follows.
 
  • #6
It's nearly midnight here so I'll respond tomorrow if need be.
 
  • #7
Ok, I was on the wrong track. I have found a counterexample, and what I was trying to prove makes no sense.
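The counterexample is not spelled out in the thread, but a standard one fits the setup: on the unit circle in [itex]R^2[/itex], take X uniform and b the unit tangent field b(x) = (-x_2, x_1). Then grad(f) . b is the derivative of f along the circle, whose average over a full loop vanishes for every smooth f, yet b is nowhere zero. A minimal numerical sketch (the function names are my own, not from the thread):

```python
import numpy as np

# X uniform on the unit circle, parametrized by theta.
theta = np.linspace(0.0, 2.0 * np.pi, 20001)[:-1]  # drop duplicate endpoint
x1, x2 = np.cos(theta), np.sin(theta)

# Tangential vector field b(x) = (-x2, x1); |b| = 1 everywhere, so b != 0.
b1, b2 = -x2, x1

def expectation_grad_f_dot_b(grad_f):
    """E[grad f(X) . b(X)] under the uniform distribution on the circle."""
    g1, g2 = grad_f(x1, x2)
    return np.mean(g1 * b1 + g2 * b2)

# A few smooth test functions f, given via their gradients.
tests = {
    "f = x1":        lambda x, y: (np.ones_like(x), np.zeros_like(y)),
    "f = x1*x2":     lambda x, y: (y, x),
    "f = exp(x1)":   lambda x, y: (np.exp(x), np.zeros_like(y)),
    "f = x1^2 - x2": lambda x, y: (2 * x, -np.ones_like(y)),
}

for name, grad in tests.items():
    val = expectation_grad_f_dot_b(grad)
    print(f"E[grad({name}) . b] = {val:.2e}")  # all ~ 0, though b is never 0
```

The point is that grad(f) . b here equals d/dtheta of f(cos theta, sin theta), and the integral of a derivative of a periodic function over one period is zero, so the hypothesis holds with a b that is never zero.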
 
  • #8
liguolong said:
Ok, what I meant was that f doesn't include b, because b is a vector field: grad maps scalar functions to vector fields, but it doesn't map vector fields to anything. Or do you mean E[grad(b_i) . b] = 0? In that case I don't see why b = 0 follows.

The reason for the statement above is that f must include b, since f ranges over all of your smooth functions and b is one of them. This was the motivation for my argument; from it I conjectured that b must be the zero function.
 

1. What is the definition of the orthogonal complement of a gradient field?

The orthogonal complement of a gradient field at a point is the set of all vectors perpendicular to the gradient vector there. In other words, it is the set of all vectors that have a zero dot product with the gradient vector at that point.

2. How is the orthogonal complement of a gradient field related to the divergence of the field?

The divergence of a gradient field is the Laplacian of the underlying scalar function, which is not zero in general; it is the curl of a gradient field that always vanishes. The orthogonal complement, by contrast, is defined pointwise: it consists of the vectors whose dot product with the gradient is zero at each point.
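A related fact worth checking: the curl of a gradient field always vanishes. A numerical spot-check with an arbitrarily chosen f (nothing here is from the thread):

```python
import numpy as np

# f(x, y, z) = x^2*y + z*sin(y); its analytic gradient:
def grad_f(p):
    x, y, z = p
    return np.array([2 * x * y, x**2 + z * np.cos(y), np.sin(y)])

def curl(field, p, h=1e-5):
    """Numerical curl of a 3D vector field at point p via central differences."""
    def d(i, j):  # partial of component i with respect to coordinate j
        e = np.zeros(3)
        e[j] = h
        return (field(p + e)[i] - field(p - e)[i]) / (2 * h)
    return np.array([d(2, 1) - d(1, 2),   # dGz/dy - dGy/dz
                     d(0, 2) - d(2, 0),   # dGx/dz - dGz/dx
                     d(1, 0) - d(0, 1)])  # dGy/dx - dGx/dy

p = np.array([0.7, -1.3, 2.1])
print(curl(grad_f, p))  # all three components ~ 0
```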

3. Can you give an example of a gradient field and its orthogonal complement?

One example of a gradient field is the electrostatic field of a point charge, which is the gradient of the Coulomb potential. At each point, its orthogonal complement is the plane tangent to the equipotential sphere through that point, i.e. the set of directions along which the potential does not change.

4. How do you find the orthogonal complement of a gradient field?

At a point where the gradient g is nonzero, the orthogonal complement is the hyperplane of vectors perpendicular to g. A vector in it can be obtained from any vector v by subtracting v's projection onto g, namely v - (v·g / g·g) g, which is perpendicular to g by construction. (Taking the cross product of g with another vector also produces a vector in the complement, but that trick only works in three dimensions and yields a single vector, not the whole complement.)
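At a single point, a vector in the orthogonal complement can be obtained by subtracting from any vector its projection onto the gradient; a short sketch (the helper name and the specific numbers are my own):

```python
import numpy as np

def complement_component(v, g):
    """Part of v lying in the orthogonal complement of the gradient g:
    subtract from v its projection onto g."""
    return v - (np.dot(v, g) / np.dot(g, g)) * g

g = np.array([1.0, 2.0, -2.0])   # gradient at some point (made up)
v = np.array([3.0, -1.0, 4.0])   # arbitrary vector

w = complement_component(v, g)
print(np.dot(w, g))  # ~ 0: w is perpendicular to the gradient
```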

5. What is the significance of the orthogonal complement of a gradient field in physics?

The orthogonal complement of a gradient field helps organize relationships between physical quantities. In the Helmholtz decomposition, for example, a vector field splits into a gradient (curl-free) part and a divergence-free part, and the two parts are orthogonal in the L^2 sense. Understanding this orthogonality is useful when solving problems involving vector fields, such as the question in this thread.
