Random Unit Vector Angle Difference

(Original poster)
I am simulating random angles from 0 to 2π with a uniform distribution. However, if I take the differences between random angles, I get a non-uniform (monotonically decreasing) distribution of angles.

In math speak:
##A_i \sim \text{uniform}(0, 2\pi)##
##dA = A_i - A_j##
##dA## is not uniform.

Here is a rough image of what I'm seeing. P is probability density:
[attached plot: the density of dA is highest at 0 and decreases monotonically toward 2π]



This does not make sense to me, as it seems to imply that the difference between random angles is more likely to be near 0 than non-zero. You would think it would be uniform, as one angle can be viewed as the *zero* and the other as the random angle, so dA seems like it should also be uniform. What is going on here?
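A minimal NumPy sketch of the kind of thing I'm doing (variable names are just illustrative); the histogram of the absolute differences comes out tallest near 0 and falls off toward 2π:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# draw pairs of angles uniformly on [0, 2*pi)
a_i = rng.uniform(0.0, 2.0 * np.pi, size=n)
a_j = rng.uniform(0.0, 2.0 * np.pi, size=n)

# absolute difference between the random angles
dA = np.abs(a_i - a_j)

# crude text histogram of the empirical density: tall near 0, ~0 near 2*pi
counts, edges = np.histogram(dA, bins=20, density=True)
for density, lo in zip(counts, edges[:-1]):
    print(f"{lo:5.2f}  {'#' * int(100 * density)}")
```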
 


StoneTemplePython

Science Advisor
Gold Member
I have a feeling there's a problem with different coordinate systems (Cartesian vs polar in particular) here and what it means to be "uniform at random". Can you explain how you are generating these 2-d vectors?

The standard approach in rectangular coordinates for uniform-at-random sampling is to assume WLOG that your first vector is ##c \mathbf e_1##, i.e. the 1st standard basis vector (with ##c\gt0## to normalize as needed); that is, for any first vector sampled you can select an orthonormal basis / set your rectangular coordinate system with it as an axis. Then sample your second vector and tease out the angle with an inner product.
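A minimal sketch of that idea (assuming NumPy; the normalize-a-Gaussian trick is one standard way to get uniform-at-random directions and is an assumption here, not something specified above):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# uniform-at-random 2-d unit vectors: normalize standard normal draws
v1 = rng.standard_normal((n, 2))
v1 /= np.linalg.norm(v1, axis=1, keepdims=True)
v2 = rng.standard_normal((n, 2))
v2 /= np.linalg.norm(v2, axis=1, keepdims=True)

# the (unsigned) angle between the two sampled vectors, teased out
# with the inner product
cos_theta = np.clip(np.sum(v1 * v2, axis=1), -1.0, 1.0)
theta = np.arccos(cos_theta)  # lies in [0, pi]

# empirical density is roughly flat at 1/pi over [0, pi]
print(np.histogram(theta, bins=10, density=True)[0])
```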
 
(Original poster)
The vectors aren't *really* vectors computationally. I'm just generating angles using a uniform random number generator. Then taking the differences between them.
 

PeroK

Science Advisor
Homework Helper
Insights Author
Gold Member
2018 Award
The vectors aren't *really* vectors computationally. I'm just generating angles using a uniform random number generator. Then taking the differences between them.
If you choose two numbers in an interval, ##[0, 2\pi]## in this case, then unless one number is close to ##0## and the other number is close to ##2\pi##, you can't get a difference close to ##2\pi##.

Also, in principle, your new distribution could be on the interval ##[-2\pi, 2\pi]## depending on how you measure the difference.
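A quick numerical illustration of both points (a sketch, assuming the same kind of uniform sampling the OP described):

```python
import numpy as np

rng = np.random.default_rng(0)
a_i = rng.uniform(0.0, 2.0 * np.pi, size=100_000)
a_j = rng.uniform(0.0, 2.0 * np.pi, size=100_000)

signed = a_i - a_j         # spans (-2*pi, 2*pi), peaked at 0
unsigned = np.abs(signed)  # spans [0, 2*pi), tallest at 0

print(signed.min(), signed.max())      # close to -2*pi and +2*pi
print(unsigned.min(), unsigned.max())  # close to 0 and 2*pi
```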
 

StoneTemplePython

Science Advisor
Gold Member
The vectors aren't *really* vectors computationally. I'm just generating angles using a uniform random number generator. Then taking the differences between them.
Got it -- so it's sampling from a real interval ##[0,2\pi]## uniformly at random. Up to rescaling, we could just call it ##[0,1]## and ignore any mention of angles, right?
 
(Original poster)
Silly me, yes you can just forget about it being angles. Uniform distribution sample - uniform distribution sample = non-uniform sample. Still not sure why this is.
 

PeroK

Science Advisor
Homework Helper
Insights Author
Gold Member
2018 Award
Silly me, yes you can just forget about it being angles. Uniform distribution sample - uniform distribution sample = non-uniform sample. Still not sure why this is.
Suppose we had a bet. You bet on a difference of ##3\pi/2## and I bet on ##\pi/4##. For you to win, your first number must be in the range ##< \pi/2## or ##> 3\pi/2##. That's only a 50% chance. But, my first number could be anywhere and I'm still in the running.

Also, for the second number, you only have one possibility. If your first number is low, your second number must be high; or vice versa. Whereas, I've got a good chance of having two possibilities, one higher and one lower than my first number.
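A Monte Carlo version of the bet (a rough sketch; the ±0.1 tolerance for "hitting" a target difference is arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
dA = np.abs(rng.uniform(0, 2 * np.pi, n) - rng.uniform(0, 2 * np.pi, n))

tol = 0.1  # arbitrary window around each target difference
p_large = np.mean(np.abs(dA - 3 * np.pi / 2) < tol)  # bet on 3*pi/2
p_small = np.mean(np.abs(dA - np.pi / 4) < tol)      # bet on pi/4

print(p_small / p_large)  # the pi/4 bet comes up about 3.5x as often
```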
 

StoneTemplePython

Science Advisor
Gold Member
So you want the distribution of ##\big \vert U_1 - U_2\big \vert##.

This is a classic problem of sketching things out -- i.e. draw a rectangle with corners ##[0,0], [0,1], [1,0], [1,1]## and draw a line from ##[0,0]## to ##[1,1]## (call it the diagonal) -- you are looking at the (symmetric) result of going from the diagonal to one of the vertical sides of your box.
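Carrying that picture one step further (a quick sketch of the area computation it suggests, for ##0 \leq c \leq 1##): the event ##\big\vert U_1 - U_2 \big\vert \gt c## is the union of two corner triangles of the square, each with legs of length ##1-c##, so
$$P\Big(\big\vert U_1 - U_2\big\vert \gt c\Big) = 2\cdot\tfrac{1}{2}(1-c)^2 = (1-c)^2$$
which already pins down the CDF derived more carefully below.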

- - - - -
edit:
(re-done, to clean up the CDF approach)
My suggested approach to get the CDF of ##V := \big \vert U_1 - U_2\big \vert##
##U_1, U_2## are both iid uniform r.v.'s in [0,1]

we want to compute
##F_V(c)= P\Big(\big \vert U_1 - U_2\big \vert \leq c\Big)##

but instead consider the complementary CDF given by
##\bar{F}_V(c) = 1 - F_V(c) = 1- P\Big(\big \vert U_1 - U_2\big \vert \leq c\Big)##
but in terms of underlying events,
##\bar{F}_V(c) = P\Big(\big \vert U_1 - U_2\big \vert \gt c\Big) = P\Big( U_1 - U_2 \gt c\Big) + P\Big( U_1 - U_2 \lt -c\Big) = P\Big( U_1 - U_2 \gt c\Big) + P\Big( U_1 - U_2 \leq -c\Big)##
where mutually exclusive events add, and then the strictness of the inequality can be ignored due to zero probability of a tie. So we need
##(\text{i}) P\Big( U_1 - U_2 \gt c\Big)##
##(\text{ii}) P\Big( U_1 - U_2 \leq -c\Big)##

for (i)
##P\Big( U_1 - U_2 \gt c\Big) = P\Big( U_1 \gt U_2 + c\Big) = 1 - P\Big( U_1 \leq U_2 + c\Big)##
but
##P\Big( U_1 \leq U_2 + c\Big) = \big(\int_0^{1-c} F_{U_1}(u_2 + c)\cdot dF_{U_2}(u_2) \big)+ \int_{1-c}^1 1 \cdot dF_{U_2}(u_2) = \big(\int_0^{1-c} F_{U_1}(u_2 + c)\cdot d u_2\big)+ c ##

for (ii)
##P\Big( U_1 - U_2 \leq -c\Big) = P\Big(U_1 \leq U_2 -c\Big) = \big(\int_0^c 0 \cdot dF_{U_2}\big) + \int_c^1 F_{U_1}(u_2 - c) dF_{U_2} = \int_c^1 F_{U_1}(u_2 - c) d u_2 ##
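For completeness, evaluating those two integrals with ##F_{U_1}(x) = x## on ##[0,1]## (and taking ##0 \leq c \leq 1##):

for (i): ##\int_0^{1-c} (u_2 + c)\, du_2 = \tfrac{1}{2}\big[(u_2+c)^2\big]_0^{1-c} = \tfrac{1-c^2}{2}##, so ##P\Big(U_1 \leq U_2 + c\Big) = \tfrac{1-c^2}{2} + c## and ##P\Big(U_1 - U_2 \gt c\Big) = \tfrac{(1-c)^2}{2}##

for (ii): ##\int_c^1 (u_2 - c)\, du_2 = \tfrac{(1-c)^2}{2}##

Adding the two pieces, ##\bar{F}_V(c) = (1-c)^2##, so ##F_V(c) = 1 - (1-c)^2## and the density is ##f_V(c) = 2(1-c)##: largest at ##c=0## and dropping to zero at ##c=1##, which is exactly the decreasing shape in the original plot.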
 

Ray Vickson

Science Advisor
Homework Helper
Dearly Missed
I am simulating random angles from 0 to 2π with a uniform distribution. However, if I take the differences between random angles, I get a non-uniform (monotonically decreasing) distribution of angles.

In math speak:
##A_i \sim \text{uniform}(0, 2\pi)##
##dA = A_i - A_j##
##dA## is not uniform.

Here is a rough image of what I'm seeing. P is probability density:
[attached plot: the density of dA is highest at 0 and decreases monotonically toward 2π]


This does not make sense to me, as it seems to imply that the difference between random angles is more likely to be near 0 than non-zero. You would think it would be uniform, as one angle can be viewed as the *zero* and the other as the random angle, so dA seems like it should also be uniform. What is going on here?
It is easy enough to work out the distribution of the difference ##A_i - A_j## or ##|A_i - A_j|.## As other responders have done, let us change the problem to one of uniform distributions over ##[0,1].## If ##X_1## and ##X_2## are independent and Unif(0,1), the density of their difference ##D = X_1 - X_2## is far from uniform. In fact, ##Y = D+1## is "familiar", because ##Y = X_1 + (1-X_2) = X_1 + X_2'##, where ##X_2' = 1-X_2## is independent of ##X_1## and has distribution Unif(0,1). Thus, ##Y## has the distribution of a sum of uniforms, so it has a triangular density function. To get the density function of ##D## we need only shift that of ##Y## by one unit to the left, so the density function of ##D## is
$$f_D(d) = \begin{cases}1+d,& -1 \leq d \leq 0\\
1-d,& 0 \leq d \leq 1 \\
0 & \text{otherwise}
\end{cases}
$$
The density of ##M = |D|## is
$$ f_M(m) = f_D(m) + f_D(-m) = \begin{cases}2(1-m) & 0 \leq m \leq 1\\
0 & \text{otherwise}
\end{cases}$$
So ##|X_1-X_2|## does, indeed, have a downward-sloping density, highest near 0 and dropping to 0 near 1.

For more about the "triangular" distribution of a sum, just Google "distribution of a sum of uniform random variables".
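As a quick sanity check, here is a small NumPy sketch comparing an empirical histogram of ##|X_1 - X_2|## against the density ##2(1-m)## above (the bin count is arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
x1 = rng.uniform(size=1_000_000)
x2 = rng.uniform(size=1_000_000)
m = np.abs(x1 - x2)

# empirical density vs. the closed form f_M(m) = 2*(1 - m)
counts, edges = np.histogram(m, bins=10, density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
for c, est in zip(centers, counts):
    print(f"m={c:4.2f}  empirical={est:5.3f}  predicted={2 * (1 - c):5.3f}")
```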
 
