Random Unit Vector Angle Difference


Discussion Overview

The discussion revolves around the properties of differences between random angles generated uniformly from the interval [0, 2π]. Participants explore the resulting distribution of these differences, which appears to be non-uniform and monotonically decreasing, raising questions about the underlying reasons for this behavior.

Discussion Character

  • Exploratory
  • Technical explanation
  • Mathematical reasoning
  • Debate/contested

Main Points Raised

  • One participant notes that simulating random angles from a uniform distribution leads to a non-uniform distribution of angle differences, which seems counterintuitive.
  • Another participant suggests that the issue may stem from the use of different coordinate systems, particularly between Cartesian and polar coordinates, and asks for clarification on how the 2D vectors are generated.
  • A participant clarifies that they are generating angles using a uniform random number generator and taking the differences between them, rather than treating them as vectors computationally.
  • It is pointed out that when choosing two numbers in the interval [0, 2π], the differences cannot be close to 2π unless one number is near 0 and the other near 2π.
  • Some participants discuss the implications of sampling from a real interval, noting that the difference of two uniform samples is itself non-uniform.
  • A later reply introduces a mathematical approach to compute the cumulative distribution function (CDF) of the absolute difference between two independent uniform random variables, indicating that the resulting distribution is triangular.
  • Another participant provides a detailed derivation of the density function for the difference of two uniform random variables, showing that the density is highest near 0 and falls linearly to zero at the maximum difference.

Areas of Agreement / Disagreement

Participants initially express uncertainty about the reason for the non-uniform distribution of angle differences, and several competing interpretations are floated; later replies resolve the question with explicit derivations of the triangular distribution.

Contextual Notes

Participants initially raise the potential influence of coordinate systems, but the original poster clarifies that only scalar angles are sampled, so the question reduces to differences of uniform samples on a real interval.

DuckAmuck
I am simulating random angles from 0 to 2π with a uniform distribution. However, if I take the differences between random angles, I get a non-uniform (monotonically decreasing) distribution of angles.

In math speak:
##A_i \sim \text{uniform}(0, 2\pi)##
##dA = A_i - A_j##
##dA## is not uniform.

Here is a rough image of what I'm seeing. P is probability density:
[Attached plot: probability density P of dA, decreasing monotonically]
This does not make sense to me, as it seems to imply that the difference between random angles is more likely to be 0 than to be non-zero. You would think it would be uniform: one angle can be viewed as the *zero* and the other as the random angle, so dA seems like it should also be uniform. What is going on here?
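The setup described above can be reproduced with a short simulation (a sketch assuming NumPy; the sample size and bin count are arbitrary choices, and absolute differences are taken for simplicity):

```python
# Sketch of the described simulation: uniform angles in [0, 2*pi],
# then the distribution of their absolute differences.
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
a_i = rng.uniform(0.0, 2 * np.pi, n)
a_j = rng.uniform(0.0, 2 * np.pi, n)
d_a = np.abs(a_i - a_j)  # absolute difference, lies in [0, 2*pi]

# Bin the differences; compare the counts near 0 with the counts near 2*pi.
hist, edges = np.histogram(d_a, bins=20, range=(0.0, 2 * np.pi))
print(hist[0], hist[-1])  # counts near 0 far exceed counts near 2*pi
```

The histogram falls off steadily from left to right, which is exactly the monotonically decreasing shape shown in the attached plot.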
 

I have a feeling there's a problem with different coordinate systems (Cartesian vs. polar in particular) here and what it means to be "uniform at random". Can you explain how you are generating these 2-d vectors?

The standard approach in rectangular coordinates, for uniform at random sampling is to assume WLOG that your first vector is ##c \mathbf e_1## i.e. the 1st standard basis vector (with ##c\gt0## to normalize as needed)... i.e. for any first vector sampled you can select an orthonormal basis / set your rectangular coordinate system with it as an axis. Then sample your second vector and tease out the angle with an inner product.
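That recipe can be sketched as follows (assuming NumPy; the helper name is illustrative). A normalized 2D standard normal gives a direction uniform on the circle, and the angle is recovered from the inner product:

```python
# Fix the first vector as c*e1 (WLOG), sample a second uniformly-random
# direction, and tease out the angle via the inner product.
import numpy as np

rng = np.random.default_rng(1)

def random_unit_vector():
    # A 2D standard normal, normalized, is uniform on the unit circle.
    v = rng.normal(size=2)
    return v / np.linalg.norm(v)

e1 = np.array([1.0, 0.0])                       # first vector along e1
u = random_unit_vector()                        # second sampled direction
angle = np.arccos(np.clip(e1 @ u, -1.0, 1.0))   # angle in [0, pi]
print(angle)
```

Note that the inner-product angle lands in ##[0, \pi]##, not ##[0, 2\pi]##, which is one way the vector picture differs from subtracting raw angle samples.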
 
The vectors aren't *really* vectors computationally. I'm just generating angles using a uniform random number generator. Then taking the differences between them.
 
DuckAmuck said:
The vectors aren't *really* vectors computationally. I'm just generating angles using a uniform random number generator. Then taking the differences between them.

If you choose two numbers in an interval, ##[0, 2\pi]## in this case, then unless one number is close to ##0## and the other number is close to ##2\pi##, you can't get a difference close to ##2\pi##.

Also, in principle, your new distribution could be on the interval ##[-2\pi, 2\pi]## depending on how you measure the difference.
 
DuckAmuck said:
The vectors aren't *really* vectors computationally. I'm just generating angles using a uniform random number generator. Then taking the differences between them.
Got it -- so it's sampling from a real interval ##[0,2\pi]## uniformly at random. Up to rescaling, we could just call it ##[0,1]## and ignore any mention of angles, right?
 
Silly me, yes you can just forget about it being angles. Uniform distribution sample - uniform distribution sample = non-uniform sample. Still not sure why this is.
 
DuckAmuck said:
Silly me, yes you can just forget about it being angles. Uniform distribution sample - uniform distribution sample = non-uniform sample. Still not sure why this is.

Suppose we had a bet. You bet on a difference of ##3\pi/2## and I bet on ##\pi/4##. For you to win, your first number must be in the range ##< \pi/2## or ##> 3\pi/2##. That's only a 50% chance. But, my first number could be anywhere and I'm still in the running.

Also, for the second number, you only have one possibility. If your first number is low, your second number must be high; or vice versa. Whereas, I've got a good chance of having two possibilities, one higher and one lower than my first number.
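The bet above is easy to check by Monte Carlo (a sketch assuming NumPy; the window half-width is an arbitrary choice). Differences near ##\pi/4## should come up several times more often than differences near ##3\pi/2##:

```python
# Monte Carlo check of the bet: count differences landing in a small
# window around pi/4 versus a same-sized window around 3*pi/2.
import numpy as np

rng = np.random.default_rng(2)
n = 1_000_000
diff = np.abs(rng.uniform(0, 2 * np.pi, n) - rng.uniform(0, 2 * np.pi, n))

eps = 0.05  # half-width of the window around each target difference
near_small = np.sum(np.abs(diff - np.pi / 4) < eps)
near_large = np.sum(np.abs(diff - 3 * np.pi / 2) < eps)
print(near_small, near_large)  # the pi/4 bet wins by roughly a factor of 3.5
```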
 
so you want the distribution of ##\big \vert U_1 - U_2\big \vert##

this is a classic problem of sketching things out -- i.e. draw a square with corners ##[0,0], [0,1], [1,0],[1,1]## and draw a line from [0,0] to [1,1] (call it the diagonal) -- you are looking at the (symmetric) result of going from the diagonal to one of the vertical sides of your box.

- - - - -
edit:
(re-done, to cleanup the CDF approach)
My suggested approach to get the CDF of ##V := \big \vert U_1 - U_2\big \vert##
##U_1, U_2## are both iid uniform r.v.'s in [0,1]

we want to compute
##F_V(c)= P\Big(\big \vert U_1 - U_2\big \vert \leq c\Big)##

but instead consider the complementary CDF given by
##\bar{F}_V(c) = 1 - F_V(c) = 1- P\Big(\big \vert U_1 - U_2\big \vert \leq c\Big)##
but in terms of the underlying events,
##\bar{F}_V(c) = P\Big(\big \vert U_1 - U_2\big \vert \gt c\Big) = P\Big( U_1 - U_2 \gt c\Big) + P\Big( U_1 - U_2 \lt -c\Big) = P\Big( U_1 - U_2 \gt c\Big) + P\Big( U_1 - U_2 \leq -c\Big)##
where mutually exclusive events add, and then the strictness of the inequality can be ignored due to zero probability of a tie. So we need
##(\text{i}) P\Big( U_1 - U_2 \gt c\Big)##
##(\text{ii}) P\Big( U_1 - U_2 \leq -c\Big)##

for (i)
##P\Big( U_1 - U_2 \gt c\Big) = P\Big( U_1 \gt U_2 + c\Big) = 1 - P\Big( U_1 \leq U_2 + c\Big)##
but
##P\Big( U_1 \leq U_2 + c\Big) = \big(\int_0^{1-c} F_{U_1}(u_2 + c)\cdot dF_{U_2}(u_2) \big)+ \int_{1-c}^1 1 \cdot dF_{U_2}(u_2) = \big(\int_0^{1-c} F_{U_1}(u_2 + c)\cdot d u_2\big)+ c ##

for (ii)
##P\Big( U_1 - U_2 \leq -c\Big) = P\Big(U_1 \leq U_2 -c\Big) = \big(\int_0^c 0 \cdot dF_{U_2}\big) + \int_c^1 F_{U_1}(u_2 - c) dF_{U_2} = \int_c^1 F_{U_1}(u_2 - c) d u_2 ##
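Completing the integrals with ##F_{U_1}(x) = x## on ##[0,1]## gives ##(\text{i}) = (1-c)^2/2## and ##(\text{ii}) = (1-c)^2/2##, so ##\bar{F}_V(c) = (1-c)^2## and ##F_V(c) = 2c - c^2##. A quick Monte Carlo sketch (assuming NumPy; sample size and test points are arbitrary) agrees:

```python
# Check F_V(c) = 1 - (1 - c)^2 for V = |U_1 - U_2| against an
# empirical CDF estimate from simulated uniforms on [0, 1].
import numpy as np

rng = np.random.default_rng(3)
n = 1_000_000
v = np.abs(rng.uniform(size=n) - rng.uniform(size=n))

for c in (0.1, 0.5, 0.9):
    empirical = np.mean(v <= c)
    analytic = 1 - (1 - c) ** 2
    print(c, empirical, analytic)  # empirical and analytic values closely agree
```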
 
DuckAmuck said:
I am simulating random angles from 0 to 2π with a uniform distribution. However, if I take the differences between random angles, I get a non-uniform (monotonically decreasing) distribution of angles.

In math speak:
##A_i \sim \text{uniform}(0, 2\pi)##
##dA = A_i - A_j##
##dA## is not uniform.

Here is a rough image of what I'm seeing. P is probability density:
[Attached plot: probability density P of dA, decreasing monotonically]
This does not make sense to me, as it seems to imply that the difference between random angles is more likely to be 0 than to be non-zero. You would think it would be uniform: one angle can be viewed as the *zero* and the other as the random angle, so dA seems like it should also be uniform. What is going on here?

It is easy enough to work out the distribution of the difference ##A_i - A_j## or ##|A_i - A_j|.## As other responders have done, let us change the problem to one of uniform distributions over ##[0,1].## If ##X_1## and ##X_2## are independent and Unif(0,1), the density of their difference ##D = X_1 - X_2## is far from uniform. In fact, ##Y = D+1## is "familiar", because ##Y = X_1 + (1-X_2) = X_1 + X_2'##, where ##X_2' = 1-X_2## is independent of ##X_1## and has distribution Unif(0,1). Thus, ##Y## has the distribution of a sum of uniforms, so has a triangular density function. To get the density function of ##D## we need only shift that of ##Y## by one unit to the left, so the density function of ##D## is
$$f_D(d) = \begin{cases}1+d,& -1 \leq d \leq 0\\
1-d,& 0 \leq d \leq 1 \\
0 & \text{otherwise}
\end{cases}
$$
The density of ##M = |D|## is
$$ f_M(m) = f_D(m) + f_D(-m) = \begin{cases}2(1-m) & 0 \leq m \leq 1\\
0 & \text{otherwise}
\end{cases}$$
So ##|X_1-X_2|## does, indeed, have a downward-sloping density, highest near 0 and dropping to 0 near 1.

For more about the "triangular" distribution of a sum, just Google "distribution of a sum of uniform random variables".
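The two densities derived above can be checked with a quick histogram sketch (assuming NumPy; sample size and bin count are arbitrary):

```python
# Compare the empirical density of M = |X_1 - X_2| against the
# derived formula f_M(m) = 2*(1 - m) on [0, 1].
import numpy as np

rng = np.random.default_rng(4)
n = 1_000_000
m = np.abs(rng.uniform(size=n) - rng.uniform(size=n))

# Normalized histogram heights at bin midpoints vs the formula 2*(1 - m):
hist, edges = np.histogram(m, bins=10, range=(0.0, 1.0), density=True)
mids = 0.5 * (edges[:-1] + edges[1:])
print(np.max(np.abs(hist - 2 * (1 - mids))))  # small: histogram matches 2(1-m)
```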
 
