Ayame17
I'm currently working on my final year project, and one of the smaller tasks is to check whether certain data points fall within a circle of my own defined radius and central co-ordinates. I've been given the equation to use:
[tex]d = \frac{\sqrt{(l-l_{0})^2 + (b-b_{0})^2}}{r}[/tex]
where [tex]l_{0}[/tex] and [tex]b_{0}[/tex] are my central galactic co-ordinates and r is my radius (in degrees). Since the angular separation is divided by r, d is the data point's distance from the centre expressed in units of the radius, so it's dimensionless - if d is less than 1, the point is within my radius.
Although I don't strictly need to, I'd like to know where the equation comes from! I can see that the right-hand side is a rearrangement of the equation of a circle, but I don't see how the distance comes into it. Any help is appreciated!
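For reference, here's a quick Python sketch of how I'm applying the check (the centre, radius, and test points are just placeholder numbers, not my real project values):

```python
import math

# Placeholder central galactic co-ordinates and radius, all in degrees.
l0, b0 = 120.0, -5.0   # central longitude l0 and latitude b0
r = 2.5                # circle radius

def normalised_distance(l, b):
    """d = sqrt((l - l0)^2 + (b - b0)^2) / r; d < 1 means inside the circle."""
    return math.sqrt((l - l0) ** 2 + (b - b0) ** 2) / r

# Example data points (longitude, latitude) in degrees.
points = [(121.0, -4.0), (125.0, -5.0)]

# Keep only the points that fall inside the circle.
inside = [(l, b) for (l, b) in points if normalised_distance(l, b) < 1]
```

The first point is about 1.41 degrees from the centre (d ≈ 0.57, inside), while the second is 5 degrees away (d = 2, outside).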