Ayame17
I'm currently working on my final year project, and one of the smaller tasks is to check whether certain data points fall within a circle of my own defined radius and central co-ordinates. I've been given the equation to use:
$$d = \frac{\sqrt{(l-l_0)^2 + (b-b_0)^2}}{r}$$
where $l_0$ and $b_0$ are my central galactic co-ordinates, $r$ is my radius (in degrees), and $d$ is the data point's distance from the central co-ordinates divided by the radius; if $d$ is less than 1, the point is within my radius.
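For reference, here is a minimal Python/NumPy sketch of how that test can be applied; the centre, radius and sample points below are just placeholder values, not from my actual data:

```python
import numpy as np

def in_circle(l, b, l0, b0, r):
    """Return True where a point (l, b) lies within radius r of (l0, b0).

    l, b, l0, b0 and r are all in degrees; l and b may be arrays of points.
    The test follows the given formula: d = sqrt((l-l0)^2 + (b-b0)^2) / r,
    and a point is inside the circle when d < 1.
    """
    d = np.sqrt((l - l0)**2 + (b - b0)**2) / r
    return d < 1

# Placeholder example: centre at (l0, b0) = (120.0, -5.0) with a 2-degree radius
l = np.array([119.5, 125.0])
b = np.array([-4.8, -5.2])
print(in_circle(l, b, 120.0, -5.0, 2.0))  # [ True False]
```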
Although I don't need to, I'd just like to know where the equation comes from! I can see that the right-hand side is a rearrangement of the equation of a circle, but I don't see how the distance comes into it. Any help is appreciated!