I am doing a small project on gamma-ray bursts (GRBs). I am looking at a data set of around a dozen bursts that are observed at an angular offset from their host galaxies (my hypothesis is that they occur in dwarf galaxies orbiting the host galaxy). Say the average observed separation of these GRBs from their hosts is 4 arcseconds. This will not be the true separation, because each offset can have a line-of-sight component that is unobservable. I am looking for a scale factor to convert between the observed and true separations.

At first I thought the observations would be scaled down by a factor equal to the average of sin x over a half cycle, which is 2/π. This corresponds to averaging over the angle the burst's offset makes with the plane perpendicular to the line of sight. Then I pictured the orbit as a circle somewhere on a sphere, and it seemed there are two independent rotation axes for that circle, each of which would reduce the observed separation, leading to a downsizing factor equal to the average of (sin x)^2 over a half cycle, which is 1/2. I think the first one is correct, but I have confused myself in the visualization. A convincing answer would be great.
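For what it's worth, here is a quick Monte Carlo sketch I could use to sanity-check the candidate factors. It assumes (and this is an assumption on my part) that the direction of each true offset is isotropically distributed in 3D, which is equivalent to the cosine of the angle to the line of sight being uniform on [-1, 1]. It then averages the sky-projected fraction of the true separation and compares it to my two guesses:

```python
import math
import random

random.seed(0)
N = 200_000

# Assumption: true offset directions are isotropic in 3D, so the cosine of
# the angle between the offset and the line of sight is uniform on [-1, 1].
total = 0.0
for _ in range(N):
    cos_los = random.uniform(-1.0, 1.0)
    # Only the component perpendicular to the line of sight is observed.
    total += math.sqrt(1.0 - cos_los**2)

mc_mean = total / N
print(f"Monte Carlo mean projection factor:          {mc_mean:.4f}")
print(f"Candidate 1: <sin x> over a half cycle, 2/pi = {2/math.pi:.4f}")
print(f"Candidate 2: <sin^2 x> over a half cycle, 1/2 = {0.5:.4f}")
```

If my isotropy assumption is right, the Monte Carlo mean is the factor by which the true separations are scaled down on average, and neither of my two candidates obviously has to match it, which is part of my confusion.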