I have a fisheye lens that sees 180 degrees of the sky and records the entire hemisphere on a 640x480 grid of pixels. Assume I know where the zenith and north lie on the plate; that lets me find the alt/az of every pixel in the circular image. The fisheye is linear in zenith distance (an equidistant projection): angular distance from the zenith is proportional to a pixel's radial distance from the image center, out to the edge of the image. How do I compute the RA/Dec of each pixel in the image? I know the time and date the image was taken to within 1 second.
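For concreteness, here is a sketch of the mapping I have in mind, using assumed calibration values (sky-circle center and radius, north at the top of the frame, azimuth running north through east) and the standard spherical-astronomy alt/az-to-equatorial formulas. The pixel geometry constants and the image orientation are hypothetical and would need to be calibrated for a real plate; the local sidereal time would come from the recorded date/time and the observer's longitude.

```python
import math

# Assumed (hypothetical) image geometry: 640x480 frame with the
# sky circle centered at (CX, CY), radius R pixels = 90 deg of
# zenith distance, per the linear (equidistant) fisheye model.
CX, CY, R = 320.0, 240.0, 240.0

def pixel_to_altaz(x, y):
    """Pixel -> (alt, az) in degrees. Zenith distance is
    proportional to radial pixel distance (linear fisheye)."""
    dx, dy = x - CX, y - CY
    r = math.hypot(dx, dy)
    if r > R:
        return None                      # outside the sky circle
    alt = 90.0 - 90.0 * r / R            # zenith distance -> altitude
    # Assumes north is at the top of the image (y increases downward)
    # and azimuth is measured from north through east; a real camera
    # may be rotated or mirrored, which must be calibrated.
    az = math.degrees(math.atan2(dx, -dy)) % 360.0
    return alt, az

def altaz_to_radec(alt_deg, az_deg, lat_deg, lst_hours):
    """Standard conversion: alt/az + observer latitude + local
    sidereal time -> (RA in hours, Dec in degrees)."""
    alt, az, lat = map(math.radians, (alt_deg, az_deg, lat_deg))
    sin_dec = (math.sin(alt) * math.sin(lat)
               + math.cos(alt) * math.cos(lat) * math.cos(az))
    dec = math.asin(sin_dec)
    # Hour angle from the cosine rule; clamp against rounding error,
    # then resolve the quadrant with sin(az) (east of meridian -> H < 0).
    cos_h = ((math.sin(alt) - math.sin(lat) * sin_dec)
             / (math.cos(lat) * math.cos(dec)))
    h = math.acos(max(-1.0, min(1.0, cos_h)))
    if math.sin(az) > 0.0:
        h = -h
    ra_hours = (lst_hours - math.degrees(h) / 15.0) % 24.0
    return ra_hours, math.degrees(dec)
```

As a sanity check, the center pixel should map to the zenith (alt = 90), whose declination equals the observer's latitude and whose RA equals the local sidereal time.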