Thank you for your replies, I finally got it to work. I ran the simulations with the rays coming from infinity instead of exiting from the object (right to left instead of left to right in the figure) and computed the density of intersection points with the object using a Gaussian kernel density estimate...
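For anyone trying to reproduce the density step, here is a minimal sketch of how it can be done with `scipy.stats.gaussian_kde`. The array name `hits` and the random placeholder data are illustrative only; in practice the points would come from the ray-tracing run.

```python
import numpy as np
from scipy.stats import gaussian_kde

# hits: shape (N, 2), each row the (y, z) intersection of one traced
# ray with the object plane (placeholder data for this example)
hits = np.random.normal(loc=0.0, scale=1.0, size=(5000, 2))

# gaussian_kde expects the data as shape (n_dims, n_points)
kde = gaussian_kde(hits.T)

# Evaluate the estimated density on a regular grid over the object plane
y = np.linspace(hits[:, 0].min(), hits[:, 0].max(), 200)
z = np.linspace(hits[:, 1].min(), hits[:, 1].max(), 200)
Y, Z = np.meshgrid(y, z)
density = kde(np.vstack([Y.ravel(), Z.ravel()])).reshape(Y.shape)
```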
The distortion will probably only change by a couple of percent, but for large fields of view (~90 degrees) the distortion is large and needs to be compensated by an algorithm. It would be optimal to have quantitative values for the out-of-focus distortion so the algorithm can take that into account.
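To make the "quantitative values" concrete, one common convention is relative distortion: the percentage deviation of the traced image height from the paraxial (linear) prediction at each field angle. A sketch under that assumption, with hypothetical field angles and traced heights standing in for the simulation output:

```python
import numpy as np

# Half-field angles of the incoming ray bundles, in degrees
theta_deg = np.linspace(1.0, 45.0, 45)
tan_t = np.tan(np.radians(theta_deg))

# h_real would come from the ray-tracing simulation; here a placeholder
# with mild pincushion distortion so the numbers are non-trivial
h_real = 24.0 * tan_t * (1.0 + 0.02 * tan_t**2)

# Paraxial (distortion-free) height is linear in tan(theta); estimate
# the effective focal length from the smallest field angle
f_eff = h_real[0] / tan_t[0]
h_paraxial = f_eff * tan_t

# Relative distortion in percent; positive = pincushion, negative = barrel
distortion_pct = 100.0 * (h_real - h_paraxial) / h_paraxial
```

A compensation algorithm could then interpolate `distortion_pct` over field angle (and over focus position, if the out-of-focus cases are tabulated the same way).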
I'm trying to simulate light rays going through an eyepiece to characterize its magnification as a function of the distance from the optical axis (image distortion). Attached is a figure showing the simulation console.
Some lenses in the eyepiece are movable to correct for myopia/hypermetropia...