Chief Ray Angle and Field of View

  1. Mar 11, 2017 #1
    Hi all!
I was studying the datasheet of a CMOS sensor when I bumped into a graph which defined the CRA (Chief Ray Angle) of a lens as a function of the FoV (Field of View).

    cdn.sparkfun.com/datasheets/Dev/RaspberryPi/ov5647_full.pdf page 134

I understand what the CRA is, and I have a general, though not precise, idea of the FoV; so I haven't yet understood the precise difference between the two.

    https://ocw.mit.edu/courses/mechani...ndows-single-lens-camera/MIT2_71S09_lec06.pdf slide 8

According to the presentation above, their concept of FoV is analogous to what I understand the CRA to be: the maximum angular aperture from the center of the lens. What does the graph mean, then? It seems logical to me that the two quantities should be linearly proportional, as at the beginning of the graph; but I cannot understand how the CRA can saturate (and even decrease) while the FoV increases.

    Moreover, what does image height mean in the table below the graph?

    Thank you very much indeed
     
    Last edited by a moderator: Mar 16, 2017
  3. Mar 16, 2017 #2
    Thanks for the thread! This is an automated courtesy bump. Sorry you aren't generating responses at the moment. Do you have any further information, come to any new conclusions or is it possible to reword the post? The more details the better.
     
  4. Mar 16, 2017 #3
    I haven't read all of the first reference you linked to, but as it's the datasheet for a sensor and not an optical system, what is it they are showing the FOV of?

    I agree with your understanding of the FOV being defined by the chief ray angle, so normally we would expect a linear relation between the CRA and the FOV when the latter is expressed as an angle. But the legend on the plot appears to be the fraction of the full FOV, maybe expressed as a distance. So maybe there's a tangent involved which would make the relationship nonlinear. But why would a sensor have a FOV?
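To make the "tangent" idea concrete, here is a rough sketch (all numbers assumed, and an ideal thin lens with the stop at the lens, so the image-side chief ray angle at image height h is simply atan(h/f); none of this comes from the OV5647 datasheet):

```python
import math

f = 3.6        # assumed focal length in mm
h_max = 2.0    # assumed maximum image height (half-diagonal of the sensor) in mm

for frac in [0.2, 0.4, 0.6, 0.8, 1.0]:        # fraction of the full image height
    h = frac * h_max
    cra = math.degrees(math.atan(h / f))      # image-side chief ray angle at this height
    print(f"{frac*100:5.0f}% image height -> CRA = {cra:4.1f} deg")
```

With this simple geometry the angle already grows a bit more slowly than the image-height fraction (it is an arctangent), though the much stronger flattening and eventual decrease in the datasheet plot presumably comes from the real lens design rather than from this effect alone.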
     
  5. Mar 16, 2017 #4

Drakkith (Staff: Mentor)

    The "Key Specifications" sections includes two bullet points for lens size and lens chief ray angle, so I assume it either includes a lens assembly or is designs for a specific one?

Hmm... perhaps they're talking about the CRA after the light passes through an optical system? If so, I'm not sure how you'd relate the two without knowing the system details.

The image height is just the radial size of the image formed at the focal plane. You can think of it as the radius of the full image formed by the optical system. It can also represent the size of the image of a single object: if you're imaging, say, a tree, the height of the tree's image is just its radial size at the focal plane.
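Just to put a (made-up) number on that: for a distant object, the image height is approximately the focal length times the object's angular size.

```python
import math

# Rough illustration with assumed values (not taken from any datasheet):
f = 0.004            # focal length in metres (4 mm, assumed)
tree_height = 10.0   # object height in metres
distance = 100.0     # object distance in metres

angle = math.atan(tree_height / distance)   # angular size of the tree as seen from the lens
image_height = f * math.tan(angle)          # ~= f * tree_height / distance for a distant object
print(f"angular size {math.degrees(angle):.1f} deg -> image height {image_height*1e3:.2f} mm")
```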
     
  6. Mar 17, 2017 #5

Andy Resnick (Science Advisor, Education Advisor)

Yikes - those PDF slides 5-11 are 'yuck'. Let's start from the beginning: when analyzing an optical system with ray tracing, you need at least two rays. For convenience, the two rays commonly used are the chief ray and the marginal ray. These rays are defined by what they do at the aperture stop: the marginal ray passes through the edge of the aperture stop, while the chief ray passes through the center of the aperture stop.

    The entrance and exit pupils are projections of the aperture stop into object and image space, thus the chief and marginal rays behave the same at those pupil planes.

The field stop, on the other hand, is generally unrelated to the aperture stop. Certainly, there are imaging methods where they have specific relationships (Kohler imaging, for example), but in general the field stop is an independent element. It is true that the field stop (and also the size of the image plane) defines the angular field of view.
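If it helps to see these definitions with numbers, here is a minimal paraxial (ABCD-matrix) sketch of the two rays through a single thin lens whose aperture stop sits at the lens; every number is arbitrary, and this is only an illustration of the textbook definitions, not of any real camera module:

```python
import numpy as np

def transfer(d):
    """Free-space propagation over a distance d."""
    return np.array([[1.0, d], [0.0, 1.0]])

def thin_lens(f):
    """Refraction at a thin lens of focal length f."""
    return np.array([[1.0, 0.0], [-1.0 / f, 1.0]])

f, d_obj = 50.0, 200.0                     # focal length and object distance (mm)
d_img = 1.0 / (1.0 / f - 1.0 / d_obj)      # thin-lens equation gives the image distance
stop_radius, y_obj = 5.0, 10.0             # aperture-stop radius and object height (mm)

# A ray is the column vector [height, angle]; object plane -> lens (with stop) -> image plane.
system = transfer(d_img) @ thin_lens(f) @ transfer(d_obj)

# Marginal ray: leaves the axial object point and grazes the edge of the stop.
marginal = np.array([0.0, stop_radius / d_obj])
# Chief ray: leaves the edge of the field and passes through the center of the stop.
chief = np.array([y_obj, -y_obj / d_obj])

print("marginal ray at image plane:", system @ marginal)  # height returns to ~0
print("chief ray at image plane:   ", system @ chief)     # height = image height (-y_obj*d_img/d_obj)
```

The marginal ray comes back to zero height at the image plane (it started from the axial object point), while the chief ray's height there is exactly the image height, which is why the chief ray is the natural one to use when talking about field of view and image height.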

    How's that so far?
     
    Last edited: Apr 7, 2017
  7. Apr 7, 2017 #6
Sorry for not answering you sooner, but after some days I thought the thread was dead and forgot about this. My fault.

    It's a CMOS sensor plus a focusing lens in front of it.


More info about the lens is here: http://www.truetex.com/raspberrypi

These are my qualitative conclusions: the CRA in the datasheet is image (sensor) side, while the FOV is object side. As the CRA increases, the FOV increases linearly with it, but because of lens aberrations and the large slope of the rays hitting the photosites, the image appears distorted at the edges, and thus the CRA saturates as it reaches 30 degrees.
     
  8. Apr 7, 2017 #7

Drakkith (Staff: Mentor)

After talking with one of the seniors here at my school, I think it is possible for the chief ray angle in image space to begin to decrease as the FOV gets large without it being an aberration. But we're just students, so we can't be sure.
     
  9. Apr 11, 2017 #8
    He said that it is possible, but did he give any reasons?
     
  10. Apr 11, 2017 #9
    Wouldn't any deviation from the paraxial image location be an aberration by definition?
     
  11. Apr 11, 2017 #10

Drakkith (Staff: Mentor)

He didn't really give any reasons, sorry.

I think it would be, but I'm not sure whether this results in a deviation from the paraxial image location or not.
     
  12. Apr 12, 2017 #11
CRA is the term used primarily for image sensors that have a microlens array on them (to eliminate the gaps between the physical pixels). Typically, the pitch of the lenslets changes so that the lenslets near the edges of the sensor are shifted closer to the center of the sensor (the pitch decreases).
Field of view is the term describing the system containing the imaging lens and the sensor. On a very basic level, it is the full angle (in object space) spanned by the imaging array.
If there were no lenslet array, then the edge pixels would define the FOV (just draw the two extreme chief rays - the rays passing through the centers of the pupils and hitting the centers of the edge pixels - and find the angle between them). With the lenslet array the situation changes, but only very slightly.
If the imaging lens were designed exactly for the sensor specifications, then the FOV would be directly connected to the CRA (FOV = 2*CRA). In real life that is not really possible, and the relation is approximate.
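To put some rough numbers on the lenslet-shift idea, here is a sketch with entirely assumed values (the exit-pupil distance and microlens stack height below are not from the OV5647 datasheet): at image height h the chief ray arrives at an angle of about atan(h/L), where L is the exit-pupil distance, and a microlens sitting a small height t above the photodiode needs a lateral shift of roughly t*tan(CRA) to keep that ray centred on the photodiode.

```python
import math

exit_pupil_distance = 3.0      # mm, exit pupil to image plane (assumed)
stack_height = 0.003           # mm, microlens vertex to photodiode (~3 um, assumed)

for h in [0.0, 0.5, 1.0, 1.5, 2.0]:                # image height in mm
    cra = math.atan(h / exit_pupil_distance)       # chief ray angle at this image height
    shift = stack_height * math.tan(cra)           # lateral microlens shift toward the center
    print(f"h = {h:.1f} mm  CRA = {math.degrees(cra):4.1f} deg  shift = {shift*1e3:.2f} um")
```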
     
  13. Apr 12, 2017 #12
Instead of talking about FOV and CRA, let's talk about an object and an image, the aforementioned angles being related to these through the tan function. I believe that in paraxial optics any increase in the object height results in a corresponding increase in the image height, through the system's magnification. I can't think of a situation in paraxial optics where, if you keep increasing the object height, the image height will increase up to a point and then start decreasing. Thus, if the image height does start to decrease, it is deviating from the expected paraxial result and this would be attributed to aberrations.
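A quick paraxial sanity check of that, with arbitrary numbers:

```python
# In paraxial optics the image height is just magnification * object height,
# so it grows linearly with the object height and never levels off.
f, d_obj = 50.0, 200.0                    # focal length and object distance (mm)
d_img = 1.0 / (1.0 / f - 1.0 / d_obj)     # thin-lens equation
m = -d_img / d_obj                        # transverse magnification
for y in [5.0, 10.0, 20.0, 40.0]:         # object heights (mm)
    print(f"object {y:5.1f} mm -> image {m * y:7.2f} mm")
```

Any levelling-off or turnaround of the image height would therefore be a departure from this linear paraxial prediction, i.e. an aberration such as distortion.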
     
  14. Apr 12, 2017 #13

Drakkith (Staff: Mentor)

    Interesting. I guess that would explain why ray info is included in a sensor data sheet.
     
  15. Apr 12, 2017 #14
    Do you have a reference for this? My understanding is that the lenslets are centered on each receptor in the sensor and those receptors are regularly spaced.
     
  16. Apr 15, 2017 #15
Thank you for your contribution. Anyway, I'd like to echo @pixel's question above.

Also, I thought that photosites (I'd rather use this term than pixels) are equally spaced (see page 25 of the datasheet in the first post). Why would they be different at the edges?

P.S. I'm not a native English speaker. By the pitch of the lenslets, do you mean their inclination relative to the normal?
     
  17. Apr 26, 2017 #16
Could you please suggest any book/article/report that discusses the image height-CRA-FOV relation in a photographic environment (where there is a sensor after the lens)? I have to write a report and I need "official" references.
Moreover, I would kindly ask @Greg123 if he could provide some references for what he said; I would be very grateful.

Unfortunately, my supervising professor teaches photonics; he has optics knowledge, but it is aimed towards optical communications, not photographic optics.

Thank you very much indeed, and sorry, but this thing is not trivial at all :/
     
  18. Apr 26, 2017 #17

Drakkith (Staff: Mentor)

Are you asking how the chief ray is related to the image height at the sensor (aka the image plane), and not in the context of microlens arrays? If so, then literally any decent book on geometrical optics should go into this in detail.

It's a little difficult to give you a reference because I have no idea how in-depth you want to go. Plenty of PDFs and slideshows can be found online that cover the absolute basics, but I have yet to find one that goes into detail and teaches you everything you need to know to be able to do all the calculations and such. If you don't need to learn this, then any of those should do just fine.
     
  19. Apr 27, 2017 #18
I'd need references related to sensors, or even better to lenslet arrays. I understand what the CRA and the FOV are, even though I had to piece the information together from several websites; nevertheless, it would be nice to have a good article or book that explains microlens arrays, which I could use as a reference in my work.

Let's say it's not fundamental for the work, but I'm curious, and it drives me mad that I cannot get to a precise quantitative explanation of the saturation (and even the decrease) of the CRA as the FoV increases. It seems that as the object-side angle increases, the sensor-side angle decreases or stays fixed. If I ever published scientific work, I would want to give at least a minimal explanation of what I show.
     
  20. Apr 27, 2017 #19
    Maybe try contacting OmniVision and asking them about it.
     
  21. Apr 27, 2017 #20

Drakkith (Staff: Mentor)

Usually you can just assume that the sensor is placed at the focal plane of the system, and the chief ray gives the image height at the focal plane (and thus at the sensor). If you want to get into more detail with regard to lenslets and how they affect things, then I can't help you, as that is well beyond my knowledge level at the moment. About the best info I could find is on page 8 of this PDF: http://www.onsemi.com/pub/Collateral/NOIP1SN025KA-D.PDF
     