My question is about the relative eye safety of infrared (##\lambda \gtrsim 1400\ \text{nm}##) and ultraviolet (##\lambda \lesssim 300\ \text{nm}##) lasers.
Both of these wavelengths are strongly absorbed by the water content of the cornea and the other pre-retinal media, so they barely penetrate to the retina, and one would therefore expect similar biological effects. However, according to this, the Maximum Permissible Exposure (##\text{MPE}##) is significantly larger for infrared than for UV. Why is that?
From a physics standpoint, absorption in the UV is due to electronic transitions, while in the infrared it is due to vibrational states. However, spectrophotometric measurements give roughly the same absorptance in both regions. So why does a UV beam of equal power cause more adverse biological effects?
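To make the comparison concrete, the picture I have in mind is simple Beer–Lambert attenuation of the beam in the pre-retinal media (just a rough sketch; ##\mu_a## is the bulk absorption coefficient of the tissue, and I'm not quoting specific values for it):

$$I(z) = I_0\, e^{-\mu_a z}, \qquad \delta \equiv \frac{1}{\mu_a}$$

If ##\mu_a## is large at both ##\lambda \gtrsim 1400\ \text{nm}## and ##\lambda \lesssim 300\ \text{nm}##, the penetration depth ##\delta## is short in both cases, so essentially all of the beam energy should be deposited in a thin pre-retinal layer rather than reaching the retina.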
Any explanation would be greatly appreciated.