# Relative eye safety of infrared and ultraviolet lasers

roam
My question is about the relative eye safety of infrared (##\lambda \gtrsim 1400\ \text{nm}##) and ultraviolet (##\lambda \lesssim 300\ \text{nm}##) lasers.

Both of these wavelengths are highly absorbed by the pre-retinal water content in the cornea, so they don't really penetrate into the retina. So one would expect similar biological effects. However, according to this figure, the Maximum Permissible Exposure (##\text{MPE}##) is significantly larger for infrared than it is for UV. Why is that?

From a physics standpoint, absorption in the UV is due to electronic transitions, while in the infrared it is due to vibrational states. However, when making spectrophotometric measurements, the absorptance in both regions is approximately the same. So why does a UV beam of equal power cause more adverse biological effects?

Any explanation would be greatly appreciated.

## Answers and Replies

Vanadium 50
Probably for the same reason UV can give you sunburn - higher energy per photon.
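For scale (my own numbers, just evaluating ##E = hc/\lambda## with ##hc \approx 1240\ \text{eV·nm}## at the band edges quoted in the question):

$$E_{300\,\text{nm}} \approx \frac{1240}{300} \approx 4.1\ \text{eV}, \qquad E_{1400\,\text{nm}} \approx \frac{1240}{1400} \approx 0.9\ \text{eV}.$$

A ~4 eV photon is comparable to typical chemical bond energies, so UV can do photochemical damage directly; a ~0.9 eV photon cannot, and has to do its damage thermally.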

Andy Resnick
Both of these wavelengths are highly absorbed by the pre-retinal water content in the cornea, so they don't really penetrate into the retina.

That is manifestly untrue. First, ultraviolet radiation is primarily absorbed by the lens, with some absorption occurring in the aqueous humor, prior to the lens. Infrared radiation is largely absorbed by the cornea, but I don't have any data for wavelengths longer than 2.5 microns.

Regarding the figure you linked to: I am highly suspicious of it, since it apparently does not agree with my copy of ANSI Z136.1-2007. I refer you to Tables 5a and 5b here:

http://www.ehs.ucsb.edu/files/docs/rs/lasersaftyman.pdf

which seems to be a direct copy of the corresponding table in the ANSI document.

In summary, I would not characterize the MPE for UV sources as 'less than' the MPE for infrared sources: there are too many confounding effects to account for (exposure duration, source size, type of damage, etc.).

roam
Hi Andy Resnick,

That is manifestly untrue. First, ultraviolet radiation is primarily absorbed by the lens, with some absorption occurring in the aqueous humor, prior to the lens. Infrared radiation is largely absorbed by the cornea, but I don't have any data for wavelengths longer than 2.5 microns.

My question is about UV-C (<300 nm). This source states that UV-A wavelengths are mostly absorbed in the lens of the eye, whereas UV-B and UV-C are absorbed by the cornea (just like IR).

Regarding the figure you linked to: I am highly suspicious of it, since it apparently does not agree with my copy of ANSI Z136.1-2007. I refer you to Tables 5a and 5b here:

http://www.ehs.ucsb.edu/files/docs/rs/lasersaftyman.pdf

The safety standard that you linked to gives the following formula for both UV-C (<300 nm) and IR:

$$\text{MPE}=0.56\,t^{0.25}\ \text{J/m}^{2}\quad\text{(durations 1 ms to 10 s)}. \tag{1}$$

This is considerably larger than the MPE given for visible and NIR.
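For concreteness, evaluating Eq. (1) as quoted (taking the units as given) at the two ends of that window:

$$\text{MPE}(1\ \text{ms}) = 0.56\times(0.001)^{0.25} \approx 0.1\ \text{J/m}^{2}, \qquad \text{MPE}(10\ \text{s}) = 0.56\times10^{0.25} \approx 1.0\ \text{J/m}^{2}.$$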

IR lasers (for instance, 1.4 to 2 μm) are sometimes called "eye-safe" because the damage threshold (and hence the MPE) at the absorption site is higher, and the type of damage caused is not as catastrophic (i.e. they are retina-safe).

So, is it possible to also classify UV-C wavelengths as "eye-safe" (in the same sense that we used the term for IR)?
Eqn. 1 definitely suggests that both UV-C and IR are considerably more "eye-safe" than visible and NIR. Isn't that right?

P.S. I am not sure whether UV-C and IR really have the same level of eye hazard. But the skin effects of UV-C are more severe than the simple burns caused by IR (probably due to the higher energy per photon, as Vanadium 50 pointed out); they include erythema (sunburn) and skin cancer.

So, is it possible to also classify UV-C wavelengths as "eye-safe" (in the same sense that we used the term for IR)?

Yes, it is possible to have Class 1 UV and IR sources; for example, a quadrupled Nd:YAG CW laser that outputs less than ##9.6 \times 10^{-9}\ \text{W}##, or a single-pulse CO2 source that outputs a 100 ns pulse with less than ##7.9 \times 10^{-5}\ \text{J}##.

Getting back to your original question regarding the relative thresholds of IR and UV: there are too many confounding effects to make a simple direct comparison, not just the methods by which damage thresholds are determined, but also the damage mechanisms (thermal and nonthermal) and timescales: burns are instantaneous, but cataracts and cancer are not.

roam
Yes, it is possible to have Class 1 UV and IR sources; for example, a quadrupled Nd:YAG CW laser that outputs less than ##9.6 \times 10^{-9}\ \text{W}##, or a single-pulse CO2 source that outputs a 100 ns pulse with less than ##7.9 \times 10^{-5}\ \text{J}##.

I was talking about high-powered lasers (class IV). A high-powered IR laser like Er:YAG (2.9 μm) is more "eye-safe" than an equally high-powered NIR laser such as Nd:YAG (1.06 μm). To be specific, I am only considering thermal effects from continuous-wave lasers (timescales: 1 ms to 10 s).

In my project, I am working with a 225 nm UV laser. For these timescales, it has the same MPE as an IR laser. So would it be possible to argue that, just like IR lasers, this UV laser is "safer" than a visible/NIR laser?

In other words, if a UV and IR laser have the same MPE, does that mean the two lasers have approximately the same level of eye safety/risk?

I was talking about high-powered lasers (class IV).

Class 4 sources are *never* eye-safe. I would not try to argue otherwise.

roam
Class 4 sources are *never* eye-safe. I would not try to argue otherwise.

I wasn't saying that they are eye-safe at all (that's why I used quotation marks). I was simply saying some class IV wavelengths are less dangerous than others. Having a higher MPE means you have a smaller hazard zone:

$$\text{Hazard Zone}=\left(\frac{1}{\text{Divergence}}\right)\left[\sqrt{\frac{4P}{\pi\,\text{MPE}}}-\text{Initial diameter}\right].$$
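As a sanity check with made-up numbers (mine, not from any standard): take ##P = 1\ \text{W}##, a divergence of 1 mrad, negligible initial diameter, and a 1 s exposure, so Eq. (1) as quoted gives ##\text{MPE} = 0.56\ \text{J/m}^2##, i.e. an irradiance limit of ##0.56\ \text{W/m}^2##:

$$\text{Hazard Zone}=\frac{1}{10^{-3}}\sqrt{\frac{4\times 1}{\pi\times 0.56}}\approx 1.5\ \text{km}.$$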

It appears that two lasers with different wavelengths (but identical irradiation parameters) will have the same degree of hazard associated with them if they have the same MPE. Isn't this true?

In particular, if a UV and IR laser have the same power and divergence, for a 1 s exposure time, can we argue that they both have the same level of hazard?

In particular, if a UV and IR laser have the same power and divergence, for a 1 s exposure time, can we argue that they both have the same level of hazard?

Not if the damage threshold is different for the two wavelengths.

roam
Not if the damage threshold is different for the two wavelengths.

But if both wavelengths have the same MPE for a given duration, doesn't that imply that the damage thresholds are about the same?

By the way, the tissue in both cases is the cornea where IR and UV-C get absorbed...

But if both wavelengths have the same MPE for a given duration, doesn't that imply that the damage thresholds are about the same?

By the way, the tissue in both cases is the cornea where IR and UV-C get absorbed...

We seem to be going around in circles at this point. What, exactly, are you trying to better understand?

Cutter Ketch
So, is it possible to also classify UV-C wavelengths as "eye-safe" (in the same sense that we used the term for IR)?
Eqn. 1 definitely suggests that both UV-C and IR are considerably more "eye-safe" than visible and NIR. Isn't that right?

This is unfortunate and dangerously misleading terminology, but you are correct. As a 30-year designer of laser-based electro-optical sensors for government customers, I can affirm that the entire community does refer to wavelengths with higher MPE as "eye-safe wavelengths", and that includes IR and UV. Realizing the error we are making and the danger we are causing, we try to say "eye-safer" wavelengths, but invariably back-slide. However, we never refer to a particular system as "eye-safe" unless it is genuinely Class 1. This makes for some interesting linguistic gymnastics when discussing plans for non-eye-safe lasers at eye-safer wavelengths. Anyhow, we would certainly all be better off if we would not use "eye-safe" as a label for a wavelength range. As the horrible terminology seems to be confusing this discussion, we should avoid it for the remainder.

Now, regarding your original question: from previous posts it seems like you have your answer. Your premise was that UV and IR had different MPEs, but I see in a subsequent post you found the standard actually gives the same MPE. I don't have the standard in front of me, but I'll take your word for it. Same absorption location (cornea), same MPE, so I presume the question is resolved, correct?

Now, going further afield, we should recall that the MPE is only approximately related to the damage threshold; the standard couldn't really specify every wavelength precisely. I am sure a very accurate plot of damage threshold vs. wavelength would have lots of structure from particular absorptions, and the MPE puts a rough envelope on that (with plenty of margin). As you all have been discussing, the biggest effect on the damage threshold is where the energy is absorbed. Because of focusing, the retina is obviously the worst case, but as others noted, there is also a difference between absorbing at the cornea and absorbing at the lens.

However, I am surprised there isn't more variation by wavelength. That the MPEs are the same for UV and IR indicates that the damage is primarily thermal: the original absorption mechanism doesn't seem to matter; the energy thermalizes and accumulates until the temperature leads to damage. However, I happen to know the mechanism does matter. I know of a particular wavelength which corresponds to a stretch mode of the amide group in proteins and which efficiently unzips tissue. You would think that that one wavelength alone would need its own special MPE, but it certainly isn't singled out. Also, there are bound to be other bad wavelengths. So I think the people who wrote the eye-safety standard didn't necessarily know about every absorption, and that is why you want margin!
