# Nanoscale Laser Heating

I am doing some basic heat calculations for a CW-laser-illuminated tungsten tip (intended for photoemission). I started playing with the tip geometry to see whether I can get more flux (W/m^2) onto the tip without melting it. Solving the 1D heat equation for a cone, with a rod on the other end held at T = 300 K, I found:

$$dT = \frac{Q}{k\,A(z)}\,dz \quad\Rightarrow\quad \Delta T = T_{\mathrm{facet}} - T_{\mathrm{shaft}} = \int_{z_{\mathrm{facet}}}^{z_{\mathrm{shaft}}} \frac{Q}{k\,\pi z^{2}\tan^{2}\theta}\,dz$$

which comes out as

$$\Delta T = \frac{Q}{k\,\pi\tan^{2}\theta}\left(\frac{1}{z_{\mathrm{facet}}} - \frac{1}{z_{\mathrm{shaft}}}\right)$$

This assumes (wrongly, of course) that all the laser power is absorbed only on the facet of the tip.
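The integral above is easy to evaluate numerically. Here is a minimal sketch; the absorbed power, conductivity, and half-angle are illustrative assumptions, not values from the thread:

```python
import math

def cone_delta_T(Q, k, theta, z_facet, z_shaft):
    """Temperature rise across a cone carrying heat flow Q [W].

    Integrates dT = Q / (k * A(z)) dz with A(z) = pi * (z * tan(theta))**2,
    giving Delta T = Q / (pi * k * tan(theta)**2) * (1/z_facet - 1/z_shaft).
    """
    return Q / (math.pi * k * math.tan(theta) ** 2) * (1.0 / z_facet - 1.0 / z_shaft)

# Assumed numbers: ~80 uW absorbed on the facet (10^12 W/m^2 on a
# 10 nm diameter facet), bulk tungsten conductivity ~173 W/(m K),
# 5 degree half-angle, shaft anchor 1 mm back from the apex.
theta = math.radians(5.0)
z_facet = 5e-9 / math.tan(theta)   # z at which the cone radius is 5 nm
dT = cone_delta_T(Q=8e-5, k=173.0, theta=theta, z_facet=z_facet, z_shaft=1e-3)
print(f"Delta T ~ {dT:.0f} K")
```

With these assumed inputs the facet sits only a few hundred kelvin above the 300 K anchor, well below tungsten's melting point, which is consistent with the claim below that very high flux is tolerable on a very small facet.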

Now my calculations tell me that if I make the facet around 10 nm across, I can apply 10^12 W/m^2.
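It helps to see what that flux means in absolute terms. A quick check, assuming "10 nm facet" means a 10 nm diameter circular facet:

```python
import math

flux = 1e12            # W/m^2, the claimed tolerable intensity
facet_radius = 5e-9    # m, assuming a 10 nm diameter facet
area = math.pi * facet_radius ** 2
power = flux * area    # total absorbed power on the facet
print(f"facet area = {area:.2e} m^2, absorbed power = {power * 1e6:.0f} uW")
```

So the headline number corresponds to only tens of microwatts actually deposited on the facet, which is why the conduction path can carry it away.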

Is this possible? Am I missing something that would ablate my tip immediately?

I might be completely on the wrong track, but wouldn't it cause problems that the object you want to heat is so much smaller than the wavelength of the light you are using?

If it is visible light, we are talking about ca. 500 nm, and let's say 1000 nm if you use near infrared. So your 10 nm object is roughly 50-100 times smaller than the wavelength; have you considered this in your calculations?

I don't have anything for this in the calculations, but I am pretty sure we get quite a strong local electric-field enhancement, which is all good (as long as it doesn't blow up the tip). There might also be some plasmonic enhancement, I think (though I'm really not sure; I have read that tungsten does not show strong plasmon enhancement).

If there is an increase in the power density, that is not a problem in itself (I can just supply less power to compensate). I am more worried about the penetration depth of the photons into the tip, and what that means for the tip on such a small scale.
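The penetration-depth worry can be made quantitative with the usual optical skin-depth estimate. This sketch uses an assumed ballpark extinction coefficient for tungsten near 800 nm (roughly 2-3 in the literature), not a precise tabulated value:

```python
import math

# Optical skin depth: delta = lambda / (4 * pi * kappa), where kappa is the
# imaginary part of the complex refractive index.
wavelength = 800e-9   # m, assumed near-IR illumination
kappa = 2.7           # assumed ballpark value for tungsten near 800 nm
delta = wavelength / (4 * math.pi * kappa)
print(f"skin depth ~ {delta * 1e9:.0f} nm")
```

With these assumed numbers the skin depth comes out in the tens of nanometres, i.e. comparable to or larger than a 10 nm facet, so the absorption is volumetric on the scale of the apex rather than confined to the facet surface.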

Can you suggest any other things to consider?