Yes and no. The sensor/detector receives radiation from some solid angle in front of it, forming an incoming cone that is projected onto the sensor. As long as the 'base' of that cone is completely filled by the surface you want to measure, distance is irrelevant. This is just like how the Sun (or any other star) has the same brightness per unit of area no matter your distance from it. Moving further away doesn't change the brightness of each unit area; it just makes the Sun appear smaller, so there are fewer unit areas overall.
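Here's a minimal sketch of that cancellation, assuming a thermometer with a 12:1 distance-to-spot (D:S) ratio; that's a common kind of spec for IR thermometers, but the specific value here is just an illustrative assumption. The collected signal scales as spot area over distance squared, and the two exactly cancel:

```python
# Sketch: why distance doesn't matter while the target fills the spot.
# Assumes a 12:1 distance-to-spot (D:S) ratio -- an illustrative value,
# not any particular instrument's spec.

D_TO_S = 12.0  # distance-to-spot ratio (hypothetical)

def spot_diameter(distance_m: float) -> float:
    """Diameter of the measurement spot at a given distance."""
    return distance_m / D_TO_S

for d in (0.5, 1.0, 2.0):
    spot = spot_diameter(d)
    # Collected power ~ spot_area / distance^2. The 1/d^2 falloff is
    # exactly cancelled by the d^2 growth of the spot area, so the
    # relative signal is constant -- the radiance-invariance argument above.
    relative_signal = (spot ** 2) / (d ** 2)
    print(f"d = {d:4.1f} m  spot = {spot:5.3f} m  relative signal = {relative_signal:.4f}")
```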
However, if the base of your cone is larger than your surface, then you run into problems. For example, if you try to measure the temperature of a lit match head from 50 feet away, almost all of the cone falls on the background, and you will get a lower temperature than you should (assuming the background is cooler than the match head, of course).
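You can put rough numbers on this. The sketch below assumes an ideal broadband (Stefan-Boltzmann) detector and emissivity 1, so the results are illustrative rather than what a real band-limited instrument would report; it area-weights the radiances in the spot and inverts back to an apparent temperature:

```python
# Sketch of the under-reading when the target doesn't fill the spot.
# Assumes an ideal broadband detector and emissivity 1 -- real IR
# thermometers work over a limited band, so these numbers are illustrative.

SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W / (m^2 K^4)

def apparent_temperature(t_target_k, t_background_k, fill_fraction):
    """Temperature inferred from the area-weighted radiance in the spot."""
    radiance = (fill_fraction * SIGMA * t_target_k ** 4
                + (1.0 - fill_fraction) * SIGMA * t_background_k ** 4)
    return (radiance / SIGMA) ** 0.25

# A match head (~900 K, assumed) against a room-temperature background (~295 K):
for fill in (1.0, 0.5, 0.01):
    t = apparent_temperature(900.0, 295.0, fill)
    print(f"target fills {fill:5.0%} of the spot -> reads {t:6.1f} K")
```

With the target filling only 1% of the spot, the reading drops from 900 K to roughly 344 K, which is the match-head-at-50-feet situation in miniature.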
In other words, if your surface fills the field of view projected onto the detector, you're fine. If the surface is smaller than the field of view, the reading is a mix of target and background, and you will get a lower value than you should (for a cooler background).
That's my understanding at least.