An interesting related question is whether you can measure the temperature of a cold object without being in equilibrium with it. I believe this can be achieved by observing the flow of radiant energy from a warm object that is absorbed by the cold object, i.e. the "cooling effect" of the cold object. (This is the reverse of the usual procedure for a hot object out of equilibrium, where one measures the radiation the object emits; that is routinely done with out-of-equilibrium warm objects, including in work I have been involved in in the past.) In simple terms, a surface pointing towards a cold object drops in temperature, because there is a net flow of radiant energy from that surface towards the cold object. This drop in temperature can, at least in principle, be measured and the temperature of the cold object inferred. (Note that a lens warmer than the cold object would wreck the measurement, so the cold object needs to fill most of the field of view for this to have a chance of working, unless you can cool the lens sufficiently.)
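As a rough illustration of the size of this effect, here is a minimal sketch using the Stefan-Boltzmann law in a grey-body approximation. The emissivity and temperature values are illustrative assumptions, not a real measurement design; the point is only how small the difference between facing a 3 K object and a 0 K object is for a room-temperature detector surface.

```python
# Net radiant power per unit area flowing from a warm surface towards a cold
# object that fills its field of view (Stefan-Boltzmann, grey-body approximation).
# All numbers here are illustrative assumptions, not real instrument values.

SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def net_radiative_flux(t_warm, t_cold, emissivity=0.95):
    """Net flux from a surface at t_warm [K] towards a cold object at t_cold [K]."""
    return emissivity * SIGMA * (t_warm**4 - t_cold**4)

# A 300 K surface facing a 3 K object versus a 0 K object (classical black hole):
flux_cold = net_radiative_flux(300.0, 3.0)
flux_zero = net_radiative_flux(300.0, 0.0)
print(f"Net flux towards 3 K object: {flux_cold:.6f} W/m^2")
print(f"Net flux towards 0 K object: {flux_zero:.6f} W/m^2")
print(f"Difference:                  {flux_zero - flux_cold:.3e} W/m^2")
```

The difference is only a few microwatts per square metre, which is why cooling the detector (next paragraph) matters so much.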
[EDIT: thinking a little more about this, it is clear the correct approach would be to actively cool the detector surface as much as possible, which increases the signal-to-noise ratio. Detector cooling is already a standard technique in astronomy for observations at longer wavelengths.]
As a result, I would respectfully take the position that the classical black hole model does have a temperature, and that temperature is absolute zero. If a classical black hole could exist, this is the temperature that would be measured in isolation by a version of the above procedure. Given that the Hawking temperature of a one-solar-mass black hole is about 0.00000006 K (6 × 10⁻⁸ K), a very delicate experiment would be needed to detect the difference, even if you could find a black hole in perfect isolation. Such an experiment might sound impractical, but so did observing gravitational waves to Einstein.
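For reference, the figure quoted above follows from the standard Hawking temperature formula $T_H = \hbar c^3 / (8\pi G M k_B)$. A quick check, using standard physical constants and the nominal solar mass:

```python
# Quick check of the Hawking temperature of a one-solar-mass black hole,
# T_H = hbar * c^3 / (8 * pi * G * M * k_B).
import math

HBAR = 1.054571817e-34   # reduced Planck constant, J s
C = 2.99792458e8         # speed of light, m/s
G = 6.67430e-11          # gravitational constant, m^3 kg^-1 s^-2
K_B = 1.380649e-23       # Boltzmann constant, J/K
M_SUN = 1.989e30         # nominal solar mass, kg

t_hawking = HBAR * C**3 / (8 * math.pi * G * M_SUN * K_B)
print(f"Hawking temperature of a 1 solar mass black hole: {t_hawking:.2e} K")
# ~6.2e-8 K, i.e. roughly 0.00000006 K as quoted above.
```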