The biggest barrier to any concept of 'absolute time' is relativity. As soon as one clock moves relative to another, each will have a different measure of time, but neither will be more "correct" than the other.
Gravity also causes this disparity, since it changes the shape of the "straight line" that light follows invariantly in a set time. The speed of light is the arbiter of causality.
What we can do, though, is take an approximation across the entire Earth: all external influences (such as the motion of the Earth around the Sun, and the gravitational pull of the Sun, Moon etc.) are ignored under the assumption that they affect every clock on Earth equally. The only considerations taken into account, then, are a clock's distance from the centre of the Earth and its motion relative to that centre.
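To get a feel for how small those two remaining considerations are at Earth scales, here's a back-of-the-envelope sketch. The weak-field approximations used (gΔh/c² for altitude, v²/2c² for motion) and the example figures are assumptions of this illustration, not values from any timekeeping standard:

```python
# Back-of-the-envelope relativistic rate differences between two Earth clocks.
# Weak-field approximations: fractional rate shift ~ g*dh/c^2 (altitude)
# and ~ v^2/(2*c^2) (relative motion).

C = 299_792_458.0    # speed of light, m/s
G_SURFACE = 9.81     # approximate surface gravity, m/s^2 (assumed constant here)
SECONDS_PER_DAY = 86_400

def gravitational_rate_shift(height_m: float) -> float:
    """Fractional rate gain of a clock raised height_m above a reference clock."""
    return G_SURFACE * height_m / C**2

def velocity_rate_shift(speed_m_s: float) -> float:
    """Fractional rate loss of a clock moving at speed_m_s relative to a reference."""
    return speed_m_s**2 / (2 * C**2)

# A clock 1000 m up a mountain runs fast by roughly 9 ns per day...
print(gravitational_rate_shift(1000) * SECONDS_PER_DAY)
# ...while a clock on a 250 m/s airliner runs slow by a few tens of ns per day.
print(velocity_rate_shift(250) * SECONDS_PER_DAY)
```

Nanoseconds per day, in other words - which is exactly why these terms matter to atomic timekeeping but not to your kitchen clock.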
Organisations such as NIST maintain an incredibly accurate atomic clock system* which takes a huge number of factors into account, even introducing leap seconds every so many years to adjust for astronomical factors such as the length of the day gradually increasing (as Earth's angular momentum is slowly transferred away).
Other institutions around the world coordinate with this baseline, using their own position relative to the NIST source and the centre of the Earth's gravity, as well as adjusting for the relativistic separation, in order to provide a coordinated time around the globe.
This is known as UTC, and whilst based on earlier timezone models, it is (despite the most simplified common application) not solely restricted to a basic 15-degree quantisation of hourly offsets. If you're between two such offsets, the precise UTC offset for your locale can actually be calculated to the minute, second and fraction of a second - but for general purposes (such as coordinating trains, and the practicality of not having to change your watch every time you travel even a short distance) this is not used. The hourly quantum leaps of timezone offsets serve the purpose of ensuring coordination and easy translation of local time.
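That "calculable" offset follows directly from geometry: the Earth turns 360° in 24 hours, i.e. 15° per hour, or 4 minutes per degree of longitude. A sketch of the arithmetic (note that real civil timezones follow political boundaries, not this formula):

```python
# Precise local mean time offset from longitude.
# 360 degrees / 24 h = 15 degrees per hour = 4 minutes per degree.

MINUTES_PER_DEGREE = 4.0

def mean_solar_offset_minutes(longitude_deg: float) -> float:
    """Offset of local mean solar time from UTC, in minutes (east positive)."""
    return longitude_deg * MINUTES_PER_DEGREE

def nearest_hourly_offset(longitude_deg: float) -> int:
    """The hourly civil offset a timezone would typically round to."""
    return round(longitude_deg / 15.0)

# Halfway between two hourly timezone meridians (7.5 degrees east):
print(mean_solar_offset_minutes(7.5))   # 30.0 minutes ahead of UTC
print(nearest_hourly_offset(31.0))      # 2 (hours ahead of UTC)
```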
I think a huge aspect of the question of time accuracy and reliability has to be "why" - by which I mean, one can understand the importance of accurate timing when it is time that is used (as it can be measured more accurately) in signals to/from satellites to determine GPS location, so that car navigation can be accurate to the scale of a metre, which is clearly safer and more practical. But for everyday human usage (by which I mean where people are using a time measurement directly, not by proxy of a satellite system), almost every digital clock can provide reasonable accuracy to a hundredth of a second. This is sufficient for experimental timings and more than sufficient for races (the error margin in reaction time when pressing the stopwatch button far outweighs this level). So arguably, for simply "telling the time", accuracies of even a minute or two are more than sufficient for most general purposes.
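The GPS point is easy to quantify: position is derived from light travel time, so any timing error multiplied by c becomes a ranging error directly. A simplified sketch (real receivers solve for several satellites simultaneously, but the scaling is the same):

```python
# How timing accuracy maps to GPS ranging accuracy: error_distance = c * error_time.

C = 299_792_458.0  # speed of light, m/s

def ranging_error_m(timing_error_s: float) -> float:
    """Distance error produced by a given clock error in a light-travel-time fix."""
    return C * timing_error_s

# Metre-scale positioning needs nanosecond-scale timing:
print(ranging_error_m(1e-9))    # ~0.3 m per nanosecond of clock error
# A hundredth of a second - fine for a stopwatch - would be useless here:
print(ranging_error_m(0.01))    # ~3000 km
```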
The issue comes when clocks "lose" or "gain" time over (sorry) time. Even ignoring the nanoseconds of relativistic dilation and approximating the hours of offset between Cairo and Beijing: despite my clock saying it's 15:45, I end up late for my appointment. Therefore, I expect this is the really pertinent point of consideration wrt this thread - how to ensure the reliability of one's personal clock in synchronisation with the "established" local time.
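How quickly "losing time" makes you late is simple arithmetic. Taking a drift of around half a second per day - an assumed figure for illustration, in the rough ballpark of a cheap quartz clock - gives:

```python
# How long until an unsynchronised clock has drifted noticeably.

def days_until_drift(drift_s_per_day: float, tolerance_s: float) -> float:
    """Days before accumulated drift exceeds a given tolerance."""
    return tolerance_s / drift_s_per_day

# At an assumed 0.5 s/day of drift:
print(days_until_drift(0.5, 60))     # 120.0 days (~4 months) to be a minute out
print(days_until_drift(0.5, 300))    # 600.0 days to be five minutes out - late!
```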
Well of course, every clock is still unique. At home my kitchen clock (a cheap battery-operated thing) is always different from the one in the front room (an aesthetic pendulum analogue), and no doubt the neighbours have completely different times on all their clocks too. Who's right?
The closest answer, then, can only be the NIST clock, as synchronised to the local clock's position.
This synchronisation is key, and internet-connected devices will generally include such a synchronisation feature. However, not all such devices have accurate geolocation information - my desktop PC, for example, does not - so the timezone setting is used instead. And that's fine, because if my appointment is at a place within the same timezone but over 8 degrees of arc away, I wouldn't want my clock to be half an hour slower/faster than theirs.
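To put numbers on that: two places 8° of longitude apart within one timezone show identical clock time, even though their mean solar times differ by about half an hour. A sketch:

```python
# Clock time vs mean solar time for two places sharing a timezone.

MINUTES_PER_DEGREE = 4.0   # 360 degrees / 24 h

def solar_time_gap_minutes(longitude_a: float, longitude_b: float) -> float:
    """Difference in mean solar time between two longitudes, in minutes."""
    return abs(longitude_a - longitude_b) * MINUTES_PER_DEGREE

# Same timezone, 8 degrees apart: the clocks agree...
clock_difference_minutes = 0.0
# ...but the sun does not:
print(solar_time_gap_minutes(0.0, 8.0))   # 32.0 minutes of solar difference
```

Which is exactly the trade-off timezones make: everyone in the zone shares one clock, at the cost of the clock drifting from local solar time toward the zone's edges.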
If a clock does not have the facility for internet or other synchronisation with NIST or similar providers, then a more manual approach is needed. The speaking clock service exists for this purpose, though it is now a little archaic. Otherwise there is no direct comparison for accuracy, and the best approach is to synchronise with any other device that you can be confident HAS been synchronised with NIST. The alternative is synchronisation with any other clock, for which none can claim any "authority" on accuracy or correctness.
Given it's unlikely that one would ignore the gradual drift of a clock for so long that it became comparable to an hour's difference from other local clocks, for the most part the NIST synchronisation is required only to maintain the minutes. Timezones fix the hours, and if your clock includes a calendar, there's a good chance that asking anybody can reliably tell you the day (at least to within two on a weekend ;) ) and the month is pretty much assured!

*Atomic clocks are accurate in that the definition of a second is given exactly by the frequency of emission from caesium. That is, the clock measures time in the same way that the unit itself is defined. Although this doesn't make them any more "correct" than any other clock, they will always give the exact number of seconds of duration.
It should also be noted that what are often marketed as 'atomic clocks' are simply digital clocks that regularly (no pun, sorry) synchronise with an official provider which relays a signal from an actual atomic clock.
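On that footnote about the definition of the second: since 1967 the SI second has been defined as exactly 9,192,631,770 periods of the radiation from the caesium-133 hyperfine transition, so an atomic clock "tells time" by literally counting cycles:

```python
# The SI second, by definition: count caesium-133 hyperfine transition cycles.

CS133_HZ = 9_192_631_770   # exact, by the SI definition of the second

def seconds_from_cycles(cycles: int) -> float:
    """Duration measured as a count of caesium transition cycles."""
    return cycles / CS133_HZ

print(seconds_from_cycles(CS133_HZ))        # 1.0 -- one second, exactly
print(seconds_from_cycles(CS133_HZ * 60))   # 60.0 -- one minute
```

This is the sense in which the footnote says the clock measures time "in the same way the unit is defined": there is no calibration step between the physics and the unit.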