What is the Significance of 32.768 kHz Crystals in Timekeeping?

In summary, computers use internal oscillators to keep track of time by counting oscillations. The original PC used a crystal with a frequency of 4.77272 MHz and an 8254 chip to divide the frequency. Modern computers integrate this logic and can also use other clock sources. Digital alarm clocks often use a 32.768 kHz crystal, which oscillates exactly 2^15 times per second and is often temperature compensated.
  • #1
wil3
Is there some process by which a computer is able to keep track of time in a linear order? I.e., how can a computer determine when a fixed time increment of arbitrary length has passed?

Please don't dismiss this until you've at least thought about it a little bit. Is there a specific fixed-frequency signal that serves as the basis of unit time for a computer? I was thinking it had something to do with the 60 Hz signal, but that seems too irregular to give the machine a definite time interval.

This question can be generalized to digital timekeeping systems in general. Given varying power and resistance under different conditions, what allows constant, steady timekeeping?
 
  • #2
Computers have internal, reliable 'oscillators' which have a known period; thus keeping track of the number of oscillations yields the time interval.
 
  • #3
Okay, that's what I was curious about: what sort of oscillator keeps time that accurately? An LC seems like it would have a little bit of R, and so the system would lose energy. Is the "oscillator" just a very high quality LC, such that time loss is negligible?
 
  • #4
Digital clocks commonly use the oscillations of a quartz crystal (see http://en.wikipedia.org/wiki/Quartz_clock).
 
  • #5
The original PC used a crystal with a frequency of 4.77272 MHz. It included an 8254 timer that divided this frequency by 4, so its channel 0 ran at 1.19318 MHz, or 838.0965 ns per cycle. This was used to drive the DRAM refresh rate, and it was also divided by 65536 to produce the 54.9255 ms tick, about 18.2 ticks per second in DOS mode. A PC also had a real-time clock that ran off a battery or capacitor to keep track of time while the computer was turned off.
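The divider chain described above can be checked with a few lines of arithmetic (the frequencies are the ones quoted in the post; the variable names are just for illustration):

```python
# A quick check of the PIT arithmetic described above.
crystal_hz = 4_772_720            # original PC crystal, 4.77272 MHz
pit_hz = crystal_hz / 4           # 8254 channel 0 input: 1.19318 MHz
cycle_ns = 1e9 / pit_hz           # nanoseconds per PIT cycle
tick_ms = 65536 / pit_hz * 1000   # milliseconds per 16-bit rollover
ticks_per_s = pit_hz / 65536      # DOS ticker rate

print(round(cycle_ns, 4))    # ~838.0965 ns per cycle
print(round(tick_ms, 4))     # ~54.9255 ms per tick
print(round(ticks_per_s, 4)) # ~18.2065 ticks per second
```

The 65536 divisor is just a free-running 16-bit counter rolling over, which is why the odd 18.2 Hz rate fell out of the design so cheaply.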

http://www.beaglesoft.com/mainfaqclock.htm

Modern computers have this logic integrated and/or emulated in their chipsets. Windows XP defaults to a 64 Hz tick rate instead of the classic 18.2 Hz ticker. I'm not sure about Vista or Windows 7. There are also other clock sources on a modern PC.
 
  • #6
If you have a digital alarm clock, its heart is probably also a Real Time Clock (RTC, though some are fancier than others). Because the 50/60 Hz mains frequency is so exacting over the long term, many older AC-powered (alarm) clocks actually used it as their time base (sometimes with, and sometimes without, a switch to select between the two frequencies).

32.768 kHz crystals are very popular for timekeeping applications (and many RTCs are designed to use them). Why 32.768 kHz? If you're keeping track of 'ticks' (i.e. the oscillations), there are exactly 2^15 of them in one second (0b1000000000000000 in binary, 0x8000 in hexadecimal), so a simple 15-bit binary counter overflows exactly once per second. That assumes you're using the crystal at its rated temperature, or that it's temperature compensated (many are temperature sensitive).
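The counter trick above can be sketched in a few lines: drive a 15-bit counter from the crystal and the wrap-around itself is the one-per-second event (this is a simulation for illustration, not real RTC firmware):

```python
# Sketch of why 32.768 kHz is convenient: a 15-bit binary counter driven by
# the crystal overflows exactly once per second, so the "seconds" signal is
# just the counter's wrap-around; no awkward division is needed.
CRYSTAL_HZ = 32_768                    # = 2**15 = 0x8000 ticks per second

counter = 0
seconds = 0
for _ in range(3 * CRYSTAL_HZ):        # simulate 3 seconds of crystal ticks
    counter = (counter + 1) & 0x7FFF   # 15-bit counter wraps at 32768
    if counter == 0:                   # wrap = exactly one second elapsed
        seconds += 1
print(seconds)  # 3
```

With a non-power-of-two frequency you would need an extra divider stage or comparison logic; with 2^15, the carry out of the counter's top bit is the clock.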
 

1. How do computers measure time?

Computers use a clock circuit built around a quartz crystal that vibrates at a specific frequency. These vibrations are converted into electrical pulses, and counting the pulses lets the computer measure elapsed time.

2. How accurate are computer clocks?

Computer clocks are reasonably accurate: a typical quartz-based clock drifts on the order of 1-2 seconds per day. This accuracy can degrade over time due to factors such as temperature changes and aging of the quartz crystal.
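A drift of a second or two per day corresponds to a frequency error in the tens of parts per million, which matches typical 32.768 kHz crystal tolerances. A minimal conversion, assuming the drift comes only from a constant frequency error:

```python
# Convert daily clock drift into crystal frequency error in ppm
# (parts per million), assuming a constant frequency offset.
SECONDS_PER_DAY = 86_400

def drift_ppm(seconds_per_day: float) -> float:
    """Frequency error, in ppm, that produces the given daily drift."""
    return seconds_per_day / SECONDS_PER_DAY * 1e6

print(round(drift_ppm(1.0), 2))  # 1 s/day is about 11.57 ppm
print(round(drift_ppm(2.0), 2))  # 2 s/day is about 23.15 ppm
```

Common watch crystals are specified around +/-20 ppm at room temperature, which is why the 1-2 s/day figure is a reasonable ballpark.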

3. How do computers adjust for time zone changes?

Most modern operating systems adjust the displayed time for time zones automatically. The clock is synchronized to Coordinated Universal Time (UTC) via a network time server, and the operating system's time-zone database supplies the correct local offset.
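This division of labor (the server hands out UTC, the local tz database converts it to wall-clock time) can be illustrated with Python's standard library; the fixed date and the America/New_York zone here are just example inputs:

```python
# Illustration: a time server provides UTC; the OS time-zone database
# (exposed in Python via zoneinfo) converts it to local wall-clock time.
from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # standard library since Python 3.9

# Pretend this instant came from a time server (servers speak UTC):
utc_now = datetime(2024, 7, 1, 12, 0, tzinfo=timezone.utc)

# The tz database knows New York observes UTC-4 (daylight time) in July:
local = utc_now.astimezone(ZoneInfo("America/New_York"))
print(local.isoformat())  # 2024-07-01T08:00:00-04:00
```

Note the server never needs to know the client's zone; only the conversion at the client changes when you travel or when daylight-saving rules shift.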

4. What is the difference between the system clock and the real-time clock?

The system clock is maintained by the operating system to keep track of time while the computer is running; it is initialized at boot (typically from the RTC) and can be set by the user. The real-time clock is a separate, battery-powered component that keeps track of time even when the computer is turned off.

5. How do computers handle leap seconds?

Leap seconds, which are added to Coordinated Universal Time (UTC) to keep it in sync with the Earth's rotation, are handled differently by different systems. Some systems add the extra second at the end of the day, while others smear it out over the course of a day. This can lead to slight, temporary discrepancies in time between different computers and systems.
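The "smear" approach mentioned above slows the clock by a tiny factor so the extra second accumulates gradually. A minimal sketch of the arithmetic, assuming a 24-hour smear window (the window length varies by operator):

```python
# Sketch of leap-second smearing: instead of inserting a discrete 61st
# second, the clock runs slightly slow so one extra second accumulates
# over the smear window.
SMEAR_WINDOW_S = 24 * 3600                     # smear over 24 hours

# Each real second is counted as slightly more than one smeared second:
rate = (SMEAR_WINDOW_S + 1) / SMEAR_WINDOW_S   # ~1.0000116

# Over the full window, exactly one extra second is absorbed:
absorbed = SMEAR_WINDOW_S * rate - SMEAR_WINDOW_S
print(round(absorbed, 9))  # 1.0
```

During the window, a smearing clock disagrees with a step-at-midnight clock by up to half a second, which is exactly the cross-system discrepancy the answer above describes.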
