
Why and How are Computer speeds measured in Hertz?

  1. Aug 24, 2011 #1
    Hey everyone. I'm beginning my exciting yet very nerve-wracking road to a BS in computer science. I'm attending a university in NJ and hope to do well here. I love communications, networking, and all that jazz. Anyhow, in my studying of computers and how they work, I've run into one question that has been bothering me.

    Why are computer speeds (CPU, buses, and RAM) all measured in Hertz? To me, Hertz describes something that's 'waving' or repeating at some frequency. For example, you can measure radio and gamma waves in Hertz because something is oscillating in a periodic manner. But what's moving in a CPU that would allow it to be measured in Hertz? When people refer to a CPU as having a 3.2 GHz speed, it means the processor is running at 3.2 billion clock cycles per second. But that just raises the questions: what is cycling, and how long does a cycle last?

    The speed aspect of this is a bit confusing to me, and I'd appreciate your discussion! I hope to hang out here on these forums and gather as much information as possible, so excuse the elementary questions.
     
  3. Aug 24, 2011 #2

    rcgldr

    User Avatar
    Homework Helper

    That 3.2 GHz speed is the clock input to the CPU (there could be a slower external clock and a clock multiplier inside the CPU). The clock controls the timing of the gates inside the CPU, i.e. the rate at which the gates can change state, with some margin for transition and stabilizing time.

    The clock frequency is generated by a very accurate clock generator, such as a crystal oscillator, which then feeds a frequency multiplier that oscillates at some multiple of the clock generator's input. The multiplier on its own isn't that accurate, but it is "tuned" at startup (or installation) to sync up with the slower but very accurate clock generator, ending up with an accurate high-frequency clock output.
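
    If it helps to see the arithmetic, here's a small Python sketch of the base-clock-times-multiplier idea, which also answers "how long is a cycle" (the 100 MHz reference and 32x multiplier are made-up example numbers, not taken from any particular CPU):

    # Illustrative only: how an effective CPU clock comes from a slower
    # reference oscillator and a multiplier, and how long one cycle lasts.
    reference_clock_hz = 100e6   # assumed 100 MHz reference/base clock
    multiplier = 32              # assumed clock multiplier

    cpu_clock_hz = reference_clock_hz * multiplier   # 3.2e9 Hz = 3.2 GHz
    cycle_time_s = 1.0 / cpu_clock_hz                # duration of one clock cycle

    print(f"CPU clock: {cpu_clock_hz / 1e9:.1f} GHz")
    print(f"One cycle lasts {cycle_time_s * 1e9:.4f} ns")   # about 0.3125 ns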
     
    Last edited: Aug 24, 2011
  4. Aug 24, 2011 #3
    Hmm, I think I get it... So the clock sets the tempo for all the chips on the board? Would that mean the clock can beat at a faster speed than the CPU itself?

    So the clock sends a unifying tempo to the entire motherboard, telling everyone to beat at, let's say, 2.8 MHz... then the CPU multiplies that so it can move a bit faster?

    So each time the CPU and all the other chips send out and receive information, that counts as a cycle?
     
  5. Aug 24, 2011 #4

    DaveC426913

    User Avatar
    Gold Member

    Indeed. Overclocking is the technique of speeding up the clock rate of the CPU.

    Here's a primer: http://en.wikipedia.org/wiki/Overclocking

    Chips can operate a lot faster than their rated clock speed, but start to get flaky. Manufacturers set the clock rate low enough to ensure no failures.

    Your 3 GHz processor might be sped up to 4 or 5 GHz with no problem, but manufacturers can't guarantee this when they make them in bulk. Also, it will heat up a LOT more and live a shorter life.
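
    As a rough back-of-the-envelope illustration (Python, with invented numbers), the dynamic switching power of CMOS logic scales roughly as C * V^2 * f, which is why pushing the frequency up, usually along with the core voltage, makes the chip dissipate noticeably more heat:

    # Illustrative only: dynamic (switching) power P ~ C * V^2 * f.
    # The capacitance, voltages, and frequencies below are invented for the example.
    def dynamic_power(capacitance_f, voltage_v, frequency_hz):
        """Approximate dynamic power in watts: P = C * V^2 * f."""
        return capacitance_f * voltage_v ** 2 * frequency_hz

    stock = dynamic_power(2e-8, 1.20, 3.0e9)        # ~86 W at stock settings
    overclocked = dynamic_power(2e-8, 1.35, 4.0e9)  # ~146 W when overclocked

    print(f"Stock:       ~{stock:.0f} W")
    print(f"Overclocked: ~{overclocked:.0f} W ({overclocked / stock:.2f}x the heat)")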
     



