Hey everyone. I'm beginning my exciting yet nerve-wracking road to a BS in computer science. I'm attending a university in NJ and hope to do well here. I love communications, networking, and all that jazz.

Anyhow, in my studying of computers and how they work, I've run into one question that's been bugging me: why are computer speeds (CPU, buses, and RAM) all measured in hertz? To me, hertz describes something that's 'waving', oscillating at some regular rate. For example, you can measure radio waves and gamma rays in hertz, because something there is actually oscillating. But what's moving in a CPU that would allow it to be measured in hertz?

When people refer to a CPU as having a 3.2 GHz speed, it means the processor is running at 3.2 billion clock cycles per second. But that just raises more questions: what is cycling? And how long does one cycle last?

The speed aspect of this is a bit confusing to me and I'd appreciate your discussion! I hope to hang out here on these forums and gather as much information as possible, so excuse the elementary questions.
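To make the numbers a bit more concrete for myself, I tried a quick back-of-the-envelope calculation. This is just arithmetic on the clock frequency (the period of a cycle is the reciprocal of the frequency), not anything CPU-specific, so correct me if I'm misunderstanding:

```python
# Example clock frequency: 3.2 GHz = 3.2 billion cycles per second.
freq_hz = 3.2e9

# If the clock "ticks" freq_hz times per second, then one cycle
# lasts 1/freq_hz seconds.
period_s = 1 / freq_hz
period_ns = period_s * 1e9  # convert seconds to nanoseconds

print(f"One clock cycle lasts {period_ns:.4f} ns")  # prints 0.3125 ns
```

So if I have the idea right, each "cycle" at 3.2 GHz would last about a third of a nanosecond, which is what I'm trying to wrap my head around.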