I was thinking about a colloquium speaker I heard once, and I have a few unanswered questions. He was talking about the manufacturing process for the read/write head of a hard drive. Obviously, the goal was to make the part smaller and smaller to increase the possible information density on the disk. My question is...
What is the limiting factor for maximum information density on a magnetic disk? The size of the read/write head, the mechanical limitations of moving the arm, or the size of the actual magnetic dipole that encodes a bit on the disk? Or perhaps something I'm not even thinking of.
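Just so it's clear what scale I mean by "information density", here is my back-of-envelope sketch (the 25 nm bit cell is a made-up illustrative number, not a real drive spec):

```python
# Back-of-envelope: areal density for a hypothetical square bit cell.
# The 25 nm side length is an illustrative assumption, not a drive spec.
INCH_M = 0.0254        # meters per inch
bit_side_m = 25e-9     # assumed side of one bit cell: 25 nm

bits_per_sq_inch = (INCH_M / bit_side_m) ** 2
print(f"{bits_per_sq_inch:.2e} bits per square inch")
# -> about 1.03e+12, i.e. on the order of a terabit per square inch
```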
Secondly, I wonder about ports. At first there were serial ports. Then parallel ports came along, allowing faster information exchange. However, the newest ports (USB/FireWire) have returned to serial communication. Was there a sudden jump in the amount of information a single wire could carry that allowed this to happen?
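For rough scale, these are the per-data-line signalling rates I remember for each interface (approximate figures from memory, so take them with a grain of salt):

```python
# Rough per-data-wire rates as I recall them (approximate, from memory).
rates_bps = {
    "RS-232 serial (typical max)": 115_200,
    "USB 1.1 (full speed)":        12_000_000,
    "FireWire 400":                400_000_000,
    "USB 2.0 (high speed)":        480_000_000,
}
for name, bps in rates_bps.items():
    print(f"{name}: {bps / 1e6:,.3f} Mbit/s")
```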
Thirdly, I recall from my electronics class that several conventions existed for representing a bit; for example, 5 V was a 1, while 0 V was a 0. What is the current convention? I assume it is the minimum voltage that saves power but is still large enough to carry the bit reliably.
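To make "carry the bit reliably" concrete, here is how I understand noise margins, using the classic 5 V TTL thresholds as I remember them (so treat the exact values as assumptions):

```python
# Classic 5 V TTL thresholds (values as I remember them):
V_OH, V_OL = 2.4, 0.4   # guaranteed output high / low levels
V_IH, V_IL = 2.0, 0.8   # required input high / low thresholds

nm_high = V_OH - V_IH   # noise a logic 1 can tolerate
nm_low  = V_IL - V_OL   # noise a logic 0 can tolerate
print(f"high-level noise margin: {nm_high:.1f} V")  # 0.4 V
print(f"low-level noise margin:  {nm_low:.1f} V")   # 0.4 V
```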
Finally, what would be the theoretical and practical limit of information density in a single wire? Would assuming an ideal conductor change this limit?
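The kind of theoretical bound I have in mind is the Shannon-Hartley channel capacity, C = B log2(1 + S/N). A quick sketch with made-up bandwidth and SNR values, just to see the order of magnitude:

```python
from math import log2

# Shannon-Hartley capacity: C = B * log2(1 + S/N).
# Bandwidth and SNR below are illustrative assumptions, not measurements.
B_hz   = 1e9                    # assumed channel bandwidth: 1 GHz
snr_db = 30.0                   # assumed signal-to-noise ratio: 30 dB
snr    = 10 ** (snr_db / 10)    # convert dB to a linear ratio

C_bps = B_hz * log2(1 + snr)
print(f"capacity ~ {C_bps / 1e9:.2f} Gbit/s")  # ~9.97 Gbit/s
```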
I hope my random musings at least make sense. Thanks.