This problem is from Spacetime Physics by Taylor
In one second, some desktop computers can carry out one million instructions in sequence. Assume that carrying out one instruction requires transmission of data from the memory to the processor and transmission of the result back to the memory for storage.
a) What is the maximum average distance between memory and processor in a "one-megaflop" computer? Is this maximum distance increased or decreased if the signal travels through conductors at one half the speed of light in a vacuum?
1 megaflop = 10^6 instructions / sec, so the time per instruction is 1 sec / 10^6 instructions = 10^-6 s
c = 3 × 10^8 m/s
The Attempt at a Solution
For the first question asked;
With all that is given, I was not sure how to go about the problem so I just did some dimensional analysis to get a unit of meters.
c × (1 sec / 10^6 instructions) ≈ 300 m/instruction of light travel time.
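For reference, here is the arithmetic above as a quick Python check (just the numbers I used, nothing more):

```python
c = 3e8        # speed of light in m/s (rounded)
t = 1 / 1e6    # time available per instruction: 1 microsecond

# distance light can travel during one instruction time
distance = c * t
print(distance)  # 300.0 (meters)
```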
This turns out to be incorrect; my TA said that I must take into account that the signal makes a round trip.
My guess would be to multiply by 2, but don't the units of m/instruction already account for the round trip (since it is "per instruction")?
Very little is given, so I am almost positive that the calculations should not be mathematically complicated.
Can someone tell me what is wrong with my logic?