Say I'm writing a program with an infinite loop, and I'm trying to write a file as large as the disk drive in the smallest possible time. What would be the best algorithm for this? Clearly, such an algorithm would consume as many system resources as it could, so it would be limited by the hardware rather than by algorithmic complexity; we need not concern ourselves with factorials or exponentials.

Concretely, the loop body would be something like fprintf(fp, "blahblah");, where "blahblah" is the output text. Say "blahblah" were a huge chunk of text, and the loop were a for loop that output it an unbounded number of times (it writes as it iterates, so the loop doesn't need to finish for data to reach the file).

The question is: how many MB/second would typically be written in this process? And would there be a maximum value, given the limited speed of disk reads and writes? I know it's correlated with CPU speed. I don't want to try it myself yet (to avoid stressing the hard disk), though it has probably been tried plenty of times by people who forgot to terminate their loops.

Anyway, is it conceivable that an entire hard disk could be filled in a matter of seconds? Given that it takes noticeable time just to copy system files around, I don't think so.