I went through a thought process on this issue, trying to figure out why we would ever want more than, say, 50 MB/sec (hard-drive speed)...
First, I'd start by thinking about what the end use of a computer is in basic terms (again, we're talking about household use). The purpose is to display a series of images at ~60 fps on the screen in response to inputs...sometimes accompanied by sound. Whether that's a game, a video clip, or just the response rate of a text editor, that's the data a computer outputs. The limit I see on usefulness is the data throughput required for that - roughly 5 MB/sec, Blu-ray data speed. Nothing a network can deliver faster than that can have an immediate impact on the user.
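As a rough sanity check on that 5 MB/sec figure (my own back-of-the-envelope numbers, assuming 1080p with 24-bit color - not measured values):

```python
# Compare raw, uncompressed 1080p60 frame data to Blu-ray's video bitrate cap.
width, height, bytes_per_pixel, fps = 1920, 1080, 3, 60

# Uncompressed pixel data per second, in MB/s.
raw_mb_per_s = width * height * bytes_per_pixel * fps / 1e6

# Blu-ray video tops out around 40 Mbit/s; convert to MB/s.
bluray_mb_per_s = 40e6 / 8 / 1e6

print(f"raw 1080p60: {raw_mb_per_s:.0f} MB/s")      # hundreds of MB/s
print(f"Blu-ray compressed: {bluray_mb_per_s:.0f} MB/s")
```

So the 5 MB/sec figure only works because of video compression; the raw pixel stream is nearly two orders of magnitude larger.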
Taking that further, right now companies are trying to bring the processing back to their servers, allowing only a remote UI, which protects the program and enables a pay-as-you-go model. With the above throughput, they could apply that model to virtually anything, including video games: you could play a graphically intense game without much of a processor, as long as you had a fast internet connection and the remote server provided the processing power, sending back only the rendered frames. Still no need for more than 5 MB/sec.
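To sketch what that budget means per frame for a remote-rendering setup (my arithmetic, assuming the full 5 MB/sec is spent on 60 fps of frames):

```python
# Per-frame data budget if a server streams 60 fps within 5 MB/s.
budget_mb_per_s = 5.0
fps = 60

per_frame_kb = budget_mb_per_s * 1000 / fps
print(f"~{per_frame_kb:.0f} KB per frame")
```

That's roughly the size of one well-compressed JPEG per frame, which is why server-side rendering plus a video stream is plausible at that throughput.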
Now there are a limited number of cases where people might want more data processed faster, such as video editing or, in my case, astrophotography image enhancement (this takes a surprising amount of resources, on par with video editing). I suppose it would be cool to edit a full-length Blu-ray-quality video and have it render in three seconds, sending it to a remote company and back. But I'm not sure companies are going to want to use their processing power that way...and we still come back to the hard-drive limitation of ~50 MB/sec.
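The hard-drive limit puts a floor on that round trip regardless of server power. A quick estimate (assuming a 25 GB single-layer Blu-ray-sized project and a sustained 50 MB/sec disk rate - my assumptions, not benchmarks):

```python
# How long just moving the data takes at hard-drive speed,
# ignoring the remote server's render time entirely.
project_mb = 25_000   # ~25 GB, single-layer Blu-ray capacity
disk_mb_per_s = 50    # sustained hard-drive throughput

one_way_s = project_mb / disk_mb_per_s
print(f"one-way transfer: {one_way_s:.0f} s (~{one_way_s / 60:.1f} min)")
```

So even with infinite remote processing power, the local disk alone makes a "three-second" round trip impossible: you'd spend minutes just reading and writing the file on each end.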