I wondered if people could clear up some misconceptions I have about radio waves.

1) If I have an 800 MHz wave, does this mean 800 Mbits per second can be transferred in theory? Could I not change the amplitude of this wave every 0.00000000125 seconds (one carrier cycle), so that whether the amplitude a receiver reads is above a relative threshold determines what bit it was? (There's a sketch of what I mean at the end of this post.) Why isn't this a feasible way to send information?

2) Let's build on the assumption in question 1. Does this mean I can have a wave that is 801 MHz and achieve the same result, but with an extra 1 Mbit per second of information, operating in the same space as the 800 MHz channel? If not, why not?

3) Are there theoretically an infinite number of frequencies? For example, in question 2 I added 1 MHz, but what if I added just 1 extra Hz, or 0.5 Hz, or an even smaller amount? Is there a point where I cannot distinguish between the two frequencies in the same space?

4) Can MIMO keep being made smaller to accommodate more bandwidth on the same frequencies?
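
To make question 1 concrete, here is a rough Python sketch of what I'm imagining: keying the carrier's amplitude once per cycle and recovering each bit with a threshold. The carrier frequency, amplitude levels, threshold, and oversampling factor are just numbers I picked for illustration, and the model is noiseless; it's only meant to pin down what I mean by "changing the amplitude every 0.00000000125 seconds."

```python
# Toy amplitude-keying model: one bit per carrier cycle at 800 MHz,
# recovered by checking whether the peak amplitude in each cycle
# exceeds a threshold. All values here are illustrative assumptions.
import numpy as np

CARRIER_HZ = 800e6               # 800 MHz carrier
BIT_PERIOD = 1 / CARRIER_HZ      # 1.25 ns, i.e. 0.00000000125 s per bit
SAMPLES_PER_BIT = 64             # oversampling so the waveform is resolved

def transmit(bits):
    """Scale the carrier's amplitude to 1.0 for a 1-bit and 0.2 for a 0-bit."""
    t = np.arange(len(bits) * SAMPLES_PER_BIT) * (BIT_PERIOD / SAMPLES_PER_BIT)
    amplitude = np.repeat(np.where(np.array(bits) == 1, 1.0, 0.2), SAMPLES_PER_BIT)
    return amplitude * np.sin(2 * np.pi * CARRIER_HZ * t)

def receive(signal, threshold=0.6):
    """Decide each bit by whether its cycle's peak amplitude beats the threshold."""
    chunks = signal.reshape(-1, SAMPLES_PER_BIT)
    return [1 if np.max(np.abs(chunk)) > threshold else 0 for chunk in chunks]

bits = [1, 0, 1, 1, 0, 0, 1, 0]
print(receive(transmit(bits)) == bits)   # True in this noiseless toy model
```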