My application is a pulsed radio transmitter. Ideally I would like to use 5 μs pulses spaced about 4 seconds apart. Each pulse would represent 1 bit, so my bit rate (and my symbol rate) would be around 0.25 bps. Based on signal theory alone, 0.25 bps should allow a minimum bandwidth of 0.25 Hz for double-sideband AM, or as narrow as 0.125 Hz for single sideband or more efficient methods. I would be using either no modulation or pulse position modulation.

The problem is that a 5 μs pulse has an instantaneous bandwidth of 200 kHz, and this reduces my range so much that the system is just not viable. My understanding is that the loss of range comes from the receiver having to listen to a much wider range of frequencies; the extra noise swamps the weak signal from my transmitter.

Because of this I cannot just go out and buy the transmitter. The maximum pulse length in an appropriate commercial device is 5 μs, while according to my link budget calculations I would need something like 500 ms, and no one makes anything like that. I would have to design and build it myself, which I figure will take me at least 5-10 years. So I am not too happy about it.

Nevertheless, in a grasping-at-straws kind of way, I am wondering if there is any conceivable method for getting around the short-pulse bandwidth problem from the receiver side. The recent thread on amplitude modulation has got me wondering. Short pulses create a kind of OOK modulation. In the other thread it was pointed out that if a carrier is amplitude modulated with a simple sine wave, the sidebands will be virtually monochromatic: all the spectral energy from the transmitter would be concentrated in just 3 frequencies. With raised-cosine or Gaussian pulse shaping I would hope to get close to this theoretical ideal of a sinusoidal baseband signal.
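To put a rough number on the noise penalty I'm describing, here is a back-of-the-envelope sketch, assuming a standard 290 K noise temperature and taking the noise bandwidth as roughly the reciprocal of the pulse length (so 5 μs → ~200 kHz and 500 ms → ~2 Hz):

```python
import math

k = 1.380649e-23          # Boltzmann constant, J/K
T = 290.0                 # standard reference noise temperature, K

def noise_floor_dbm(bandwidth_hz):
    """Thermal noise power kTB, expressed in dBm."""
    p_watts = k * T * bandwidth_hz
    return 10 * math.log10(p_watts / 1e-3)

wide = noise_floor_dbm(200e3)   # ~1/5us pulse bandwidth
narrow = noise_floor_dbm(2.0)   # ~1/500ms pulse bandwidth
print(f"5 us pulse noise floor:   {wide:.1f} dBm")
print(f"500 ms pulse noise floor: {narrow:.1f} dBm")
print(f"range penalty: {wide - narrow:.1f} dB")  # 10*log10(200e3/2) = 50 dB
```

So by this crude estimate the short pulse costs about 50 dB of SNR relative to the 500 ms pulse my link budget says I need, which is why the system stops being viable.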
Considering the simpler case without pulse position modulation, I have to wonder whether it might be possible to design a receiver that only listens on those 3 frequencies and thus neutralizes the range penalty from receiver noise. Why should the receiver have to listen to all that empty spectrum between the sidebands and the carrier? It would seem to just add noise. Adding PPM, as I would like to do, may complicate things, of course: by increasing the bandwidth it would presumably add more sideband frequencies that the receiver has to detect.

So my question is whether there is any chance of this sort of scheme working. And if not, is it the kind of thing that is just impossible and will always be impossible? That is, what do you think the chances are of us finding a way around this limitation in the next century or millennium? Do you think we will still have this problem with short pulse transmissions in 2112 or 3012?
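For what it's worth, the "3 spectral lines" claim from the other thread is easy to check numerically. This sketch (the sample rate, carrier, and modulating frequencies are just placeholder values for illustration) builds a sine-modulated AM carrier and finds the only bins with non-negligible energy: the carrier and the two sidebands at fc ± fm.

```python
import numpy as np

fs = 10000   # sample rate, Hz (illustrative value)
fc = 1000    # carrier frequency, Hz
fm = 10      # modulating sine frequency, Hz
m = 0.5      # modulation index

# One full second of samples -> FFT bins spaced exactly 1 Hz apart,
# and fc, fm land exactly on bins (no spectral leakage).
t = np.arange(fs) / fs
signal = (1 + m * np.sin(2 * np.pi * fm * t)) * np.cos(2 * np.pi * fc * t)

spectrum = np.abs(np.fft.rfft(signal)) / len(t)
peaks = np.flatnonzero(spectrum > 1e-6)
print(peaks)  # -> [990 1000 1010]: lower sideband, carrier, upper sideband
```

All the transmit energy really does sit on those three lines, so a receiver with three very narrow filters would in principle capture all of it; my question is whether that actually dodges the noise-bandwidth penalty of the short pulse.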