
Bit rate, symbol rate and sampling in LTE

  1. Jul 8, 2011 #1
  Hi guys, I have a problem understanding LTE. LTE (Long Term Evolution) is the 4G technology. The LTE requirement is 100 Mbps, with channel bandwidths of 1.4, 3, 5, 10, 15 and 20 MHz corresponding to 128, 256, 512, 1024, 1536 and 2048 subcarriers (FFT sizes).

    Each bandwidth corresponds to a particular number of subcarriers.

    My problem is that it has three modulation schemes: QPSK, 16-QAM and 64-QAM. How can all three carry the same bit rate, since each has a different number of bits per symbol?

    How does this bit rate relate to the bandwidth and the number of subcarriers? It's very complicated.

    Take ordinary on-off keying: if the bit rate is 2 Mbps, then the symbol rate is 2 Msym/s, because it carries 1 bit per symbol. But QPSK, 16-QAM and 64-QAM carry 2, 4 and 6 bits per symbol. How can they all carry the same bit rate?
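    To make my confusion concrete, here is a small sketch of what I understand so far (in Python for now, just to show the arithmetic): at a fixed bit rate, the required symbol rate scales down as the modulation packs more bits into each symbol.

    ```python
    # For a fixed bit rate, the required symbol rate drops as the
    # modulation carries more bits per symbol: R_bit = R_sym * k.
    bits_per_symbol = {"OOK": 1, "QPSK": 2, "16-QAM": 4, "64-QAM": 6}

    target_bit_rate = 2e6  # 2 Mbps, as in the OOK example above

    for scheme, k in bits_per_symbol.items():
        symbol_rate = target_bit_rate / k  # symbols per second
        print(f"{scheme}: {symbol_rate / 1e6:.3f} Msym/s")
    ```

    So the same bit rate is possible either with a faster symbol rate and fewer bits per symbol, or a slower symbol rate and more bits per symbol. Is that the right way to think about it?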

    How does the sampling frequency correspond to the bit rate?

    If I model this system in Matlab, how can I include the bit rate? What factor or formula do I need in order to define it?
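    Here is my rough attempt at the arithmetic (in Python rather than Matlab, but the formulas carry over), assuming a 20 MHz channel, normal cyclic prefix, a single antenna, and no coding overhead, so the numbers are only a ballpark:

    ```python
    # Rough LTE downlink peak-rate arithmetic (20 MHz channel, normal
    # cyclic prefix, one antenna, no coding overhead -- all simplifying
    # assumptions, not the full standard).
    subcarrier_spacing = 15e3    # Hz, fixed in LTE
    fft_size = 2048              # for the 20 MHz channel
    used_subcarriers = 1200      # 100 resource blocks x 12 subcarriers
    symbols_per_subframe = 14    # OFDM symbols per 1 ms subframe (normal CP)
    bits_per_symbol = 6          # 64-QAM

    # The sampling rate follows from the FFT size, not from the bit rate:
    fs = subcarrier_spacing * fft_size  # 30.72 MHz

    # Each subcarrier carries 14 symbols every millisecond:
    symbol_rate_per_subcarrier = symbols_per_subframe / 1e-3  # sym/s

    raw_bit_rate = used_subcarriers * symbol_rate_per_subcarrier * bits_per_symbol
    print(f"fs = {fs / 1e6:.2f} MHz")
    print(f"raw bit rate = {raw_bit_rate / 1e6:.1f} Mbps")
    ```

    With 64-QAM this lands just above the 100 Mbps requirement, which is presumably where that figure comes from, but I'm not sure I have the symbol-rate part right.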

    Please someone explain this. Thank you very much!
  3. Jul 9, 2011 #2
    I'm not an expert on this, but from a quick look at a few online resources, it seems much of the LTE-Advanced standard is about gluing the existing CDMA, TDMA and GSM standards together as various physical layers of an ISO-style stack, for legacy reasons (like LAN over coax vs. twisted pair, etc.). By standardizing the layers above this, existing legacy infrastructure can be integrated into a uniform standard at the higher layers. So basically the different physical layers will have different capacity per MHz of spectrum.

    The actual prior LTE physical layer with picocells seems to have "tunable" bandwidth performance that accepts different channel sizes. AFAICT it's all OFDMA as a modulation scheme for picocells, but it uses existing 3G cells coupled with picocells to give spatial "tuning" as well. Basically a picocell gets used in spatial areas of congestion, and the original 3G base stations simply "know" not to set up with anything within picocell range, based on out-of-band wire-line communication and probably on MIMO direction finding/triangulation.

    This is only based on 5 minutes of looking at these docs.


