What is the difference between 3G, 4G, and LTE cellular network standards?

AI Thread Summary
3G, 4G, and LTE are distinct cellular network standards, with 3G being implemented through technologies like UMTS and CDMA2000, while 4G is primarily represented by LTE and LTE Advanced. The confusion often arises from marketing practices that use these terms interchangeably. LTE is a specific technology that meets the 4G standard, while UMTS is one of several technologies that fulfill the 3G standard. Understanding these distinctions is crucial for selecting devices based on desired network speeds and coverage. Clear definitions of these terms help consumers make informed choices about their mobile technology.
TDoan
I am very confused by the terms 3G, 4G, LTE, UMTS, WCDMA... Hope someone could shed some light on them. I think the complication arises when marketing people start to use these terms interchangeably?

Below is my interpretation. Is it correct? Would someone be kind enough to comment? Thank you in advance.

My understanding is that 1G, 2G, 3G, and 4G are fairly brief standards created by a standards body, specifying a required level of service performance.

Industry then tries to meet those standards (sometimes exceeding them) by creating technology standards to implement in their networks.

For example,
the 2G standard is implemented by GSM, PDC, IS-95, etc.
the 3G standard is implemented by UMTS, CDMA2000, HSDPA, HSPA+, etc.
and the 4G standard is implemented by LTE and LTE Advanced
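The generation-versus-technology relationship described above can be sketched as a small lookup table. This is just a toy illustration of the poster's interpretation, not an exhaustive or authoritative list of technologies:

```python
# Illustrative mapping from generation labels to example implementing
# technologies, following the examples given in the post.
GENERATION_TECH = {
    "2G": ["GSM", "PDC", "IS-95"],
    "3G": ["UMTS", "CDMA2000", "HSDPA", "HSPA+"],
    "4G": ["LTE", "LTE Advanced"],
}

def generation_of(tech):
    """Return which generation a given technology implements, or None."""
    for gen, techs in GENERATION_TECH.items():
        if tech in techs:
            return gen
    return None

print(generation_of("LTE"))   # 4G
print(generation_of("UMTS"))  # 3G
```

The point of the table is that the mapping is many-to-one: several different technologies can each satisfy the same generational standard.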
 
Here's an article on the 3G, 4G, and 5G generation/data-rate terminology:

http://www.whatsag.com/

You would use these terms to select a phone with the speed you want, while checking whether your areas are covered by that service.

and here's an article on the standards and technologies that provide those data rates:

http://www.forensicmag.com/article/2010/10/understanding-world-cellular-telephones-part-1

Cell Phone Technology and Access Schemes:

  • CDMA: Code Division Multiple Access. This access scheme breaks a designated frequency into code for customer usage, thereby making efficient use of the available bandwidth. Widely used in the United States, it is becoming more common in other countries. CDMA was originally a military communications technology that Qualcomm converted to civilian use. CDMA handsets normally do not contain a SIM card. Data is stored either in the handset’s memory or on an SD card (SD/Mini-SD/Micro-SD). Handsets use either a hexadecimal or decimal ESN to identify themselves to the network.
  • FDMA: Frequency Division Multiple Access. This access scheme divides the designated frequency band into separate usable channels.
  • GSM: Global System for Mobile Communications. GSM uses TDMA technology and SIM cards to track user information. GSM is probably the most common cell phone technology in world-wide use today. Handsets generally contain at least one SIM card, which usually contains encryption keys to provide user authentication. The handsets use a fifteen-digit IMEI and a fifteen-digit IMSI to identify the handset and the user to the network, respectively.
  • iDEN: Integrated Digitally Enhanced Network. This technology, referred to as push-to-talk, allows the handset to be used both as a cell phone and a two-way radio. The handsets use a small portion of the available frequency spectrum to communicate with the network and normally contain a SIM card.
  • TDMA: Time Division Multiple Access. This access scheme breaks the designated frequency into time slots to utilize the bandwidth more effectively. In combination with FDMA, they make up the entire GSM access scheme.
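To make the TDMA idea above concrete, here is a toy sketch (not a real protocol implementation) of how one frequency channel is broken into repeating time slots shared round-robin among users. The eight-slot frame size matches GSM; the user names are made up:

```python
# Toy TDMA illustration: one frequency channel, divided into time slots
# that are assigned to users in round-robin order.
SLOTS_PER_FRAME = 8  # GSM uses 8 time slots per TDMA frame

def slot_owner(slot_index, users):
    """Return which user transmits in a given slot of the frame."""
    return users[slot_index % len(users)]

users = ["A", "B", "C"]
frame = [slot_owner(i, users) for i in range(SLOTS_PER_FRAME)]
print(frame)  # ['A', 'B', 'C', 'A', 'B', 'C', 'A', 'B']
```

Each user transmits only during its own slots, so several users share the same frequency without interfering; combining this with FDMA's separate frequency channels gives the full GSM access scheme described above.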
 
Thanks for your reply. Sure, I get that aspect, but that still doesn't explain the terms 4G and LTE: are they interchangeable?
Or is LTE a set of technologies that implements the 4G standard?

Similarly with 3G and UMTS: is UMTS an implementation of the 3G standard? There are many ways to meet the 3G standard; is UMTS just one of many technology platforms that achieve it?
 