# Factors affecting mobile phone signals

1. Mar 9, 2004

### spill

I am currently in my first year of an International Baccalaureate Physics course (i.e. I'm 16). As part of this, I am doing a presentation on cellphones; not my decision, but to get to the point:

If anyone could refer me to some information detailing which factors affect cellphone signal strength, it would be much appreciated. Obviously, I know what they are, but finding detailed scientific information about, say, how important they are relative to one another is proving exceedingly difficult. I have contacted a number of network providers, only to receive apologetic letters saying they cannot go into detail due to the risk of revealing their secrets. Any help would be great, and an equation of some kind would be fantastic. Many thanks

2. Mar 11, 2004

### Deeviant

One easily applicable physics law is the inverse square law.

The electromagnetic radiation that constitutes a cellphone signal will decrease in intensity with the square of the distance from the source.

If you measure signal strength 1 mile from a cell tower, it should be 4 times weaker at 2 miles from the tower, assuming of course that there are no physical obstructions, which is another huge issue that affects cell reception.
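The inverse square relationship above can be sketched in a few lines of Python; the 20 W transmit power here is just a made-up illustrative value, not a real tower specification:

```python
def intensity(power_watts, distance_miles):
    """Intensity falls off as 1 / distance^2 (ignoring obstructions)."""
    return power_watts / distance_miles ** 2

# Hypothetical tower radiating 20 W, measured at 1 mile and 2 miles.
i1 = intensity(20.0, 1.0)
i2 = intensity(20.0, 2.0)
print(i1 / i2)  # doubling the distance makes the signal 4 times weaker
```

The same ratio holds whatever the transmit power, which is why the inverse square law is a useful first approximation even without knowing a provider's actual numbers.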

You would want to find the intensity of the cell tower transmitter, the minimum amount of signal the cell phone needs for an acceptable connection, and the density of cell towers in the region. Of course, the background radiation, i.e. noise (normally rated in dB), would have to be related to signal strength and the tolerance of the cellphone to interference.
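Since noise is usually quoted in decibels, here is a minimal sketch of how a signal-to-noise ratio is converted to dB; the signal and noise power values are hypothetical, chosen only to show the arithmetic:

```python
import math

def to_db(power_ratio):
    """Convert a power ratio to decibels: dB = 10 * log10(ratio)."""
    return 10 * math.log10(power_ratio)

signal_watts = 1e-9   # assumed received signal power
noise_watts = 1e-12   # assumed background noise power
print(to_db(signal_watts / noise_watts))  # signal-to-noise ratio in dB
```

A phone needs the signal-to-noise ratio to stay above some threshold set by its receiver design; below that, the connection degrades or drops, which is the "tolerance to interference" mentioned above.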