How Much Power Does a Cell Phone Emit During a Call?

  • Thread starter: amanno
  • Tags: Mobile, Power
AI Thread Summary
Cell phones transmit at roughly 100 mW to 1 W during calls, depending on their distance from the cell tower. To detect calls with a receiving antenna, the RF signal generally needs to be down-converted to baseband and amplified before monitoring. Because signal strength falls off rapidly with distance, close proximity of around three feet is ideal for reliable detection. At that range a simple diode detector can pick up the signal, though care is needed to avoid false triggers from other transmissions such as the phone's autonomous registration chirps.
amanno
Hey guys,

Does anyone know roughly how much transmission power a cell phone outputs during a call? I've read it's around 100 mW (depending on distance from the tower); has anyone verified this?

I want to use the signal created by a phone call to detect when a call is being made. I have a receiving antenna, and I'm guessing I would need to amplify the RF signal before continuing, correct?

I am assuming that a signal transmitted at 100 mW would be very weak (µW or nW) by the time it reaches my receiving antenna?

Thanks
 
Yes, a phone will adjust its output power between about 100 mW and 1 W depending on the range to the cell site.

There are going to be some complications. Your cell phone may exchange data with the local cell site even when you are not making calls. Your cell phone may transmit on up to 4 different bands.

Unless the phone is very close, to monitor the activity you will need to use a mixer to down-convert the bands being used to a baseband, then amplify those signals with a receiver chip that generates the usual logarithmic “received signal strength indicator” signal (RSSI). You can then detect when the RSSI exceeds a specified DC threshold that you have set. That will tell you when a nearby phone is transmitting. If you want to avoid autonomous chirps triggering your detector you will need to use a time gate to wait for a minimum time with a continuous signal detected before triggering the output.
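The threshold-plus-time-gate logic described above can be sketched in a few lines. This is only an illustration of the decision logic, not driver code for any particular receiver chip; the function name, threshold, and timing values are all hypothetical assumptions.

```python
def detect_call(rssi_samples, threshold_dbm=-40.0, sample_period_s=0.01,
                min_dwell_s=0.5):
    """Return True once RSSI stays above threshold for min_dwell_s.

    The dwell requirement is the 'time gate': short autonomous
    registration chirps do not hold the signal long enough to trigger,
    while a sustained call transmission does.
    """
    needed = int(min_dwell_s / sample_period_s)  # consecutive samples required
    run = 0
    for rssi in rssi_samples:
        run = run + 1 if rssi >= threshold_dbm else 0
        if run >= needed:
            return True
    return False

# Simulated RSSI traces at 10 ms per sample:
chirp = [-90.0] * 10 + [-30.0] * 20 + [-90.0] * 10   # 0.2 s burst (ignored)
call  = [-90.0] * 10 + [-30.0] * 100 + [-90.0] * 10  # 1 s burst (detected)
print(detect_call(chirp))  # False
print(detect_call(call))   # True
```

In hardware the same idea is usually a comparator on the RSSI line followed by a retriggerable one-shot or a small counter in a microcontroller.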

To specify signal levels we need to know the maximum distance between the cell phone of interest and your detector's antenna. What is that range?
 
Very interesting idea, thank you.

I wasn't planning on the detector's antenna being far from the phone itself, a couple of feet maybe (3 ft). Unless you think it can go farther?

For my own knowledge: why do you need to down-convert the bands (let's assume I am only going to use the 1900 MHz band)?
 
If the distance to the phone is significant, many other signals will exceed the phone's signal at your detector's antenna, so amplification alone is not an option without some form of pre-selection.

Down conversion is needed to use RSSI.

At three feet I would expect over a milliwatt, so you should be able to use a simple diode detector like those in an "RF sniffer" or a "bug detector". Take a look at the more advanced examples here: http://dx.com/s/bug+detector
 