Johnson and Shot Noise: the Frequency Term

AI Thread Summary
The frequency term in Johnson and Shot noise formulas refers to the bandwidth of the detecting or analyzing system, which is crucial for accurately capturing signals. In the context of a photodiode, this bandwidth corresponds to the range of frequencies being detected. Even in DC circuits, frequency is relevant since DC signals can have variations over time, and the concept of frequency is relative to the measurement context. Low-frequency noise measurement requires longer sampling times to ensure accurate data, particularly for 1/f noise. Understanding these nuances is essential for effective noise measurement and analysis in various electronic systems.
Master J
I am a bit confused over the frequency term that appears in Johnson and Shot noise formulae. What is this term?

Is it bandwidth? Of what? In relation to, say, a photodiode, is it the range of frequencies being detected?

And how would this relate to, say, a DC circuit which has no frequency terms at all?
 
Strictly, it's the bandwidth of the detecting/analyzing system or circuit. This could also be the bandwidth of a desired signal, since you must use the same bandwidth to detect the signal correctly.
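To make the bandwidth term concrete, here is a minimal Python sketch (the 300 K, 1 kOhm and 1 mA values are illustrative, not from this thread) evaluating the usual RMS formulas v_n = sqrt(4·k·T·R·Δf) for Johnson noise and i_n = sqrt(2·q·I·Δf) for shot noise, where Δf is that detection bandwidth:

```python
from math import sqrt

K_B = 1.380649e-23     # Boltzmann constant, J/K
Q_E = 1.602176634e-19  # elementary charge, C

def johnson_noise_vrms(resistance_ohm, bandwidth_hz, temperature_k=300.0):
    """RMS Johnson (thermal) noise voltage: sqrt(4*k*T*R*df)."""
    return sqrt(4.0 * K_B * temperature_k * resistance_ohm * bandwidth_hz)

def shot_noise_irms(dc_current_a, bandwidth_hz):
    """RMS shot noise current: sqrt(2*q*I*df)."""
    return sqrt(2.0 * Q_E * dc_current_a * bandwidth_hz)

# Illustrative example: a 1 kOhm resistor and a 1 mA photodiode current,
# seen through two different system bandwidths. The noise grows as sqrt(df).
for bw in (10e3, 1e6):  # 10 kHz vs 1 MHz detection bandwidth
    print(f"df = {bw:>9.0f} Hz:  "
          f"Johnson = {johnson_noise_vrms(1e3, bw) * 1e6:.2f} uV rms,  "
          f"shot = {shot_noise_irms(1e-3, bw) * 1e9:.1f} nA rms")
```

Widening the bandwidth by a factor of 100 raises both noise figures by a factor of 10, which is why the Δf you plug in has to be the bandwidth of whatever is actually doing the detecting.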

There are cases where it's something a bit different, e.g. when you have an internal device feedback loop that can see the full noise bandwidth even when the external system has a lower bandwidth, but for most purposes it's the above.

In terms of DC, DC does have frequency. In the words of a microwave engineer I knew who was presented with a presentation about "DC testing": "Yes, but what frequency DC?" The guy presenting wasn't a microwave/RF guy, so he didn't understand the question. You see, for her, 30 MHz was DC. That's the cut-off of an HP 8510 network analyzer's "DC input" that she used. It's all relative.

And at some point, even in a "DC" circuit, you had to turn on the DC and then later turn off the DC, so you minimally have a transient AC component at a frequency of roughly 1/(time on). But even a DC source isn't perfectly constant - there is always some d/dt.
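As a rough numerical illustration of that point (the sample rate, switch-on time and record length below are made-up values, not from this thread), here is a short numpy sketch that switches a 1 V "DC" source on for t_on seconds and checks how much of the resulting AC spectrum sits below about 1/t_on:

```python
import numpy as np

fs, t_on, total = 1000.0, 2.0, 10.0     # sample rate (Hz), on-time (s), record (s)
t = np.arange(0.0, total, 1.0 / fs)
v = np.where((t >= 1.0) & (t < 1.0 + t_on), 1.0, 0.0)  # "DC" switched on, then off

power = np.abs(np.fft.rfft(v)) ** 2     # unnormalized power per frequency bin
freqs = np.fft.rfftfreq(len(v), 1.0 / fs)

f_corner = 1.0 / t_on                   # ~0.5 Hz for a 2 s on-time
ac = freqs > 0                          # exclude the pure-DC bin
frac = power[ac & (freqs <= f_corner)].sum() / power[ac].sum()
print(f"fraction of AC power below 1/t_on: {frac:.2f}")   # roughly 0.9
```

Most of the switching transient's AC content lands below roughly 1/(time on), which is the sense in which even a "DC" measurement has a nonzero bandwidth.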

When you deal with low frequency noise, DC is also relative. Consider that 0.1 Hz corresponds to a period of 10 seconds. So when you measure low frequency noise you have to sample (in this case) for say 100 seconds to get 5 points (Nyquist sampling). This is part of the "fun" of measuring noise. This "low frequency/long time" integration is especially common with 1/f noise measurement.
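Here is a small, hypothetical sketch of that bookkeeping (the 0.1 Hz target, the 10-period rule of thumb and the 1 kHz upper frequency are assumptions for illustration, not numbers from this thread): it ties the lowest frequency you want to resolve to the record length and to how many samples you end up collecting.

```python
def record_length_s(f_min_hz, periods=10):
    """Time you must record to capture `periods` full cycles of the lowest
    frequency of interest (spectral resolution is roughly 1/record length)."""
    return periods / f_min_hz

def num_samples(f_min_hz, f_max_hz, periods=10):
    """Samples collected if you sample at the Nyquist rate, 2*f_max."""
    return int(record_length_s(f_min_hz, periods) * 2 * f_max_hz)

# Illustrative example: resolve 0.1 Hz (a 10 s period) while also covering
# noise out to 1 kHz -- a long record at a comparatively high sample rate.
f_lo, f_hi = 0.1, 1e3
print(f"record for {record_length_s(f_lo):.0f} s, "
      f"collecting about {num_samples(f_lo, f_hi):,} samples")
```

The record length is set entirely by the lowest frequency you care about, which is why 1/f noise measurements turn into long integrations.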
 