Why do scopes BW-limit to 20MHz?

es1
While messing around with some measurements the other day, it dawned on me that pretty much every scope I've ever used [*] that has a bandwidth-limit setting has a minimum setting of 20MHz.

What I am wondering is, why 20MHz?

It seems like more than a coincidence that every vendor & model would pick 20MHz. I am sure there are scopes with other min settings in existence but this definitely seems like the most common choice.


* For me this means scopes built in 1990 & onward.
 
What do you mean by minimum setting? Almost all scopes built in the last 20 years go down to DC, and many have an upper limit above 20 MHz.
 
I mean the built in bandwidth limiting hardware settings. Often there are a couple in a drop down menu, like: 20MHz, 150MHz & Full.

It seems like the minimum setting is almost always 20MHz (at least it is for all the Agilent, Tek and LeCroy scopes I've seen lately). I was just wondering if there is a reason for 20MHz being the minimum, as opposed to 10MHz or something.

See for example this scope data sheet:
http://www.testunlimited.com/pdf/Tektronix_MSO_DPO3000.pdf
Page 11: Spec = Hardware Bandwidth Limits
 
20 MHz has been a standard bandwidth (and thus noise and interference) limiting filter in 'scopes for about 40 years or so. Although I don't know the exact reason why 20 MHz was chosen, I believe that it had something to do with either standards for power supply noise levels, or perhaps it was a reasonable speed for generic probing of early digital logic circuits. In any event, once the 20 MHz bandwidth was chosen, it became the de facto standard and, for consistency, was never changed from model to model, up to and including the scopes we make today.
Hope this helps,
Tektronix Technical Marketing Manager.
 
Cost vs. selling price vs. market expectations/budgets
 
Hi TekScopeGuru, and everyone,

I do have a related question for you. My understanding is that power-supply vendors (including VREG vendors) usually specify their output noise level over a 20MHz bandwidth. My question is: does this figure reflect the supply's DC ripple rather than its noise? Because that is what I see on the scope nearly all the time, on all types of supplies.

If one of these supplies feeds my processor or controller, the full-bandwidth noise of the supply output would be a big concern to me; therefore, the measurement in that case should use the scope's full bandwidth rather than the 20 MHz band limit. My current understanding is that a 20 MHz BW measurement captures the DC ripple, while a measurement at whatever full bandwidth your scope has captures the real AC noise. Am I correct here?

Can you explain a little bit more about this.

Thanks.
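The ripple-vs-noise distinction above can be illustrated numerically. The sketch below is only a toy model, not how any scope implements its limiter: the sample rate, switching frequency, and ripple/noise amplitudes are made-up numbers, and a single-pole RC low-pass stands in for the scope's 20 MHz bandwidth-limit filter. Low-frequency ripple passes through almost untouched, while broadband noise is largely stripped out, so the band-limited RMS reading ends up close to the ripple alone.

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 1e9                       # assumed 1 GS/s sample rate
t = np.arange(100_000) / fs

ripple = 0.010 * np.sin(2 * np.pi * 300e3 * t)   # 10 mV switching ripple at 300 kHz
hf_noise = 0.005 * rng.standard_normal(t.size)   # broadband noise, ~5 mV RMS
signal = ripple + hf_noise

def lowpass(x, fc, fs):
    """Single-pole RC low-pass: a crude stand-in for a scope's BW-limit filter."""
    w = 2 * np.pi * fc / fs
    alpha = w / (1 + w)        # EMA coefficient for cutoff fc at sample rate fs
    y = np.empty_like(x)
    acc = x[0]
    for i, v in enumerate(x):
        acc += alpha * (v - acc)
        y[i] = acc
    return y

limited = lowpass(signal, 20e6, fs)

rms_full = np.sqrt(np.mean(signal ** 2))
rms_limited = np.sqrt(np.mean(limited ** 2))
rms_ripple = np.sqrt(np.mean(ripple ** 2))

print(f"full-BW RMS:      {rms_full * 1e3:.2f} mV")
print(f"20 MHz-BW RMS:    {rms_limited * 1e3:.2f} mV")
print(f"ripple-only RMS:  {rms_ripple * 1e3:.2f} mV")
```

With these made-up numbers, the band-limited reading lands much closer to the ripple-only RMS than the full-bandwidth reading does, which matches the intuition that a 20 MHz-limited measurement is dominated by ripple while the full-bandwidth measurement also picks up the high-frequency noise.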
 