Why do scopes BW-limit to 20MHz?

  • Thread starter: es1
  • Tags: Limit
AI Thread Summary
The discussion centers on the common minimum bandwidth limit of 20MHz found in oscilloscopes, with participants questioning why this specific frequency is standard across various models and manufacturers. It is suggested that 20MHz has been a long-standing filter standard, likely linked to historical power supply noise levels and early digital logic circuit probing needs. The Tektronix Technical Marketing Manager notes that this choice has remained consistent for decades, influencing modern scopes. Additionally, there is a query about the implications of using a 20MHz bandwidth for measuring power supply noise versus the full bandwidth for assessing real AC noise. Overall, the conversation highlights the significance of the 20MHz limit in oscilloscope design and its relevance to measurement accuracy.
es1
While messing around with some measurements the other day, it dawned on me that pretty much every scope I've ever used [*] that has a bandwidth-limit setting has a minimum setting of 20MHz.

What I am wondering is, why 20MHz?

It seems like more than a coincidence that every vendor & model would pick 20MHz. I am sure there are scopes with other min settings in existence but this definitely seems like the most common choice.


* For me this means scopes built in 1990 & onward.
 
What do you mean by minimum setting? Almost all scopes built in the last 20 years go down to DC, and many have an upper limit above 20 MHz.
 
I mean the built-in hardware bandwidth-limiting settings. Often there are a couple in a drop-down menu, like: 20MHz, 150MHz & Full.

It seems like the minimum setting is almost always 20MHz (at least it is for all the Agilent, Tek, and LeCroy scopes I've seen lately). I was just wondering whether there is a reason for 20MHz being the minimum, as opposed to 10MHz or something.

See for example this scope data sheet:
http://www.testunlimited.com/pdf/Tektronix_MSO_DPO3000.pdf
Page 11: Spec = Hardware Bandwidth Limits
 
20 MHz has been a standard bandwidth (and thus noise and interference) limiting filter in 'scopes for about 40 years or so. Although I don’t know the exact reason why 20 MHz was chosen I believe that it had something to do with either standards for power supply noise levels or perhaps it was a reasonable speed for generic probing of early digital logic circuits. In any event, once the 20 MHz bandwidth was chosen it became the de facto standard and for consistency was never changed from model to model up to and including the scopes we make today.
Hope this helps,
Tektronix Technical Marketing Manager.
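To put a rough number on why the band limit matters for noise readings: for an approximately white noise floor, the RMS voltage grows with the square root of the measurement bandwidth, so a 20 MHz limit on a 500 MHz scope cuts the displayed noise by about a factor of five. A quick back-of-the-envelope sketch in Python (the 500 MHz full bandwidth and the 10 nV/√Hz noise density are just illustrative assumptions, and a brick-wall filter is assumed for simplicity):

```python
import math

def rms_noise(v_density, bandwidth_hz):
    """RMS voltage of white noise with spectral density v_density
    (V/sqrt(Hz)) observed through an ideal brick-wall bandwidth."""
    return v_density * math.sqrt(bandwidth_hz)

density = 10e-9      # assumed noise floor: 10 nV/sqrt(Hz)
full_bw = 500e6      # assumed full scope bandwidth
limited_bw = 20e6    # the 20 MHz hardware limit

v_full = rms_noise(density, full_bw)
v_lim = rms_noise(density, limited_bw)
print(f"full BW:   {v_full * 1e6:.1f} uV rms")
print(f"20 MHz BW: {v_lim * 1e6:.1f} uV rms")
print(f"ratio:     {v_full / v_lim:.2f}")  # sqrt(500/20) = 5
```

A real single-pole limit filter has an equivalent noise bandwidth of about π/2 times its −3 dB frequency, so the exact numbers shift a bit, but the square-root scaling is the point.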
 
Cost vs. selling price vs. market expectations/budgets
 
Hi TekScopeGuru, and everyone,

I have a related question for you. My understanding is that power supply vendors, including VREG vendors, usually specify their output noise level at 20MHz BW. My question is: doesn't this look more like supply DC ripple than noise? Because that is what I see on the scope every time, on every type of supply.

If one of these supplies goes to my processor or controller, the full-bandwidth noise of the supply output would be a big concern to me; the measurement in that case should therefore use the scope's full BW rather than the 20 MHz band limit. My current understanding is that the 20 MHz BW measurement captures DC ripple, while a measurement at the scope's full BW captures the real AC noise. Am I correct here?

Can you explain a little bit more about this?

Thanks.
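For what it's worth, the ripple-vs-noise distinction can be illustrated numerically: the switching-frequency ripple sits well inside 20 MHz and survives the band limit, while fast edge spikes are largely stripped out. Here is a rough simulation, with a discrete single-pole low-pass standing in for the scope's 20 MHz limit filter; all waveform values (10 mV of 1 MHz ripple, 50 mV nanosecond spikes) are made up for illustration:

```python
import math

fs = 1e9           # assumed 1 GS/s sample rate
n = 20000
f_ripple = 1e6     # assumed switching frequency
fc = 20e6          # scope's 20 MHz bandwidth-limit corner

# Synthetic supply output: 10 mV ripple plus a 50 mV ~1 ns spike per cycle
sig = []
for i in range(n):
    t = i / fs
    v = 0.010 * math.sin(2 * math.pi * f_ripple * t)
    if i % int(fs / f_ripple) == 0:   # narrow spike on each switching edge
        v += 0.050
    sig.append(v)

# Single-pole IIR low-pass, a crude stand-in for the hardware 20 MHz limit
rc = 1 / (2 * math.pi * fc)
alpha = (1 / fs) / (rc + 1 / fs)
filt, y = [], 0.0
for v in sig:
    y += alpha * (v - y)
    filt.append(y)

vpp_full = max(sig) - min(sig)
vpp_lim = max(filt) - min(filt)
print(f"full-BW Vpp:  {vpp_full * 1e3:.1f} mV")  # ripple + spikes
print(f"20 MHz Vpp:   {vpp_lim * 1e3:.1f} mV")   # mostly just ripple
```

The band-limited peak-to-peak reading collapses to roughly the ripple amplitude, while the full-bandwidth reading is dominated by the spikes, which matches the usual advice: use the 20 MHz limit to reproduce the vendor's ripple spec, and full bandwidth to see what your logic actually experiences.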
 