Why do scopes BW limit to 20MHz?

  • Thread starter: es1
  • Tags: Limit

Discussion Overview

The discussion revolves around the common practice of setting a minimum bandwidth limit of 20MHz on oscilloscopes. Participants explore the reasons behind this standard, its historical context, and its implications for measuring power supply noise and ripple.

Discussion Character

  • Exploratory
  • Technical explanation
  • Debate/contested

Main Points Raised

  • One participant notes that many oscilloscopes have a minimum bandwidth limit setting of 20MHz and questions why this specific frequency is commonly chosen.
  • Another participant clarifies that modern scopes typically have a range of bandwidth settings, often including options below 20MHz.
  • A participant mentions that 20MHz has been a standard bandwidth limit for oscilloscopes for about 40 years, suggesting it may relate to historical standards for power supply noise or early digital logic circuit probing.
  • There is a mention of market factors such as cost and pricing expectations influencing the choice of bandwidth limits.
  • A participant raises a related question about the interpretation of measurements at 20MHz bandwidth, suggesting that it may reflect DC ripple rather than true AC noise, and seeks clarification on this distinction.

Areas of Agreement / Disagreement

Participants express differing views on the reasons for the 20MHz limit, with some attributing it to historical standards and others questioning its relevance to modern applications. The discussion remains unresolved regarding the implications of bandwidth limits on noise measurements.

Contextual Notes

Some assumptions about the relationship between bandwidth limits and measurement accuracy are not fully explored, and there is a lack of consensus on the significance of the 20MHz setting in contemporary contexts.

es1
While messing around with some measurements the other day, it dawned on me that pretty much every scope I've ever used [*] that has a bandwidth limit setting has 20MHz as its minimum setting.

What I am wondering is, why 20MHz?

It seems like more than a coincidence that every vendor & model would pick 20MHz. I am sure there are scopes with other min settings in existence but this definitely seems like the most common choice.


* For me this means scopes built in 1990 & onward.
 
What do you mean by minimum setting? Just about all scopes built in the last 20 years go down to DC, and many have an upper limit above 20 MHz.
 
I mean the built-in bandwidth-limiting hardware settings. Often there are a couple in a drop-down menu, like: 20MHz, 150MHz & Full.

It seems like the minimum setting is almost always 20MHz (at least it is for all the Agilent, Tek and LeCroy scopes I've seen lately). I was just wondering if there was a reason for 20MHz being the minimum, as opposed to 10MHz or something.

See for example this scope data sheet:
http://www.testunlimited.com/pdf/Tektronix_MSO_DPO3000.pdf
Page 11: Spec = Hardware Bandwidth Limits
 
20 MHz has been a standard bandwidth (and thus noise and interference) limiting filter in 'scopes for about 40 years or so. Although I don’t know the exact reason why 20 MHz was chosen, I believe it had something to do with either standards for power supply noise levels or perhaps a reasonable speed for generic probing of early digital logic circuits. In any event, once the 20 MHz bandwidth was chosen it became the de facto standard, and for consistency it was never changed from model to model, up to and including the scopes we make today.
Hope this helps,
Tektronix Technical Marketing Manager.
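To get a feel for what a 20 MHz limit does to higher-frequency content, here is a small sketch. The actual filter topology inside a scope's bandwidth-limit path is not specified in the thread; a single-pole (first-order) roll-off is assumed here purely as a common simplifying model.

```python
import math

def single_pole_attenuation_db(f_hz, f_cutoff_hz=20e6):
    """Attenuation of an assumed first-order low-pass at frequency f_hz.

    A real scope's BW-limit filter may be a different topology; this
    single-pole model is only an illustrative approximation.
    """
    ratio = f_hz / f_cutoff_hz
    return -10.0 * math.log10(1.0 + ratio ** 2)

for f in (1e6, 20e6, 100e6, 500e6):
    print(f"{f / 1e6:6.0f} MHz: {single_pole_attenuation_db(f):6.1f} dB")
```

Under this model the response is about -3 dB right at 20 MHz and rolls off at roughly 20 dB/decade above it, so content well below 20 MHz passes essentially untouched while fast edges and RF pickup are strongly suppressed.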
 
Cost vs. selling price vs. market expectations/budgets
 
Hi TekScopeGuru, and everyone,

I have a related question for you. My understanding is that power supply vendors, including VREG vendors, usually specify their output noise level at a 20MHz BW. My question is: doesn't this look more like supply DC ripple than noise? Because that is what I see on the scope all the time, across all types of supplies.

If one of these supplies feeds my processor or controller, the full-bandwidth output of the supply would be a big concern to me; therefore, the measurement in that case should be full BW rather than 20 MHz band-limited. My current understanding is that a 20 MHz BW measurement shows DC ripple, while a measurement at whatever the full BW of your scope is shows the real AC noise. Am I correct here?

Can you explain this a little bit more?

Thanks.
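The ripple-vs-noise distinction being asked about can be sketched numerically. Everything below is assumed for illustration only (the 100 kHz ripple frequency, the 5 mV and 2 mV amplitudes, and modeling the 20 MHz limit as a first-order Butterworth low-pass); it is not a description of any particular supply or scope.

```python
import numpy as np
from scipy.signal import butter, lfilter

rng = np.random.default_rng(0)

fs = 1e9                              # 1 GS/s sample rate (illustrative)
t = np.arange(200_000) / fs           # 200 us of samples

# Hypothetical supply output: switching ripple plus broadband noise.
ripple = 5e-3 * np.sin(2 * np.pi * 100e3 * t)   # 5 mV peak "DC ripple"
noise = 2e-3 * rng.standard_normal(t.size)      # broadband "AC noise"
v = ripple + noise

# Model the scope's 20 MHz BW limit as a 1st-order Butterworth low-pass.
b, a = butter(1, 20e6, btype="low", fs=fs)
v_limited = lfilter(b, a, v)

def rms(x):
    return float(np.sqrt(np.mean(x ** 2)))

print(f"full-BW RMS:    {rms(v) * 1e3:.2f} mV")
print(f"20 MHz BW RMS:  {rms(v_limited) * 1e3:.2f} mV")
```

In this model the 100 kHz ripple passes through the 20 MHz limit essentially intact, while most of the wideband noise power is stripped off, which is consistent with the intuition in the question: a 20 MHz-limited reading is dominated by the ripple, and a full-bandwidth reading adds the high-frequency noise on top of it.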
 
