
Why do scopes BW limit to 20MHz?

  1. Jul 18, 2011 #1

    es1


    While messing around with some measurements the other day it dawned on me that pretty much every scope I've ever used [*] that has a bandwidth limit setting has a min setting at 20MHz.

    What I am wondering is, why 20MHz?

    It seems like more than a coincidence that every vendor & model would pick 20MHz. I am sure there are scopes with other min settings in existence but this definitely seems like the most common choice.


    * For me this means scopes built in 1990 & onward.
     
  3. Jul 18, 2011 #2

    Averagesupernova

    Science Advisor
    Gold Member

    What do you mean by minimum setting? Almost all scopes built in the last 20 years go down to DC, and many have an upper limit above 20 MHz.
     
  4. Jul 18, 2011 #3

    es1


    I mean the built-in bandwidth-limiting hardware settings. Often there are a couple in a drop-down menu, like: 20MHz, 150MHz & Full.

    It seems like the minimum setting is almost always 20MHz (at least it is for all the Agilent, Tek and LeCroy scopes I've seen lately). I was just wondering if there was a reason for 20MHz being the minimum, as opposed to 10MHz or something.

    See for example this scope data sheet:
    http://www.testunlimited.com/pdf/Tektronix_MSO_DPO3000.pdf
    Page 11: Spec = Hardware Bandwidth Limits
     
  5. Jul 20, 2011 #4
    20 MHz has been a standard bandwidth (and thus noise and interference) limiting filter in 'scopes for about 40 years or so. Although I don’t know the exact reason why 20 MHz was chosen I believe that it had something to do with either standards for power supply noise levels or perhaps it was a reasonable speed for generic probing of early digital logic circuits. In any event, once the 20 MHz bandwidth was chosen it became the de facto standard and for consistency was never changed from model to model up to and including the scopes we make today.
    Hope this helps,
    Tektronix Technical Marketing Manager.
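A 20 MHz bandwidth limit behaves, to a first approximation, like a single-pole low-pass filter. As a rough sketch (this assumes an ideal first-order response, which real scope limit filters only approximate), the attenuation at a few frequencies can be computed like this:

```python
import math

def single_pole_attenuation_db(f_hz, f_cutoff_hz=20e6):
    """Attenuation in dB of a first-order low-pass at frequency f_hz.

    Gain magnitude of a single-pole filter: 1 / sqrt(1 + (f/fc)^2),
    which gives the familiar -3 dB point at the cutoff frequency.
    """
    gain = 1.0 / math.sqrt(1.0 + (f_hz / f_cutoff_hz) ** 2)
    return 20.0 * math.log10(gain)

for f in (1e6, 20e6, 100e6, 500e6):
    print(f"{f/1e6:6.0f} MHz: {single_pole_attenuation_db(f):6.1f} dB")
```

Signals at 1 MHz pass essentially untouched, the cutoff itself sits at -3 dB, and high-frequency noise at 500 MHz is knocked down by roughly 28 dB, which is why the limit cleans up broadband noise without disturbing low-frequency ripple.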
     
  6. Jul 23, 2011 #5
    Cost vs. selling price vs. market expectations/budgets
     
  7. Aug 16, 2011 #6
    Hi TekScopeGuru, and everyone,

    I have a related question for you. My understanding is that power supply vendors, including voltage regulator (VREG) vendors, usually specify their output noise level at a 20MHz BW. My question is: does that figure reflect supply DC ripple rather than noise? That is what I see on the scope every time, on every type of supply.

    If one of these supplies feeds my processor or controller, the full-bandwidth noise of the supply output would be a big concern to me; in that case I would measure at the scope's full BW rather than 20 MHz band-limited. My current understanding is that a 20 MHz BW measurement captures DC ripple, while a measurement at the scope's full BW captures real AC noise. Am I correct here?

    Can you explain this in a little more detail?

    Thanks.
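One way to see why the measurement bandwidth matters so much for the noise part (as distinct from ripple, which is a low-frequency tone that passes through a 20 MHz limit unchanged): for flat, white noise the integrated RMS grows with the square root of the bandwidth. A rough sketch, where the 50 nV/√Hz density is a made-up illustration value, not a spec from any real regulator:

```python
import math

def rms_noise_uV(density_nV_per_rtHz, bandwidth_hz):
    """Integrated RMS noise in microvolts for a flat (white) noise
    density, assuming an ideal brickwall measurement bandwidth."""
    return density_nV_per_rtHz * math.sqrt(bandwidth_hz) / 1000.0

density = 50.0  # nV/sqrt(Hz), hypothetical flat output-noise density
for bw in (20e6, 500e6):
    print(f"{bw/1e6:5.0f} MHz BW: {rms_noise_uV(density, bw):7.1f} uV rms")
```

With these assumptions the 500 MHz full-bandwidth reading comes out sqrt(500/20) = 5 times higher than the 20 MHz band-limited one, even though the underlying supply noise is identical, which is why a datasheet number specified at 20 MHz cannot be compared directly to a full-BW scope measurement.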
     