
How to know the runtime of triggers

  May 6, 2016 #1

    Gold Member

    I am not sure if this belongs here or somewhere else, but OK...
    Suppose we create an algorithm that does some online job during data acquisition in a detector run. How can we know whether our algorithm is too slow for our needs?
    For example, ATLAS has several trigger levels, and Level 2 plus the Event Filter are software-based triggers. I guess that for software all you have to do is feed it some fake data and let the PC measure its runtime, right? Then you know whether it can process the data rate you want within your time limits (e.g. I have read that L2 operates at about 50 ms/event).
    But what about hardware triggers like L1? That level of course takes all the "hard work", since it sees every bunch crossing (40 events/μs) and has to decide within 2.5 μs. My problem is not how it can do that [just make it take a rough decision], but how we know that it actually operates within such times.
  May 7, 2016 #2

    Staff: Mentor

    You can simulate the hardware (you have to develop it anyway...), and once you have it as physical hardware, you can also feed it simulated data directly at the hardware level to test it.
  May 7, 2016 #3

    Science Advisor

    You can set an external bit when the routine is called, then clear it when the event has been handled.
    Watching that pin with an oscilloscope will show you the time needed to handle the event; trigger the scope on the leading edge.
    For a hardware interrupt you can set an RS flip-flop with the hardware interrupt signal, then clear the flip-flop in software when done.
