How to know the runtime of triggers

  • #1

ChrisVer

I am not sure if this belongs here or somewhere else, but here goes...
Suppose we create an algorithm that does some online job during a detector run's data acquisition... How can we know whether our algorithm is too slow for our needs?
For example, ATLAS has several trigger levels, and Level 2 + the Event Filter are software-based triggers... I guess that for software all you have to do is feed it some fake data and let the PC measure its runtime, right? Then you know whether it can process the number of events you want within your time limits (e.g. I have read that L2 operates at ~50 ms/event).
But what about hardware triggers (like L1), which of course does all the "hard work", since it sees every event (40 events/μs) and has to decide within 2.5 μs? My problem is not how it can do this [just make it give a rough decision], but how we know that it operates within such times.
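A minimal sketch of that software-side idea: time a stand-in algorithm over fake events and compare the mean per-event time to the budget. trigger_decision, the fake-event size, and the 50 ms figure below are all invented for illustration, not taken from any real trigger code.

Code:
#include <chrono>
#include <cstdio>
#include <random>
#include <vector>

// Hypothetical stand-in for the real software trigger algorithm.
bool trigger_decision(const std::vector<double>& hits) {
    double sum = 0.0;
    for (double h : hits) sum += h;   // pretend reconstruction work
    return sum > 5000.0;
}

int main() {
    std::mt19937 rng(1);
    std::uniform_real_distribution<double> hit(0.0, 1.0);

    const int n_events = 1000;
    int accepted = 0;                 // also keeps the optimizer honest
    auto t0 = std::chrono::steady_clock::now();
    for (int i = 0; i < n_events; ++i) {
        std::vector<double> fake_event(10000);   // fake data
        for (double& h : fake_event) h = hit(rng);
        accepted += trigger_decision(fake_event);
    }
    auto t1 = std::chrono::steady_clock::now();

    double ms_per_event =
        std::chrono::duration<double, std::milli>(t1 - t0).count() / n_events;
    std::printf("%.3f ms/event (budget ~50 ms/event), accepted %d\n",
                ms_per_event, accepted);
}

If the measured ms/event, on hardware comparable to the actual farm nodes, stays well under the budget at the expected input rate, the algorithm fits.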
 

Answers and Replies

  • #2
You can simulate the hardware (you have to develop that simulation anyway as part of the design), and once you have it as physical hardware, you can also feed it simulated data directly at the hardware level to test it.
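A toy cycle-level model of why a synchronous hardware trigger's latency is known by construction: in a fixed-depth pipeline the latency is simply depth × clock period, so simulation mainly has to confirm the depth and the decision logic. PIPELINE_DEPTH, the fake energy spectrum, and rough_decision below are all made up for illustration; a real design would be verified in an HDL testbench, not in C++.

Code:
#include <cstddef>
#include <cstdio>
#include <deque>
#include <random>

constexpr std::size_t PIPELINE_DEPTH = 80;   // clock ticks from input to decision
constexpr double CLOCK_NS  = 25.0;           // 40 MHz bunch-crossing clock
constexpr double BUDGET_NS = 2500.0;         // 2.5 us L1 latency budget

struct Event { double energy; };

// Rough decision, as in the original post: a simple threshold.
bool rough_decision(const Event& e) { return e.energy > 20.0; }

int main() {
    std::mt19937 rng(42);
    std::exponential_distribution<double> spectrum(0.1);  // fake energy spectrum
    std::deque<Event> in_flight;                          // events inside the pipeline

    int accepted = 0;
    for (int tick = 0; tick < 1000000; ++tick) {
        in_flight.push_back({spectrum(rng)});             // one event per bunch crossing
        if (in_flight.size() > PIPELINE_DEPTH) {          // decision emerges after a fixed delay
            accepted += rough_decision(in_flight.front());
            in_flight.pop_front();
        }
    }
    const double latency_ns = PIPELINE_DEPTH * CLOCK_NS;
    std::printf("latency %.0f ns (budget %.0f ns), accepted %d\n",
                latency_ns, BUDGET_NS, accepted);
}

The point is that the latency never depends on the data: every event spends exactly PIPELINE_DEPTH ticks in flight, which is how a hardware trigger can guarantee a figure like 2.5 μs.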
 
  • #3
You can set an external bit when the routine is called, then clear it when the event has been handled.
Watching that pin with an oscilloscope will show you the time needed to handle the event; trigger the scope on the leading edge.
For a hardware interrupt you can set an RS flip-flop with the interrupt signal itself, then clear the flip-flop from software when done.
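A minimal sketch of that pin-toggle technique, assuming a memory-mapped GPIO with separate set/clear registers; the register addresses, PIN_MASK, and handle_event are placeholders, so check your board's datasheet for the real ones.

Code:
#include <cstdint>

// Hypothetical memory-mapped GPIO set/clear registers (placeholder addresses).
volatile std::uint32_t* const GPIO_SET =
    reinterpret_cast<volatile std::uint32_t*>(0x40020018);
volatile std::uint32_t* const GPIO_CLR =
    reinterpret_cast<volatile std::uint32_t*>(0x4002001C);
constexpr std::uint32_t PIN_MASK = 1u << 5;   // hypothetical scope pin

void handle_event() { /* the trigger routine under test */ }

// ISR: the pin is high exactly while the event is being handled,
// so the pulse width on the oscilloscope is the handling time.
extern "C" void trigger_isr() {
    *GPIO_SET = PIN_MASK;   // leading edge: handling starts
    handle_event();
    *GPIO_CLR = PIN_MASK;   // trailing edge: handling done
}

With the flip-flop variant the leading edge comes from the interrupt line itself, so the pulse also captures the time between the hardware event and the first instruction of the handler.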
 
