How to know the runtime of triggers

  • #1
ChrisVer
Gold Member

Main Question or Discussion Point

I am not sure if this belongs here or somewhere else, but ok...
Suppose we write an algorithm that does some online job during the data acquisition of a detector run... How can we know whether our algorithm is too slow for our needs?
For example, ATLAS has several trigger levels, and Level 2 + the Event Filter are software-based triggers... I guess that for software all you have to do is feed it some fake data and let the PC measure the runtime, right? Then you know whether it can process the amount of data you want within your time limits (e.g. I have read that L2 operates at about 50 ms/event). Something like the sketch at the end of this post.
But what about hardware triggers (like L1)? That one of course does all the "hard work", since it sees every event (40 events/μs) and has to decide within 2.5 μs. My problem is not how it can do that [just make it produce a rough decision], but how we know that it actually operates within such times.
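Just to make the software case concrete, here is a minimal timing sketch in C++. Everything in it (the selection function, the fake events, the event size) is made up for illustration; only the 50 ms/event budget is the figure quoted above.

```cpp
// Minimal sketch (not ATLAS code): time a hypothetical software trigger
// over fake events and compare the mean/worst latency to a 50 ms/event budget.
#include <chrono>
#include <iostream>
#include <random>
#include <vector>

// Stand-in for the real selection algorithm; the name and logic are made up.
bool passesLevel2(const std::vector<double>& event) {
    double sum = 0.0;
    for (double x : event) sum += x * x;   // pretend reconstruction work
    return sum > 100.0;                    // pretend selection cut
}

int main() {
    std::mt19937 rng(42);
    std::normal_distribution<double> gauss(0.0, 1.0);

    const int nEvents = 10000;
    const double budgetMs = 50.0;          // quoted L2 budget per event
    double worstMs = 0.0, totalMs = 0.0;

    for (int i = 0; i < nEvents; ++i) {
        std::vector<double> event(1000);
        for (double& x : event) x = gauss(rng);   // fake data

        auto t0 = std::chrono::steady_clock::now();
        volatile bool accepted = passesLevel2(event);
        auto t1 = std::chrono::steady_clock::now();
        (void)accepted;

        double ms = std::chrono::duration<double, std::milli>(t1 - t0).count();
        totalMs += ms;
        if (ms > worstMs) worstMs = ms;
    }

    std::cout << "mean latency:  " << totalMs / nEvents << " ms/event\n"
              << "worst latency: " << worstMs << " ms/event\n"
              << "within " << budgetMs << " ms budget? "
              << (worstMs < budgetMs ? "yes" : "no") << "\n";
}
```

The same loop also gives you the tail of the distribution (the worst event), which usually matters more than the mean.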
 

Answers and Replies

  • #2
You can simulate the hardware (you have to develop it anyway...), and once you have it as physical hardware, you can also feed it simulated data directly at the hardware level to test it.
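To illustrate the "measure the latency in the simulation" part: for a fully pipelined, fixed-latency design the bookkeeping is basically clock ticks times the clock period. A toy sketch with a made-up pipeline depth (a real design would be simulated in an HDL testbench, not in C++):

```cpp
// Toy sketch of counting latency through a simulated fixed-latency L1 pipeline.
// The pipeline depth is invented; the 40 MHz clock and 2.5 us budget are the
// numbers from the question.
#include <iostream>

int main() {
    const double clockMHz = 40.0;      // bunch-crossing clock: 40 ticks per microsecond
    const int pipelineStages = 80;     // made-up depth of the L1 logic
    const double budgetUs = 2.5;       // latency budget quoted in the question

    // In a fully pipelined fixed-latency design, every stage takes one clock
    // tick, so the decision appears pipelineStages ticks after the event enters.
    const double latencyUs = pipelineStages / clockMHz;

    std::cout << "simulated L1 latency: " << latencyUs << " us\n"
              << "within " << budgetUs << " us budget? "
              << (latencyUs <= budgetUs ? "yes" : "no") << "\n";
}
```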
 
  • #3
Baluncore
Science Advisor
You can set an external bit when the routine is called, then clear it when the event has been handled.
Watching that pin with an oscilloscope will show you the time needed to handle the event; trigger on the leading edge.
For a hardware interrupt you can set an RS flip-flop with the hardware interrupt signal, then clear the flip-flop in software when done.
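A compilable sketch of the pin-toggling idea; the GPIO calls and the handler below are just stubs standing in for whatever the real platform provides:

```cpp
// Sketch of the pin-toggling trick: raise a spare output pin when the handler
// starts and drop it when it finishes, then read the pulse width on a scope.
// gpio_set/gpio_clear are stand-ins for the real platform's pin functions;
// here they are stubbed out so the sketch compiles on a plain PC.
#include <iostream>

constexpr int kTimingPin = 7;                  // any spare pin wired to the probe

void gpio_set(int pin)   { std::cout << "pin " << pin << " high\n"; }  // stub
void gpio_clear(int pin) { std::cout << "pin " << pin << " low\n";  }  // stub

void handleEvent() {                           // stand-in for the real routine
    volatile double x = 0.0;
    for (int i = 0; i < 100000; ++i) x = x + i;  // pretend work
}

void onTriggerInterrupt() {
    gpio_set(kTimingPin);     // rising edge on the scope: handling starts
    handleEvent();
    gpio_clear(kTimingPin);   // falling edge: handling done
    // The pulse width on the oscilloscope is the time needed per event.
    // In the RS flip-flop variant, the interrupt line sets the flip-flop in
    // hardware and only the clear at the end is done in software, so the
    // measurement also includes the interrupt latency.
}

int main() { onTriggerInterrupt(); }
```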
 
