I am not sure if this belongs here or anywhere else, but ok...
Suppose we create an algorithm that does some online job during the data acquisition of a detector run... How can we know whether our algorithm is too slow for our needs?
For example, ATLAS has several trigger levels, and Level 2 + the Event Filter are software-based triggers... I guess that for software all you have to do is feed it some fake data and let the PC measure its runtime, right? Then you know whether it can process the number of events you want within your time limits. (e.g. I have read that L2 operates at about 50 ms/event)
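To make that concrete, here is a minimal sketch of the kind of offline timing test I have in mind, assuming a hypothetical process_event function standing in for the real algorithm and random numbers standing in for detector data (this is not actual ATLAS trigger code, just an illustration of the "feed it fake data and time it" idea):

```cpp
#include <chrono>
#include <cstdio>
#include <random>
#include <vector>

// Hypothetical stand-in for the online algorithm under test:
// here it just does some arithmetic over the "event" data.
double process_event(const std::vector<double>& event) {
    double sum = 0.0;
    for (double x : event) sum += x * x;
    return sum;
}

int main() {
    // Generate fake events (random numbers standing in for detector data).
    std::mt19937 rng(42);
    std::uniform_real_distribution<double> dist(0.0, 1.0);
    const int n_events = 1000;
    const std::size_t event_size = 10000;

    std::vector<std::vector<double>> events(n_events, std::vector<double>(event_size));
    for (auto& ev : events)
        for (auto& x : ev) x = dist(rng);

    // Time the algorithm over all fake events and compute the mean per-event cost.
    double checksum = 0.0;  // accumulate results so the work isn't optimised away
    auto t0 = std::chrono::steady_clock::now();
    for (const auto& ev : events) checksum += process_event(ev);
    auto t1 = std::chrono::steady_clock::now();

    double total_ms    = std::chrono::duration<double, std::milli>(t1 - t0).count();
    double per_event_ms = total_ms / n_events;

    const double budget_ms = 50.0;  // e.g. the ~50 ms/event figure quoted for L2
    std::printf("checksum %.1f, mean %.3f ms/event (budget %.1f ms) -> %s\n",
                checksum, per_event_ms, budget_ms,
                per_event_ms <= budget_ms ? "OK" : "too slow");
    return 0;
}
```

In a real test one would of course also look at the tail of the per-event time distribution, not just the mean, since a few very slow events can matter as much as the average.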
But what about hardware triggers (like L1)? That level of course takes all the "hard work", since it sees all the events (40 events/μs) and has to decide within 2.5 μs. My problem is not how it can do that [just have it make a rough decision] but how we know that it operates within such times.