OK, I worked through some scenarios on paper and the solution looks stupidly simple: the probability of source B (running at 250 ms intervals) being ahead of source A (running at 100 ms intervals) is 0.1. I'd still be interested if anyone can prove this more elegantly, though.
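For what it's worth, since the original question text is cut off below, I had to assume a precise meaning of "ahead" to check this: namely that at a random observation instant, the slower source's most recent output is more recent than the faster source's, with each source's phase being an independent uniform offset. Under that assumed definition (which may not match the intended one, so the printed estimate is only a sanity check, not a confirmation of the 0.1 figure), a quick Monte Carlo sketch looks like this; the constants and names are just placeholders for this assumed setup:

```python
import random

FAST_PERIOD_MS = 100.0   # source emitting every 100 ms
SLOW_PERIOD_MS = 250.0   # source emitting every 250 ms
TRIALS = 1_000_000

slow_ahead = 0
for _ in range(TRIALS):
    # Age of each source's most recent output at a random observation
    # instant, assuming an independent, uniformly random phase per source.
    fast_age = random.uniform(0.0, FAST_PERIOD_MS)
    slow_age = random.uniform(0.0, SLOW_PERIOD_MS)
    # "Ahead" (assumed definition): the slow source's latest output is newer.
    if slow_age < fast_age:
        slow_ahead += 1

print(f"Estimated P(slow source ahead): {slow_ahead / TRIALS:.4f}")
```

If "ahead" means something else in the original problem (e.g. ahead by some margin, or measured over a full 500 ms cycle), the same skeleton works; only the comparison inside the loop needs to change.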
Hi
My probability theory is rusty and I'm a little embarrassed I can't figure this one out, but:
Suppose we have two sources A and B that produce outputs at different frequencies: say A produces an output every 100 ms and B produces an output every 250 ms. Obviously, even if they are perfectly...