I am trying to come up with a good data-rate formula for a system. The system sends 'x' symbols per transaction. Transactions take place sequentially, but there is a delay 'z' between them; 'z' is variable and can even be pseudo-random. Each transaction also takes some time, but that time is constant. I tried to derive a formula for the number of symbols sent by time 't' (in seconds): sent_t = t*s (transactions/second) - z*t, but that didn't quite work out. Where can I go from here? Since 'z' can be random, I could either treat 'z' as constant for the formula, or make the formula an average data rate... Thank you
You seem to be talking about the effects of phase noise or jitter on error rate here. I Googled those terms and there were loads of hits. Trawl through a few of them and see if anything suits your level. That could be more efficient for you, initially. You could then come back with some more specific queries.
Let's call the delay time Tz, the packet time Tp, and the number of packets n. For convenience, assume there are an equal number of packets and delays (you can drop the last Tz if you want). Since every Tz is unique, you need to sum those delays, (sum)Tz (i.e. Tz1 + Tz2 + ... + Tzn), and add the time for the n packets, n*Tp, to get the total time.

You cannot determine how many packets fit in a given time without an expression for the inter-packet delay. If you know the number of packets, you can determine the total inter-packet delay, but not the specific delay between any two sequential packets.

That should move you forward a bit. Post a clarification if I misinterpreted what you are asking.
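To make that concrete, here is a minimal sketch in Python of the total-time sum and the resulting average data rate. The names (total_time, average_rate, n, x, Tp) are mine, and I'm assuming the asker's 'x' symbols per transaction and a uniform pseudo-random delay just as an example distribution; this is an illustration of the bookkeeping, not the only way to do it.

```python
import random

def total_time(delays, n_packets, t_packet):
    """Total elapsed time: n packets at t_packet each, plus the
    sum of the individual (possibly random) inter-packet delays."""
    return sum(delays) + n_packets * t_packet

def average_rate(n_packets, symbols_per_packet, delays, t_packet):
    """Average symbols/second over the whole run:
    total symbols sent divided by total elapsed time."""
    return n_packets * symbols_per_packet / total_time(delays, n_packets, t_packet)

# Example: 1000 packets of 64 symbols, 1 ms per packet,
# pseudo-random delays uniform in [0, 2 ms] (mean 1 ms).
random.seed(0)
n, x, Tp = 1000, 64, 1e-3
delays = [random.uniform(0.0, 2e-3) for _ in range(n)]
print(average_rate(n, x, delays, Tp))  # close to x / (Tp + mean delay)
```

With many packets, sum(Tz) approaches n times the mean delay, so the average rate settles near x / (Tp + E[Tz]) symbols per second, which is the "average data-rate" version of the formula the asker mentioned.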