
Average time to read data

  1. Jul 13, 2016 #1
    Hi there,
    I'm trying to model how much faster we will be able to pull data from a set of controllers.
    I know how long it takes to read a single data point from each controller.
    Right now we are reading each and every data point.

    The change I would like to model is that some of the data isn't read on every pass; instead it is read on the next pass after a set period of time has elapsed.

    So if we have 2 sets of data, A and B:
    Right now each pass reads A and B and takes x seconds.

    The change will result in
    Pass 1: A & B
    Pass 2: A
    Pass 3: A
    ... once enough seconds have elapsed:
    Pass N: A & B
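
    For concreteness (made-up numbers): if A takes 2 s, B takes 2 s, and B only needs to be refreshed every 10 s, the passes would run 4 s, 2 s, 2 s, 2 s, 4 s, 2 s, 2 s, 2 s, ... instead of 4 s every time.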

    How do I figure out the average time it takes to do a pass in the second case??

    Thanks :)
     
  2. Jul 13, 2016 #2

    mfb

    2016 Award

    Staff: Mentor

    How does that look in time?

    "We read A for two seconds, then B for two seconds, then A for two seconds, then A for two seconds,. ..."?
    Then all you need is the average fraction B that gets read, multiply by the time it takes, add the time for A and you have your average time per pass.
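
    (With made-up numbers, keeping the two seconds each from above: if B ends up being read on 1 out of every 4 passes, the average pass is 2 + (1/4)*2 = 2.5 s.)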
     
  3. Jul 13, 2016 #3
    That's what I originally did.
    The problem I ran into is when I follow that through:
    A: time to read the data that is read every loop
    B: time to read the data that is only read occasionally
    C: the fraction of scans on which B is read
    Time = A + C*B

    So now I need to figure out what C is. Since B is read once every M seconds and a pass happens every Time seconds, I figure C = Time/M (M is how often the occasional data needs to be scanned).
    Substituting gives Time = A + (Time/M)*B, and solving for Time I end up with Time = A*M/(M - B).
    Which doesn't seem right.
     
  4. Jul 13, 2016 #4

    mfb

    2016 Award

    Staff: Mentor

    Looks right to me.
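
    A quick way to double-check it is a small simulation, sketched below with made-up values for A, B and M, and assuming B is read on the first pass after at least M seconds have elapsed since its last read:

    # minimal simulation sketch -- the values of A, B and M are made up
    A = 2.0    # seconds to read the data that is read on every pass
    B = 2.0    # seconds to read the data that is read only occasionally
    M = 10.0   # the occasional data must be refreshed at least every M seconds

    t = 0.0
    last_b = -M            # forces a B read on the very first pass
    total, passes = 0.0, 100000
    for _ in range(passes):
        d = A                        # every pass reads A
        if t - last_b >= M:          # time to refresh B?
            d += B
            last_b = t
        t += d
        total += d

    print("simulated average pass:", total / passes)   # 2.5 with these values
    print("A*M/(M - B) =", A * M / (M - B))            # also 2.5

    With these particular values the match is exact because four passes fit evenly into the 10 s window; with other values the simulated average comes out close to (typically just under) A*M/(M - B), since B's read interval gets rounded up to a whole number of passes.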
     