1. The problem statement, all variables and given/known data
A car's velocity as a function of time is given by v_x(t) = a + bt^2, where a = 3.00 m/s and b = 0.100 m/s^3. Calculate the average acceleration over the interval t = 0 to t = 5.00 s.

2. Relevant equations
Average acceleration = (v2 - v1)/(t2 - t1)

3. The attempt at a solution
After plugging in the numbers I get:
(3 + 0.1t^2)/(5 - 0) = 1.1 m/s
which was marked wrong after two attempts online. I even tried 1.10 in case the system checks sig figs, but it's still wrong. Does it have something to do with the m/s^3? Thanks.
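Since the answer is being checked numerically, here is a minimal Python sketch that evaluates the relevant equation from section 2 literally, term by term (the function name `v` and variable names are my own):

```python
# Given: v_x(t) = a + b*t^2, with a = 3.00 m/s and b = 0.100 m/s^3
a = 3.00   # m/s
b = 0.100  # m/s^3

def v(t):
    """Velocity in m/s at time t (in seconds)."""
    return a + b * t**2

t1, t2 = 0.0, 5.00  # interval endpoints in seconds

# Average acceleration = (v2 - v1) / (t2 - t1); the result carries units of m/s^2
avg_accel = (v(t2) - v(t1)) / (t2 - t1)
print(avg_accel)  # 0.5
```

Note that the formula's numerator is the *difference* v(t2) − v(t1), and that an average acceleration has units of m/s^2, not m/s.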