It's been at least 5 or 6 years since I've done any form of calculus, so I'm very rusty :( I'm trying to determine the time it takes to reach constant velocity for a motor I'm trying to spec out/replace.

Here's what I've got: a motor, initially at rest, drives a platform and comes to rest again at 6.3 s. I'm considering linear motion only. Once the motor is turned on, I assume the acceleration increases linearly until it becomes constant (and the velocity eventually becomes constant). I know this is not a realistic assumption; however, it gives me a good idea of the values. This gives rise to 3 segments on a graph and 4 time points: t0 = 0 s, t1, t2, and t3 = 6.3 s. For 0 < t < t1 the acceleration increases linearly, for t1 < t < t2 the acceleration is constant, and for t > t2 the acceleration is 0.

My additional values are as follows: the constant acceleration is 45 mm/s^2 and the constant velocity is 6 mm/s. In other words:

a(0) = v(0) = 0
a(t1) = 45 mm/s^2
a(t2) = a(t3) = 0
v(t2) = v(t3) = 6 mm/s

I've tried solving this in the 3 segments outlined above.

Segment 1, 0 < t < t1: a(t) = (45/t1)*t, so v(t) = (45/(2*t1))*t^2 + C. Evaluating the definite integral of a(t) from 0 to t1 gives v(t1) = (45*t1)/2.

I want to solve for t1, and this is where I get lost. I say that t must equal t1 in order for the acceleration to reach 45. That doesn't make sense, though, because t1 could be any real number, which doesn't help me. Moving forward anyway:

Segment 2, t1 < t < t2: a(t) = 45, so v(t) = 45*t + C. Evaluating the definite integral of a(t) from t1 to t2 gives v(t2) - v(t1) = 45*(t2 - t1).

Now I want to solve for t2, using the t1 found from the equation above. If I say t1 = 1 s, then t2 = 1.133 s. This seems to make sense to me, but I can't be 100% certain. Any help is appreciated.
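In case it helps, here's a small Python sketch of the piecewise velocity profile I described, so the numbers are easy to check. The function name `velocity` is just a helper I made up, and the trial values t1 = 1 s and t2 = 1.133 s are the ones from my attempt above:

```python
# Piecewise velocity under the linear-ramp assumption (units: mm, s).
# Segment 1 (0 <= t <= t1): a(t) = (45/t1)*t  ->  v(t) = 45*t**2 / (2*t1)
# Segment 2 (t1 <= t <= t2): a(t) = 45        ->  v(t) = v(t1) + 45*(t - t1)
# Segment 3 (t >= t2):       a(t) = 0         ->  v(t) = v(t2)

def velocity(t, t1, t2, a_max=45.0):
    if t <= t1:
        return a_max * t**2 / (2.0 * t1)
    v_t1 = a_max * t1 / 2.0          # velocity already built up during the ramp
    if t <= t2:
        return v_t1 + a_max * (t - t1)
    return v_t1 + a_max * (t2 - t1)  # constant from t2 onward

# Trial values from my attempt: t1 = 1 s, t2 = 1.133 s
print(velocity(1.0, 1.0, 1.133))    # v(t1) = 22.5 mm/s
print(velocity(1.133, 1.0, 1.133))  # v(t2) ~ 28.5 mm/s, not the 6 mm/s target
```

When I evaluate it at my trial values, v(t2) comes out around 28.5 mm/s rather than 6 mm/s, which makes me suspect I'm mishandling the velocity built up during the ramp. That's part of why I can't be certain of the numbers above.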