A 50.0 kg rocket is launched straight up (we'll call this the y direction). Its motor produces a constant acceleration for 10.5 s and then stops. At t = 12.5 s the rocket's altitude is 333 m. (Ignore air resistance and take g = 9.80 m/s^2.)
a. What is the rocket's acceleration during the first 10.5 seconds?
Relevant equations:
$$d = v_0 t + \tfrac{1}{2} a t^2$$
$$v = v_0 + a t$$
The Attempt at a Solution
So, I keep trying to model this question with two equations, setting d1 = 333 - d2, where d1 is the distance traveled while the motor is firing (the first 10.5 s) and d2 is the distance covered in the remaining 2.0 s out of the 333 m total. Using this I try to find a1, the rocket's acceleration during the first 10.5 s, taking a2 = -9.8 m/s^2 (gravity alone) once the motor stops.
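Written out explicitly, the setup I have in mind is the following (with t1 = 10.5 s for the burn and t2 = 2.0 s for the coast; I'm assuming the rocket starts from rest, and the names t1, t2, v1 are just my labels):
$$d_1 = \tfrac{1}{2} a_1 t_1^2, \qquad v_1 = a_1 t_1$$
$$d_2 = v_1 t_2 - \tfrac{1}{2} g t_2^2$$
$$d_1 + d_2 = 333\ \text{m}$$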
After running through this, I got an acceleration of 3.29 m/s^2 for the rocket, but when I use that value to check the total distance (distance traveled in the first 10.5 s plus the distance traveled in the last 2.0 s), I only get something like 240 m, which is obviously wrong.
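Here's the quick check I mean, as a Python snippet (a minimal sketch under the same assumptions as above, just plugging my trial value back into the two-phase model):

```python
# Sanity check: plug the trial value a1 = 3.29 m/s^2 back into the
# two-phase model (burn from rest, then coast under gravity alone).
a1, t1, t2, g = 3.29, 10.5, 2.0, 9.8

d1 = 0.5 * a1 * t1**2           # distance gained during the burn
v1 = a1 * t1                    # speed at motor cutoff
d2 = v1 * t2 - 0.5 * g * t2**2  # distance gained while coasting

print(d1 + d2)                  # ~230.9 m, well short of the 333 m given
```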
What is the best way to approach this problem? Also, if there is a way to solve it using differential equations or linear algebra to make it easier, that would be nice to know as well!
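For the linear-algebra idea, the only phrasing I could come up with is to treat a1, v1, and d1 as unknowns in a small linear system (a rough numpy sketch, still assuming the rocket starts from rest; the matrix layout is just my choice):

```python
import numpy as np

t1, t2, g, H = 10.5, 2.0, 9.8, 333.0  # burn time, coast time, gravity, final altitude

# Unknowns x = [a1, v1, d1]: burn acceleration, burnout speed, burn distance.
# Row 1:  v1 = a1*t1                     ->  -t1*a1 + v1 = 0
# Row 2:  d1 = (1/2)*a1*t1^2             ->  -(t1^2/2)*a1 + d1 = 0
# Row 3:  d1 + v1*t2 - (1/2)*g*t2^2 = H  ->  t2*v1 + d1 = H + (1/2)*g*t2^2
A = np.array([
    [-t1,          1.0, 0.0],
    [-t1**2 / 2.0, 0.0, 1.0],
    [0.0,          t2,  1.0],
])
b = np.array([0.0, 0.0, H + 0.5 * g * t2**2])

a1, v1, d1 = np.linalg.solve(A, b)
print(a1, v1, d1)  # solves all three unknowns at once
```

Though that seems like overkill here, since everything collapses to a single linear equation in a1.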