cool_dude
#1
Sep15-07, 01:28 PM
1. The problem statement, all variables and given/known data
A plane accelerates from rest at a constant rate of 5.00 m/s^2 along a runway that is 1800 m long. Assume that the plane reaches the required takeoff velocity at the end of the runway. What is the time t_{TO} needed to take off?


2. Relevant equations
I think the formula I'm supposed to use is d = v_0 t + (1/2)a t^2


3. The attempt at a solution
Not exactly sure if this is the correct formula, but that's what I thought I had to use. We know d = 1800 m, the initial velocity is 0, and a = 5 m/s^2, so isolating t gives 1800 = (1/2)(5)t^2, so t^2 = 1800 / 2.5 = 720 and t = sqrt(720) ≈ 26.8 s. (My first try gave t = 1800 / 2.5 = 720 s, but that dropped the square on t, and it's not realistic for a plane to spend 12 minutes on the runway.) Does this look right?
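A quick numeric sanity check of the takeoff time (plain Python; the variable names are just mine, not from the problem):

```python
import math

a = 5.0      # constant acceleration, m/s^2
d = 1800.0   # runway length, m

# Starting from rest, d = (1/2) a t^2, so t = sqrt(2 d / a).
t_to = math.sqrt(2 * d / a)
print(round(t_to, 1))  # about 26.8 s, not 720 s
```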


Also part of the question: what is the distance d_last traveled by the plane in the last second before taking off?

I tried to use the same formula, d = v_0 t + (1/2)a t^2. Since we figured out the takeoff time in part (a), I substituted (t - 1) to get the distance covered up to one second before takeoff, and then subtracted that from 1800 m to get the distance traveled in the last second. Not sure if any of this is right. Can someone verify?
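That subtraction approach can be sketched numerically like this (assuming the takeoff time from part (a); again, variable names are my own):

```python
import math

a = 5.0      # acceleration, m/s^2
d = 1800.0   # runway length, m
t_to = math.sqrt(2 * d / a)  # takeoff time from part (a)

# Distance covered in the first (t_to - 1) seconds:
d_before = 0.5 * a * (t_to - 1) ** 2
# Distance traveled in the last second = full runway minus that:
d_last = d - d_before
print(round(d_last, 1))  # about 131.7 m
```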

Thank you