A plane with mass 1000 kg lands on a stationary 2000 kg barge at an initial velocity of 50 m/s. The only force to consider is the braking force, which is 1/4 of the plane's weight (2450 N). How long does the barge have to be if the plane lands at one end of the barge and stops at the other end?

Here's what my friend did:

1. Calculated the acceleration (or rather deceleration) of the plane, which is -2.45 m/s².
2. Used v2 = v1 + at to determine the time, which works out to about 20.4 s.
3. Used conservation of momentum to determine the final velocity of the plane-barge system, which is 16.67 m/s.
4. Multiplied the velocity from step 3 by the time from step 2 to get the distance: 340 m (the correct answer, BTW).

My question is: how does this work? I understand step 1, and step 2 sort of makes sense. Step 3 makes sense too, but that's the velocity of the plane and barge moving together, so the way I understand it, the plane ends up at rest relative to the barge. Step 4 only works out if I assume the barge does not move. I'm just really confused by all of this.
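To see where the numbers come from, here is a quick sanity check in Python. It reproduces the friend's four steps and, alongside them, computes the sliding distance directly in the barge's (accelerating) reference frame, where the braking force decelerates the plane and simultaneously accelerates the barge. Both routes are just the arithmetic implied by the problem statement; the only assumption added here is g = 9.8 m/s² (consistent with the quoted 2450 N).

```python
# Sanity check of the plane-on-barge problem (assumes g = 9.8 m/s^2).
m_plane = 1000.0     # kg
m_barge = 2000.0     # kg
v0 = 50.0            # m/s, plane's landing speed
g = 9.8              # m/s^2
F = m_plane * g / 4  # braking force = 1/4 of the plane's weight = 2450 N

# --- The friend's route ---
a_plane = F / m_plane                           # step 1: 2.45 m/s^2 deceleration
t = v0 / a_plane                                # step 2: ~20.4 s (from v2 = v1 + a*t with v2 = 0)
v_common = m_plane * v0 / (m_plane + m_barge)   # step 3: ~16.67 m/s (momentum conservation)
d_friend = v_common * t                         # step 4: ~340 m

# --- Relative-motion route ---
a_barge = F / m_barge          # the braking force also pushes the barge forward: 1.225 m/s^2
a_rel = a_plane + a_barge      # closing deceleration between plane and deck: 3.675 m/s^2
d_rel = v0**2 / (2 * a_rel)    # distance the plane slides along the deck before stopping

print(f"friend's distance:        {d_friend:.1f} m")
print(f"relative-motion distance: {d_rel:.1f} m")
```

Both print about 340 m, which is the minimum barge length. (Note that the two routes agreeing is not guaranteed in general; it is worth checking algebraically whether it depends on the particular masses in this problem.)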