# Motion Problem

1. Sep 21, 2009

1. The problem statement, all variables and given/known data
A cat running at a constant speed of 5.0 m/s runs by a dog sitting on a driveway. The dog gives chase 5 seconds later, accelerating at 0.5 m/s^2.
a) How long does it take the dog to catch up with the cat? b) How far from the driveway is the dog when it catches the cat?

I know I have to make a data table, but I don't know what to plug in or which equations to use. Please help.

2. Sep 21, 2009

### jambaugh

I don't know about a data table. I would tackle this using the equations of motion.

Have you studied equations of motion under constant acceleration?

$$x(t) = x_0 + v_0 t + \frac{1}{2} a\cdot t^2$$
where x(t) is the position at time t given initial position, initial velocity and constant acceleration.

The cat is not accelerating so its position will be simply:
$$x(t) = x_0 + vt$$

Write the equations for the dog's and the cat's positions as functions of time. You may want to set t = 0 to be when the dog starts running (figure out how far ahead the cat is at that moment; that is the cat's initial position).

The dog catches the cat when they both have the same position.
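Setting the two position expressions equal gives a quadratic in t. As a quick sanity check of that last step, here is a small Python sketch using the numbers from the problem, with t = 0 taken as the moment the dog starts running (variable names are just illustrative):

```python
import math

# Given values from the problem; t = 0 is when the dog starts running.
v_cat = 5.0        # cat's constant speed, m/s
a_dog = 0.5        # dog's acceleration, m/s^2
head_start = 5.0   # seconds the cat runs before the dog starts

x0_cat = v_cat * head_start   # cat's position when the dog starts (25 m)

# Equal positions: (1/2) a t^2 = x0 + v t  ->  (1/2) a t^2 - v t - x0 = 0
A, B, C = 0.5 * a_dog, -v_cat, -x0_cat
t_catch = (-B + math.sqrt(B * B - 4 * A * C)) / (2 * A)  # take the positive root

x_catch = 0.5 * a_dog * t_catch ** 2  # dog's distance from the driveway

print(f"t = {t_catch:.1f} s, x = {x_catch:.1f} m")
```

This gives roughly t ≈ 24.1 s after the dog starts, about 146 m from the driveway, which you should confirm by solving the quadratic by hand.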

3. Sep 21, 2009

Oh ok, thanks a lot, I understand it now.