A_User

I wrote a program that adds force to a car like so:

*Engine Force = Power / Velocity*

*Drag Force = -Velocity²*

*Net Force = Engine Force + Drag Force = Power / Velocity - Velocity²*
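To make the setup concrete, here is a minimal Python sketch of the kind of per-step integration loop I'm describing (not my actual program — the Euler time step and the small nonzero starting velocity, used to avoid dividing by zero, are arbitrary choices):

```python
# Sketch of the force model, assuming simple Euler integration.
def simulate(power, mass, duration, dt=0.001):
    v = 0.1  # m/s; small nonzero start to avoid dividing by zero
    t = 0.0
    while t < duration:
        engine_force = power / v   # Engine Force = Power / Velocity
        drag_force = -v ** 2       # Drag Force = -Velocity²
        net_force = engine_force + drag_force
        v += net_force / mass * dt  # a = F / m
        t += dt
    return v  # speed reached after `duration` seconds
```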

I'd like to determine power based on how fast I want a car of a given mass to reach a given speed, for example, 0 to 20 m/s in 10 seconds for a 1000 kg car. Here is the equation I came up with to calculate the power needed without any drag:

*Power = Force * Velocity = (Mass * Average Acceleration) * (Displacement / Time)*

Using it to solve the example:

*Mass = 1000 kg*

*Time = 10 s*

*Average Acceleration = 20 / 10 = 2 m/s²*

*Displacement = 0.5 * 2 * 10² = 100 m*

*Power = (1000 * 2) * (100 / 10) = 20,000 W to go from 0 to 20 m/s in 10 seconds for a 1000 kg car*
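As a sanity check, the no-drag case can also be integrated directly: with constant power, *m · v · dv = Power · dt*, so *v(t) = √(2 · Power · t / Mass)*, which confirms the figure:

```python
# Closed-form check of the no-drag case: with constant power P,
# m*v*dv = P*dt  =>  v(t) = sqrt(2*P*t/m).
import math

P, m, t = 20000.0, 1000.0, 10.0
v = math.sqrt(2 * P * t / m)
print(v)  # 20.0 m/s after 10 s, exactly as intended
```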

I'm not sure if that equation is sound, but it calculates the power needed in a vacuum perfectly. The problems start when I try to account for drag:

*Net Force = Power / Velocity - Velocity²*

*Net Force + Velocity² = Power / Velocity*

*Power = (Net Force + Velocity²) * Velocity*

*Power = [(Mass * Average Acceleration) + (Displacement / Time)²] * (Displacement / Time) = 21,000 W*

This is incorrect. When I run my program with the calculated power, it takes the car ~11.5 s to reach 20 m/s, not 10 s. By trial and error, I determined that the correct value is ~23,500 W.

I've tried all day to find the correct equation with no success. What am I doing wrong?