electrodruid
I'm a game programmer trying to write code to simulate a variety of cars braking at different speeds and on surfaces with different coefficients of friction. For now, consider that I'm interested in a fairly wide range of cars (people carriers, hatchbacks, saloons, sports cars etc) but that they're all equipped with the same decent-quality tyres and are driving in a straight line on dry asphalt. I'm not considering thinking time, just the actual braking time/distance.
So, the Wikipedia article on braking distance tells me that the distance is calculated like this:
d = v^2 / (2 * mu * g)
The mass of the vehicle isn't a factor because it cancels out of the equations that lead to this one, gravity is constant, mu is (in my example) constant, so really everything is relative to v^2 and nothing else. So, at 60mph (26.8224 m/s), with mu = 0.7 and g = 9.8, the stopping distance is always going to be ~52.43 metres, right? Regardless of whether the vehicle is a Land Rover or a Ferrari.
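As a quick sanity check, the formula above can be evaluated directly (a minimal sketch; the 0.44704 mph-to-m/s factor and the default mu = 0.7, g = 9.8 are just the values from this example):

```python
def stopping_distance(v_mps, mu=0.7, g=9.8):
    """Ideal straight-line stopping distance in metres from speed v (m/s),
    assuming constant deceleration at the limit of friction:
    d = v^2 / (2 * mu * g). No vehicle-specific terms appear."""
    return v_mps ** 2 / (2 * mu * g)

v60 = 60 * 0.44704            # 60 mph = 26.8224 m/s
print(stopping_distance(v60))  # -> ~52.44 m, the figure quoted above
```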
Thing is, I've got a copy of Autocar magazine here, which features road tests for a whole bunch of different vehicles, including 60-0 mph braking times. I know I'm comparing braking distance to braking time now, but their road tests say that a Ferrari F12 stops in 2.2 seconds, while a Land Rover Defender takes 3.5 seconds. Even comparing more similar vehicles like the Peugeot 3008 and the Peugeot 5008 yields braking times of 2.1 and 3.1 seconds respectively.
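Working backwards from those measured times (assuming roughly constant deceleration, so a = v/t), each car implies a different effective friction coefficient, which is exactly the discrepancy with the fixed mu = 0.7 above. A rough sketch using the times quoted from the road tests:

```python
G = 9.8
V60 = 60 * 0.44704  # 26.8224 m/s

# 60-0 mph braking times quoted above
times = {
    "Ferrari F12": 2.2,
    "Land Rover Defender": 3.5,
    "Peugeot 3008": 2.1,
    "Peugeot 5008": 3.1,
}

for car, t in times.items():
    a = V60 / t  # average deceleration in m/s^2, assuming it's constant
    print(f"{car}: a = {a:.1f} m/s^2, effective mu = {a / G:.2f}")
```

The Ferrari's implied effective mu comes out well above 1, while the Defender's is closer to the 0.7 used earlier, so the single-mu model clearly isn't capturing everything.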
So, my questions:
1 - What factors could be causing this difference in braking time?
2 - How might those factors be included in an equation to calculate braking distance/braking time?
3 - Given a vehicle's braking time at a given velocity, is there a reasonable way to calculate its braking distance? Or vice versa?
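On question 3 specifically: if the deceleration is treated as constant (a simplification, since real braking force isn't perfectly constant, but reasonable for a game simulation), distance and time are interchangeable, because the distance is just the average speed (v/2) times the stopping time:

```python
def braking_distance_from_time(v0, t):
    """Distance (m) to brake from v0 (m/s) to rest in t seconds,
    assuming constant deceleration: d = (v0 / 2) * t."""
    return v0 * t / 2

def braking_time_from_distance(v0, d):
    """Inverse: stopping time (s) from v0 (m/s) over distance d (m)."""
    return 2 * d / v0

v60 = 26.8224  # 60 mph in m/s
print(braking_distance_from_time(v60, 2.2))  # -> ~29.5 m for the 2.2 s figure
```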