pellman

This is my stab at it. Pick a sample period of time. Calculate the total number of passenger-hours flown, that is, the total time spent in the air summed over all passengers. (100 people on a 1-hour flight contribute 100 hours to the total.) Actually, since accidents occur on the runway too, it should be time spent aboard a commercial plane, not just time in the air.

Now do the same calculation for the total time spent in cars. Then divide the number of fatalities during that period by each time total. If S_F is the time-sum for flying, N_F the number of fatalities, and S_D and N_D the corresponding numbers for driving, then

[tex]\frac{\frac{N_F}{S_F}}{\frac{N_D}{S_D}}[/tex]

would be the ratio of your chance of being killed in a plane per unit time to your chance of being killed in a car crash per unit time. It has to be either per-time or per-mile, to reflect that a longer trip carries more risk, but perhaps per-mile is better. That way you can compare like with like: which is safer, driving from NY to LA or flying from NY to LA?
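The arithmetic above is easy to sketch in code. Note that every number below is a made-up placeholder, not a real statistic; only the structure of the calculation is the point.

```python
def fatality_rate(fatalities, exposure):
    """Fatalities per unit of exposure (passenger-hours, or passenger-miles)."""
    return fatalities / exposure

# Hypothetical sample-period totals -- placeholders, NOT real data.
S_F = 1.0e9    # total passenger-hours aboard commercial planes
N_F = 500      # fatalities on commercial planes during the period
S_D = 5.0e11   # total person-hours spent in cars
N_D = 40000    # car-crash fatalities during the period

# The ratio (N_F/S_F) / (N_D/S_D) from the formula above.
ratio = fatality_rate(N_F, S_F) / fatality_rate(N_D, S_D)
print(f"per-hour risk ratio, flying vs. driving: {ratio:.2f}")
```

Swapping passenger-miles in for passenger-hours as the exposure variable gives the per-mile version of the same comparison.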

So (1) am I right? and (2) is there a better way to do this?