There is a fair amount of empirical evidence in my world (drag racing) that people see their air-fuel ratio go lean as they go down the track. All kinds of theories abound, but none of them make sense, at least not to me. Most of the theories have to do with the amount of load on the engine.

So, here is my thought exercise. Assume we have an engine hooked up to a dyno, and that the dyno has fully variable load capability. We are going to run several sweep tests from 1000 to 6000 RPM. We start by putting some load (Z) on the engine, go to full throttle, and let the dyno sweep the engine from 1000 to 6000 RPM. Then we repeat the test with double the load (2Z), so the sweep takes twice as long. Then we triple the load (3Z), so the sweep takes three times as long.

Is the engine actually working harder during the 2Z and 3Z runs compared to the Z run? If so, how or where could this increased load be measured within the engine? I ask "within the engine" because my assumption is that the manifold absolute pressure (MAP) will be close to atmospheric pressure during each test, and that is the best measure of 'load' that I know of. Thoughts? Is there increased cylinder pressure? Is it something to do with the rate of acceleration (e.g., 3rd gear in a vehicle takes longer to get from 1000 to 6000 RPM than 1st gear)?

One of my assumptions is that at any given point in the sweep the engine consumes the same amount of air per unit time regardless of load; it just consumes more total air over the run when the run takes longer.
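To make that assumption concrete, here is a rough back-of-the-envelope sketch. All the numbers (displacement, volumetric efficiency, air density, sweep times) are made up for illustration; the point is only to compare air consumed per unit time versus total air consumed when the sweep takes longer.

```python
# Sketch of my assumption: at WOT with MAP ~ atmospheric, airflow at a given
# RPM is roughly the same regardless of dyno load; only the sweep duration
# (and therefore the total air consumed) changes between Z, 2Z, and 3Z runs.
# All values below are hypothetical, not measurements.

AIR_DENSITY = 1.2      # kg/m^3, roughly sea level
DISPLACEMENT = 0.006   # m^3 (a hypothetical 6.0 L engine)
VOL_EFF = 0.90         # assumed constant volumetric efficiency at WOT

def airflow_kg_per_s(rpm):
    """Air mass flow for a 4-stroke at WOT (one intake event per 2 revs)."""
    return AIR_DENSITY * DISPLACEMENT * VOL_EFF * (rpm / 60.0) / 2.0

def total_air_kg(sweep_seconds, steps=1000):
    """Integrate airflow over a linear 1000-6000 RPM sweep of given length."""
    dt = sweep_seconds / steps
    total = 0.0
    for i in range(steps):
        rpm = 1000 + (6000 - 1000) * (i / steps)
        total += airflow_kg_per_s(rpm) * dt
    return total

# The instantaneous airflow at, say, 4000 RPM is identical for every load:
print(f"airflow at 4000 RPM: {airflow_kg_per_s(4000):.3f} kg/s")

# But if the Z, 2Z, 3Z runs take 10 s, 20 s, and 30 s, the totals scale
# 1x, 2x, 3x simply because the engine spends longer at every RPM:
for label, seconds in [("Z", 10), ("2Z", 20), ("3Z", 30)]:
    print(f"{label}: {total_air_kg(seconds):.2f} kg of air over {seconds} s")
```

If that assumption holds, nothing in the per-unit-time numbers distinguishes the Z, 2Z, and 3Z runs; only the totals differ. If the assumption is wrong, that is exactly where I would expect the extra "load" to show up.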