Exploring Increased Engine Load and Air-Fuel Ratio Changes on the Race Track

In summary, there is a fair amount of empirical evidence in my world that people experience their Air-Fuel ratio going lean as they go down the race track (drag race). All kinds of theories abound, but none of them make any sense (at least not to me). Most of the theories have to do with the amount of load on the engine. Some people think that the engine works harder because of the increased cylinder pressure. Others think that the engine works harder because the fuel is being used more quickly. My assumption is that the engine consumes the same amount of air per unit time, and simply consumes more total air over the run when the run takes longer. So, in theory, the engine could work harder under some circumstances.
  • #1
TexanJohn
There is a fair amount of empirical evidence in my world that people experience their Air-Fuel ratio going lean as they go down the race track (drag race). All kinds of theories abound, but none of them make any sense (at least not to me). Most of the theories have to do with the amount of load on the engine.

So, here is my thought exercise: Assume we have an engine hooked up to a dyno, and that the dyno has fully variable load capability. We are going to run several sweep tests from 1000 to 6000 RPM. Thus, we start by putting some load (Z) on the engine, go full throttle, and then let the dyno sweep the engine from 1000 to 6000 RPM. Then we repeat this test, but we double Z (i.e. 2Z), so the test takes twice as long. Then we triple Z (i.e. 3Z) so that the test takes three times as long. Is the engine actually working harder during the 3Z and 2Z runs as opposed to the Z run? If so, how or where could this increased load be measured within the engine? I ask within the engine because my assumption is that the Manifold Absolute Pressure (MAP) will approximate atmospheric pressure under each test, and that is the best measure of ‘load’ that I know.

Thoughts?
Is there increased cylinder pressure?
Is there something to do with the rate of acceleration (e.g. 3rd gear in a vehicle takes longer to get from 1000 to 6000 RPM than 1st gear)?

One of my assumptions is that, at any given point in time, the engine consumes the same amount of air per unit time; it just consumes more total air over the run when the run takes longer.
 
  • #2
Yes. If you're putting a higher load on the engine each time, the engine is working harder each run. The easiest way to measure would be cylinder pressure, although exhaust temperatures are a good tell-tale sign too.

Imagine just revving your engine with no load. 6000 rpm is quickly attainable because there's no load. The engine isn't working very hard at all, - if you take it to 6000 rpm and leave it there (please don't!), it's only burning fuel to overcome all the normal losses (friction, pumping, ancillaries etc). Cylinder pressure will be just sufficient to keep the engine turning. If you're on the motorway at a steady 6000rpm then you've got all that to overcome, plus a shedload of air, rolling and transmission resistance. In this case your engine will obviously be working much harder, and cylinder pressures (and temperatures, and exhaust temperatures...) will all be higher too.

If you can't get your head round transient running, just think of steady state and break it down.

Your final assumption is wrong for a gasoline engine. Remember, you need to throttle the engine back in order to reduce the air flow and keep the mixture near stoichiometric to prevent misfire.

Enjoying the enginey discussion! I'll have a bit more of a think about this all.
 
  • #3
TexanJohn said:
One of my assumptions is that, at any given point in time, the engine consumes the same amount of air per unit time; it just consumes more total air over the run when the run takes longer.
If this were true, there would be no need for mass airflow sensors on cars. As fuel demand increases, airflow must increase.

I agree that EGT is a good indication of load.

In regards to the going lean issue, my guess would be that as the vehicle goes down the track, its acceleration is decreasing, and thus so is the load. Lower load means less fuel is required, and the fuel control throttles back. There may be a lag between the airflow response and the fuel-rate decrease, which is the lean condition.
 
  • #4
FredGarvin said:
If this were true, there would be no need for mass airflow sensors on cars. As fuel demand increases, airflow must increase.


Or a lambda sensor, - using a feedback loop to adjust the amount of fuel injected to keep the oxygen content in the exhaust to zero (just). Remember that on a gasoline car, the driver input controls the air regulation and not the fuel regulation.
 
  • #5
FredGarvin said:
If this were true, there would be no need for mass airflow sensors on cars. As fuel demand increases, airflow must increase.

Why does fuel demand increase? As I understand it, you measure the air, decide what air-fuel ratio (AFR) you want, then hold the injectors open long enough to achieve that ratio.

I might not have stated this quite clearly. In each run (Z, 2Z, 3Z), there will be a point that I would define by RPM and Manifold Absolute Pressure (MAP). Since the engine is at wide open throttle (WOT), it is a one-row/one-column table (depending on whether MAP is the row or the column value). Something like:

Code:
Volumetric Efficiency Table - VE

         MAP (kPa)
RPM      100
1000     60
1500     70
2000     75
2500     80
3000     85
3500     90
4000     95
4500     97
5000     100
5500     99
6000     97

During each sweep run, the engine will move down this table as RPM increases. Essentially, the VE table facilitates the calculation of fuel by estimating the volume of air entering the engine. This, combined with the ideal gas law, determines how much fuel is needed to obtain a given AFR. *Note - not all cars use Mass Airflow Sensors. :)
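
To make that concrete, here is a minimal speed-density sketch (my own illustration, not code from any real ECU or tuning package; the 6.5 L displacement, intake air temperature, and target AFR are values assumed for the example):

Code:
# Minimal speed-density sketch (illustration only, not real ECU code).
# Assumed: 6.5 L 4-stroke engine, ideal-gas air.

R_AIR = 287.05            # J/(kg*K), specific gas constant for air
DISPLACEMENT_L = 6.5      # assumed engine displacement in litres

def air_mass_per_cycle_g(ve_percent, map_kpa, iat_k):
    """Air mass (grams) ingested per full engine cycle (two revolutions)."""
    volume_m3 = (DISPLACEMENT_L / 1000.0) * (ve_percent / 100.0)
    density_kg_m3 = (map_kpa * 1000.0) / (R_AIR * iat_k)   # ideal gas law
    return density_kg_m3 * volume_m3 * 1000.0

def fuel_mass_per_cycle_g(ve_percent, map_kpa, iat_k, target_afr):
    """Fuel mass (grams) per cycle needed to hit the target AFR."""
    return air_mass_per_cycle_g(ve_percent, map_kpa, iat_k) / target_afr

def air_flow_gps(rpm, ve_percent, map_kpa, iat_k):
    """Air mass flow (grams/second); a 4-stroke completes rpm/2 cycles per minute."""
    return air_mass_per_cycle_g(ve_percent, map_kpa, iat_k) * (rpm / 2.0) / 60.0

# Example: the 3000 RPM / 100 kPa cell (VE = 85), 59 F (~288 K) air, 12:1 AFR target
print(round(air_flow_gps(3000, 85, 100, 288.15), 1), "g/s of air")
print(round(fuel_mass_per_cycle_g(85, 100, 288.15, 12.0), 3), "g of fuel per cycle")

Note that nothing in this calculation depends on the gear or on how quickly the engine sweeps through the cell; only RPM, MAP, intake air temperature, and the VE value enter.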

Now, the length of time that the engine would remain in a given cell would be determined by the rate of acceleration. On the 3Z run, it is in each cell longer than 2Z or Z.

Let's assume that the value of 85 which corresponds to 3000 RPM and 100 kPa represents enough fuel to achieve an AFR of 12:1. I can put the engine on the dyno, and confirm that 85 is the correct value in the table to achieve a 12:1 AFR.

How can the engine ingest any more air than what it ingests at 3000 RPM and 100kPa? How is this dependent upon the load that is placed on the engine?

I would state it as something along the lines of "At 3000 RPM and 100 kPa the engine consumes 300 grams of air per second when the air temp is 59°F and the engine is tested at sea level." So, how/why is more fuel needed on the 3Z run? Because the engine operates in that cell (i.e. operating conditions) longer?
 
  • #6
The amount of air and fuel consumed should be roughly proportional to RPM (and it should be obvious why).

Now, I'm curious about what kind of cars are being drag raced. For naturally aspirated (no turbocharger, supercharger, or turbonator) engines, there is going to be a loss of volumetric efficiency at higher RPM due to air flow limitations.

Similarly, there are limitations on the cam profile and valve travel speed. Fuel injection systems can usually keep up with the engine just fine, but the software and air flow sensors might not be set up with drag racing (or retrofitted air systems) in mind.
 
  • #7
NateTG said:
The amount of air and fuel consumed should be roughly proportional to RPM (and it should be obvious why).

Actually, I need it spelled out in simple terms. I have my own ideas, assumptions, and things I've learned, but sometimes those concepts might be wrong. Are you saying that the Air/Fuel consumption will be the same? As I understand the other posts, they are saying more would be consumed in the 3Z run vs 2Z vs Z.

NateTG said:
Now, I'm curious about what kind of cars are being drag raced. For naturally aspirated (no turbocharger, supercharger, or turbonator) engines, there is going to be a loss of volumetric efficiency at higher RPM due to air flow limitations.

Similarly, there are limitations on the cam profile and valve travel speed. Fuel injection systems can usually keep up with the engine just fine, but the software and air flow sensors might not be set up with drag racing (or retrofitted air systems) in mind.

I would prefer to not use specific vehicles and simply talk about the events that are occurring. If we need specific vehicles, I will offer up these two (as I work with late model GM vehicles):

2001 Corvette with a highly modified 402” (~6.5L) engine (NA) with a measured 620 HP and 585 ft-lbs on an engine dyno. The car runs the ¼ mile in ~10.8 seconds at ~129 MPH. The car weighs ~3450 lbs with driver.

2005 Avalanche with a basically stock 325” (5.3L) engine. This truck runs the ¼ mile in approximately 17.5 seconds at 88 MPH. HP ~295, TQ ~330, weight w/driver ~5000 lbs.

In either case, both engines will operate at 5000 RPM and 100 kPa while going down the track. They will do so in 1st, 2nd, and 3rd gear. The throttle is all the way open each time this operating condition is met. Is the engine actually working ‘harder’ in 3rd gear? Is more air being consumed? How? Is the engine making any more power or torque in 3rd?

If we say that the engine is working ‘harder’ in 3rd gear (or 3Z in the above examples), ‘what’ is actually occurring, or how could I determine this? My experience tells me that I will get the same air consumption reading from my scanner at 5000 RPM, 100 kPa (MAP), with the throttle 100% open, regardless of gear. The engine doesn't know (or care) what gear I am in. Does it? :)
 
  • #8
TexanJohn said:
Actually, I need it spelled out in simple terms. I have my own ideas, assumptions, and things I've learned, but sometimes those concepts might be wrong. Are you saying that the Air/Fuel consumption will be the same? As I understand the other posts, they are saying more would be consumed in the 3Z run vs 2Z vs Z.

I'm not heavily into automotive stuff, so I really don't have any practical experience with what exactly people mean when they're talking about stuff.

Clearly the runs for 2Z and 3Z are going to take longer than the run for Z, and so will (in total) consume more fuel.

Regardless, conventional auto engines are essentially pumps: on every intake stroke, the engine pulls in a fixed volume of air, and on every exhaust stroke, the same amount of exhaust goes out the exhaust. Roughly speaking, the engine is going to pump half its displacement every revolution. (A two-stroke engine will do its entire displacement.)

2001 Corvette with a highly modified 402” (~6.5L) engine (NA) with a measured 620 HP and 585 ft-lbs on an engine dyno. The car runs the ¼ mile in ~10.8 seconds at ~129 MPH. The car weighs ~3450 lbs with driver.

So, for example, this Corvette of yours, running at roughly 5500 RPM during the test, will be pumping through roughly 18,000 liters (6.5 liters * 5500 RPM / 2) of air every minute while it's making 620 HP and 585 ft-lbs. (Since your manifold pressure is 100 kPa, I'm assuming that you don't have forced induction.)
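
That estimate as a quick sketch (the 6.5 L displacement comes from the Corvette example above; the ~1.2 g/L standard-day air density and the ~100% VE are assumptions):

Code:
# Rough "engine as a pump" check (illustration only).
DISPLACEMENT_L = 6.5        # assumed from the Corvette example
RPM = 5500
AIR_DENSITY_G_PER_L = 1.2   # approximate, near sea level and ~20 C

volume_flow_lpm = DISPLACEMENT_L * RPM / 2.0                   # 4-stroke: half the displacement per rev
mass_flow_gps = volume_flow_lpm / 60.0 * AIR_DENSITY_G_PER_L   # assuming ~100% VE, no boost

print(f"{volume_flow_lpm:,.0f} L/min of air, roughly {mass_flow_gps:.0f} g/s")
# -> 17,875 L/min, i.e. about 18,000 L/min, on the order of 360 g/s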

I would prefer to not use specific vehicles and simply talk about the events that are occurring.
Certainly, but it wasn't clear to me whether we were talking about street cars, dragsters, or motorcycles.

Is the engine actually working ‘harder’ in 3rd gear? Is more air being consumed? How? Is the engine making any more power or torque in 3rd?

If we say that the engine is working ‘harder’ in 3rd gear (or 3Z in the above examples), ‘what’ is actually occurring, or how could I determine this?

The engine could produce more torque or power if the pressure inside the cylinder is higher. This would show up in the exhaust temperature or pressure. In order for this to take place, the engine would have to be running richer at the higher load.

N.B.:
Controlling the combustion temperature is, of course, much easier with direct injection / compression ignition engines. It's relatively easy to set up a diesel motor so that it maintains roughly constant RPM (and air consumption) by controlling combustion heat through the amount of fuel that is injected.

There are other ways that cylinder pressure could be increased, like charge cooling or water/nitrous injection, but those are almost certainly not a consideration here.
 
  • #9
NateTG said:
I'm not heavily into automotive stuff, so I really don't have any practical experience with what exactly people mean when they're talking about stuff.

That's ok, I'm not really into all the engineering side of it. Not yet anyway. :)




NateTG said:
The engine could produce more torque or power if the pressure inside the cylinder is higher. This would show up in the exhaust temperature or pressure. In order for this to take place, the engine would have to be running richer at the higher load.

This is sort of my question. Running richer than stoich can't produce any more power. In fact, the only reasons I can think of to run rich are: 1) thermal management, which is extremely important (most cooling systems in cars could not 'keep up' with the increased temperatures of burning a stoich mixture versus a rich one), and 2) ensuring that there is sufficient fuel (i.e. if we were trying to run right on the edge at a stoich mixture, some inefficiency, such as more air entering the engine than we measured, could cause us to not maximize power).

Stoich for gasoline ~14.7:1, but most applications I am familiar with (like the 2001 vette) run an AFR of 12.5-13:1 under WOT. If you go much over 13:1 for very long, the engine will experience something fatal. The excess heat will lead to detonation and something will let go. Ask me how I know. :grumpy: :bugeye: :yuck:

Let's look at one single combustion event. If I have 1.3 grams of air and I want a 13:1 AFR, then I need 0.1 grams of fuel. I assume there is a finite amount of energy that this air/fuel can create/release. All of the fuel will not be consumed. I also assume that nothing magical is going to happen to increase the engine's ability to convert this energy into mechanical work.
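
As a back-of-the-envelope check on that single event (the ~44 MJ/kg lower heating value for gasoline is an assumed round figure):

Code:
# One combustion event (illustration; gasoline lower heating value assumed ~44 MJ/kg).
air_g = 1.3
afr = 13.0
LHV_J_PER_G = 44_000             # approximate lower heating value of gasoline

fuel_g = air_g / afr             # -> 0.1 g of fuel
energy_j = fuel_g * LHV_J_PER_G  # ~4,400 J available if all the fuel burned
print(f"{fuel_g:.2f} g of fuel, roughly {energy_j:.0f} J of chemical energy in the charge")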

So, what is occurring in 3Z or in third gear that isn't occurring in Z or 1st gear? If we say the engine is working 'harder', 'how' is it doing it? What is physically occurring? Is it pushing the piston harder? Or is the piston traveling slower because of the increased load, so that heat increases because of the prolonged time for the event to occur?

I will try to get some empirical data from my Avalanche and post it.


NateTG said:
N.B.:
Controlling the combustion temperature is, of course, much easier with direct injection / compression ignition engines. It's relatively easy to set up a diesel motor so that it maintains roughly constant RPM (and air consumption) by controlling combustion heat through the amount of fuel that is injected.

There are other ways that cylinder pressure could be increased, like charge cooling or water/nitrous injection, but those are almost certainly not a consideration here.

What is N.B.? Diesels are interesting. I am not sure that water injection increases cylinder pressure. Nitrous certainly can. :)
 
  • #10
What is N.B.? Diesels are interesting. I am not sure that water injection increases cylinder pressure. Nitrous certainly can.

N.B. = Nota bene.

http://en.wikipedia.org/wiki/Water_injection_(engines)

What makes you (or your friends) say that the engine is running lean? Also, is it fuel injected or carbureted?
 
  • #11
NateTG said:
N.B. = Nota bene.

http://en.wikipedia.org/wiki/Water_injection_(engines)

What makes you (or your friends) say that the engine is running lean? Also, is it fuel injected or carbureted?

I wasn't thinking about the steam aspect. I haven't set up any water/meth injection kits, but I hope to in the near future. Although, I will say that I have never seen any of the advertisements make the claim from Wikipedia:

wikipedia said:
But the greater effect comes later during combustion when the water takes in significant amounts of heat energy as it converts from liquid to gas (steam), increasing piston pressure (torque) and reducing the peak temperature with its resultant NOx formation as well as the amount of energy absorbed into the cylinder walls.

Typically you will only see something like this:

ad said:
Power is increased through:

a. Intake charge cooling - Water/methanol will lower air charge temps over 200°f in this application.
b. Methanol - this acts as a fuel as well as cooling the intake charge.

This subject has come up several times on various mail lists or forums I frequent. I haven't really gotten any good data from anyone to show. There are people that basically swear that it is a fact.

I have a car on an eddy-current dyno on which the acceleration rate is held constant. Thus, it is somewhat like going down the race track in the sense that a pull in 3rd takes longer than a pull in 2nd, etc. Doing back-to-back-to-back pulls, I saw no difference in AFR. Yet, I kept hearing/reading all these complaints about getting lean going down the track. However, the dyno is not exactly like the track: the rolling resistance is not the same, nor is the wind resistance, and while I was able to keep the intake air temp constant on the dyno, going down the track it might change either up or down depending on several factors, etc.

So, I have been trying to conceptualize what is really occurring. Is the motor working harder in 3rd gear? How would I know, or what would I measure? Is it a problem with the tune? How can I avoid it? etc.

Just trying to learn more about the physics and engineering. :)

All the cars I work on are fuel injected.
 
  • #12
TexanJohn said:
I have a car on an eddy-current dyno on which the acceleration rate is held constant.

Thus, it is somewhat like going down the race track in the sense that a pull in 3rd takes longer than a pull in 2nd, etc. Doing back-to-back-to-back pulls, I saw no difference in AFR. Yet, I kept hearing/reading all these complaints about getting lean going down the track. However, the dyno is not exactly like the track: the rolling resistance is not the same, nor is the wind resistance, and while I was able to keep the intake air temp constant on the dyno, going down the track it might change either up or down depending on several factors, etc.

You should be seeing acceleration drop as the car speeds up unless the car is tire-limited, so I'm not sure that constant acceleration is a good model for a drag race.
How long pulls (I'm assuming a pull is what happens between shifting into and out of a gear) take is going to depend on the gear ratios.

So, I have been trying to conceptualize what is really occurring. Is the motor working harder in 3rd gear? How would I know, or what would I measure? Is it a problem with the tune? How can I avoid it? etc.

Well, you're already measuring air intake (pressure, temperature, and volume); you can get useful information from measuring exhaust pressure and temperature as well. I doubt that there's a whole lot of difference between engine performance at 0 and 100 mph.

All the cars I work on are fuel injected.
Then (assuming suitable programming) they should never run too lean.
 
  • #13
TexanJohn said:
Why does fuel demand increase? As I understand it, you measure the air, decide what air-fuel ratio (AFR) you want, then hold the injectors open long enough to achieve that ratio.
I am going on the assumption that the engine will keep producing enough torque to overcome the applied load. The only way to do that is to add more fuel. Granted, I am used to governing on the fuel side and not the air side, like a car engine does. Nate brought up a good point about the engine essentially being a pump. I have to remember to think of recips like that.
 
  • #14
FredGarvin said:
I am going on the assumption that the engine will keep producing enough torque to overcome the applied load. The only way to do that is to add more fuel.

As specified in this discussion we're talking about an engine being run at the same RPM and manifold pressure but with different loads, and you're suggesting that an engine under a higher load is going to produce more power. This seems very unlikely to me.

Since engines have internal inertia, an engine that is being spun up under load will produce more usable power.

Granted, I am used to governing on the fuel side and not the air side, like a car engine does.

Spark ignition engines are practically limited to a relatively small range of fuel air mixtures since running too lean causes pre-ignition (knocking) and running too rich is wasteful. Compression ignition engines don't have pre-ignition problems because the fuel is not added until the desired time of ignition, so compression ignition engines can be run very lean. That means that, with appropriate engine management, compression ignition engines can be very fuel efficient.

Nate brought up a good point about the engine essentially being a pump. I have to remember to think of recips like that.

However, once again, looking at the IC engine as a pump, it should be clear that for a given RPM the engine is going to be displacing a fixed amount of fuel-air mixture. Practical limits on spark ignition engines mean a roughly constant energy per charge (or per displacement). As a consequence, given the constraints of this context, in order to get more performance you've got to improve thermodynamic efficiency.

There are ways to improve thermodynamic efficiency, but they're essentially ruled out by the fact that we're looking at running the same engine at (effectively) the same temperature with the same exhaust system in both situations.

Engine modifications made to increase power generally function by increasing displacement, improving volumetric efficiency, or reducing internal losses.
 
  • #15
I'm loving this thread!

Got to give it some proper thought, but before we get carried away I just want to pick up a few important points. Please don't feel I'm nitpicking but I think these are relevant to some basic understanding.

NateTG said:
Spark ignition engines are practically limited to a relatively small range of fuel air mixtures since running too lean causes pre-ignition (knocking) and running too rich is wasteful. Compression ignition engines don't have pre-ignition problems because the fuel is not added until the desired time of ignition, so compression ignition engines can be run very lean. That means that, with appropriate engine management, compression ignition engines can be very fuel efficient.

One thing to bear in mind here is that on a macro scale, CI engines run lean (i.e. the overall air/fuel ratio is lean). On a combustion scale, though, the burning is rich: the centre of the fuel droplet is essentially all fuel and no air, and away from the droplet is essentially all air and no fuel. The 'mixture' at the site of combustion is actually close to stoichiometric, but there are large pockets of air with no fuel present, which give the diesel engine its lean-burn characteristic. Contrast this with an SI engine, where the fuel is homogeneously distributed throughout the whole charge (except in a GDI, but we all know how that went!).

However, once again, looking at the IC engine as a pump, it should be clear that for a given RPM the engine is going to be displacing a fixed amount of fuel-air mixture.

'Amount' in terms of mass flow, not volume flow!
 
  • #16
brewnog said:
'Amount' in terms of mass flow, not volume flow!

Volume flow. We're talking about cc's, not grams. That's why people care about volumetric efficiency (energy per volume) often at the expense of thermodynamic efficiency (energy per fuel mass).

An intercooler, for example, is a way to dump some of the energy used in compressing the air (a loss in thermodynamic efficiency) so that the air becomes colder and denser (a gain in volumetric efficiency).
 
  • #17
NateTG said:
Volume flow. We're talking about cc's, not grams. That's why people care about volumetric efficiency (energy per volume) often at the expense of thermodynamic efficiency (energy per fuel mass).

Exactly. An engine running at 3000 rpm with no load will have less mass air flow than at 3000 rpm at 50kW.

Volumetric efficiency isn't a measure of energy per volume; it simply expresses how effective the breathing process is.
 
  • #18
brewnog said:
Exactly. An engine running at 3000 rpm with no load will have less mass air flow than at 3000 rpm at 50kW.

Huh? Unless you're throttling fuel or air flow, an engine is going to be producing the same power at the same RPM. Without a load, the engine is going to be putting that power into spinning itself, or tearing itself apart, rather than moving the load.

Volumetric efficiency isn't a measure of energy per volume; it simply expresses how effective the breathing process is.

Well, the amount of energy per displacement is proportional to the air mass per displacement so the two are closely related.
 
  • #19
NateTG said:
Huh? Unless you're throttling fuel or air flow, an engine is going to be producing the same power at the same RPM. Without a load, the engine is going to be putting that power into spinning itself, or tearing itself apart, rather than moving the load.
Yep, and spark ignition engines are throttled by air flow; if it can develop 50kW at 3000rpm at WOT then to develop 10kW at 3000rpm it needs to be throttled. You obviously know this, it just looked like there was some confusion going on.

Well, the amount of energy per displacement is proportional to the air mass per displacement so the two are closely related.

Ok yes, I see where you're coming from.
 
  • #20
brewnog said:
Yep, and spark ignition engines are throttled by air flow; if it can develop 50kW at 3000rpm at WOT then to develop 10kW at 3000rpm it needs to be throttled. You obviously know this, it just looked like there was some confusion going on.

Well, yes, but TexanJohn was talking about drag racing, where (unless there's a limitation caused by the tires on the road, or some other external factor) there should be no throttling (i.e. the throttle should be wide open).
 
  • #21
Sorry to bring up an old thread, but there was at least one unanswered question that needs answering.

I am a dyno operator/engine tuner and avid motorsport enthusiast. There was a proposed question of why a car runs leaner at the end of a race track. The simple answer is: as an engine operates through a "cell" (a load and RPM point) in a fuel map, it spends more time between cells when more resistance is applied. Meaning, in first gear I can accelerate from 1000 to 6000 RPM much faster, and therefore interpolation between cells is done quickly. When I run the same car/engine from 1000 to 6000 RPM in 3rd gear, I am only completely in the tuned cell (at the specific load and RPM point) for a brief time. The rest of the time is spent interpolating between the cells.

There is a general linearity between cells, but when we have two cells, one at 3000 and one at 4000 RPM, interpolation takes 50% from each cell when the engine operates from 3001-3999 RPM. If we had a volumetric efficiency value of 40 at 3000 RPM and a value of 60 at 4000 RPM, then the VE from 3001-3999 would be 50% of 40 plus 50% of 60, which comes out to 50% VE. So is the engine's VE at 3113 RPM actually 50%?
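
For reference, here is a sketch of conventional distance-weighted linear interpolation between two cells (this is how typical ECU table lookups work, though a specific controller may differ); under that scheme the blend is 50/50 only exactly halfway between the breakpoints, so 3113 RPM lands much closer to the 3000 RPM value:

Code:
# Distance-weighted linear interpolation between two VE cells (illustration only).
def interp_ve(rpm, rpm_lo, ve_lo, rpm_hi, ve_hi):
    frac = (rpm - rpm_lo) / (rpm_hi - rpm_lo)   # 0.0 at the low cell, 1.0 at the high cell
    return ve_lo + frac * (ve_hi - ve_lo)

print(round(interp_ve(3113, 3000, 40, 4000, 60), 2))  # 42.26 -- weighted toward the 3000 RPM cell
print(round(interp_ve(3500, 3000, 40, 4000, 60), 2))  # 50.0  -- exactly 50/50 only at the midpoint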

ALSO,
As an engine operates from low to high RPM, the measured manifold pressure will not remain constant. As an engine tuner, I find that the available pressure remains, but there is a slight depression inside the intake manifold. This is exaggerated when exhaust pressure exceeds ambient conditions (back pressure). There is a great way to measure engine load using MAP/BARO (barometric pressure). If BARO is measured at 100 kPa and manifold pressure reads 100 kPa, then load would be equal to 100/100, or 100%; if BARO is 100 kPa and MAP reads 90, the load percentage would be 90%. We can also toss in a 4th axis and compensate for EMAP (exhaust manifold absolute pressure).
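
The MAP/BARO load calculation described above as a tiny sketch (the sensor readings here are hypothetical):

Code:
# Engine load as the ratio of manifold pressure to barometric pressure (illustration).
def load_percent(map_kpa, baro_kpa):
    return 100.0 * map_kpa / baro_kpa

print(load_percent(100.0, 100.0))  # 100.0 %
print(load_percent(90.0, 100.0))   # 90.0 % -- the high-RPM "slight depression" shows up here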

So what I'm getting at is that the original poster assumed that his engine would actually maintain a 1:1 pressure ratio from inside to outside of the intake manifold while maintaining wide open throttle. As I state, this is not normal at high RPM operation.
 

1. How does increasing engine load affect performance on the race track?

Increasing engine load can result in a higher power output, allowing the car to accelerate faster and reach higher speeds on the race track. However, this also puts more strain on the engine and can lead to increased fuel consumption.

2. What are the potential benefits of adjusting the air-fuel ratio on the race track?

Adjusting the air-fuel ratio can improve engine efficiency and power output, leading to better performance on the race track. It can also help prevent engine knock and reduce emissions.

3. How do changes in air-fuel ratio affect engine temperature on the race track?

Changes in air-fuel ratio can impact engine temperature by altering the combustion process and the amount of heat generated. A leaner air-fuel ratio can lead to a hotter engine, while a richer ratio can help cool the engine.

4. Is it safe to make significant changes to engine load and air-fuel ratio during a race?

Making significant changes to engine load and air-fuel ratio during a race can be risky, as it can put extra strain on the engine and potentially lead to mechanical failures. It is important to carefully monitor and adjust these factors to ensure the safety and performance of the car.

5. How do environmental factors play a role in engine load and air-fuel ratio adjustments on the race track?

Environmental factors such as temperature, altitude, and humidity can affect engine load and air-fuel ratio, as they can impact the density of the air and the amount of oxygen available for combustion. These factors should be taken into consideration when making adjustments on the race track.
