1. Jun 27, 2016

### Brian James

I'm trying to determine the load (power loss) a theoretical engine would suffer from an AC alternator, such as one you would find on an automobile engine. They produce something like 40 amps at 110 volts, I believe? To be specific, I'm curious how much power a theoretical engine would lose to an alternator running at maximum capacity for only 0.005 to 0.010 of a second. I understand a lot more goes into it, but if anyone has a general idea of what I could expect, that would be appreciated.

2. Jun 27, 2016

### Staff: Mentor

Power is already a time rate, so the duration doesn't matter. You have specified volts and amps: do you know how to calculate power from those?
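A minimal sketch of the calculation being hinted at, power as the product of voltage and current, using the typical automotive figures quoted elsewhere in the thread (14 V, 100 A) rather than the 110 V the question guesses:

```python
# Electrical power (watts) = voltage (volts) * current (amps).
volts = 14.0   # typical regulated automotive alternator voltage (assumption)
amps = 100.0   # assumed full-load current
power_w = volts * amps
print(power_w)  # 1400.0
```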

3. Jun 28, 2016

### mrspeedybob

Most automotive alternators are regulated to output 14 ± 0.5 volts, with a maximum current output in the 100 to 140 ampere range.

4. Jun 29, 2016

### CWatters

If we assume the alternator generates 14 V at 100 A, then the power delivered by the alternator is 1400 W. An alternator isn't 100% efficient, so the engine will need to produce more power than that. This paper suggests alternator efficiency ranges from 55% to 80%, and that the belt driving it is around 97% efficient.

http://www.delcoremy.com/documents/high-efficiency-white-paper.aspx

So if we assume 55% alternator efficiency, the engine would have to deliver 1400 * 100/55 * 100/97 = about 2600 W.
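The efficiency chain above can be sketched as a short calculation, dividing the electrical output by each efficiency in turn (figures are the worst-case numbers quoted from the white paper):

```python
# Engine power needed to deliver 1400 W of electrical output through
# an ~55%-efficient alternator and an ~97%-efficient belt drive.
electrical_w = 14.0 * 100.0   # 1400 W at the alternator terminals
alternator_eff = 0.55         # low end of the quoted 55-80% range
belt_eff = 0.97               # quoted belt efficiency
engine_w = electrical_w / alternator_eff / belt_eff
print(round(engine_w))  # 2624, i.e. "about 2600 W"
```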

That's the power the engine would have to generate if the alternator were running continuously at 1400 W. If it's only needed for 0.01 seconds, that equates to 2600 * 0.01 = 26 joules of energy. For reference, a AA battery might contain about 5400 joules.
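The power-to-energy step works out like this, using the engine power estimated above and the 0.01 s duration from the question:

```python
# Energy (joules) = power (watts) * time (seconds).
engine_w = 2600.0     # engine power from the efficiency estimate above
duration_s = 0.01     # duration from the original question
energy_j = engine_w * duration_s
print(energy_j)  # 26.0

# Compare against the ~5400 J quoted for a AA battery: a tiny fraction.
aa_battery_j = 5400.0
print(energy_j / aa_battery_j)  # roughly 0.005, i.e. ~0.5% of one cell
```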

However, what does the alternator do the rest of the time? Typically it's still turning, so the engine will still experience losses, and these could be considerable, likely more important than what happens during the 0.01 seconds.