Originally Posted By: Reddy45
So if a vehicle has a 150a alternator, how much of that capacity is used by the vehicle when it is running?
I assume most vehicles draw all 12V power off the battery for voltage regulation, but it doesn't make sense that the battery would be drawing a constant 150 amps during operation?
Whatever is needed. If there is a 10A load, it delivers 10A. If a 150A load, then 150A ... ignoring, for the moment, whatever part of the load can be delivered by the storage battery, or in newer vehicles, supercapacitors, etc. Generally not much, or not for long, so safely ignored.
As the load (and therefore output) increases, the load on the engine (and therefore the HP required to drive the alternator) increases.
However, there is no correlation (within reason) between the maximum capacity of the alternator and the power it takes to drive it, and therefore no effect on the MPG your engine is achieving. It uses what it needs, no more. So installing a 300A alternator won't consume more engine power than the 150A unit it replaced, as long as the demand stays at 150A or lower.
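To put rough numbers on "it uses what it needs": engine drag is just the electrical load divided by alternator efficiency. This is a minimal sketch, and the specific figures (14 V system voltage, ~55% alternator efficiency) are illustrative assumptions, not from the post:

```python
# Rough sketch of alternator drag vs. electrical demand.
# Assumed numbers (illustrative, not from the post):
#   14 V system voltage, ~55% alternator efficiency under load.
SYSTEM_VOLTS = 14.0
ALT_EFFICIENCY = 0.55
WATTS_PER_HP = 745.7

def alternator_drag_hp(load_amps: float) -> float:
    """Engine horsepower needed to supply a given electrical load."""
    electrical_watts = SYSTEM_VOLTS * load_amps
    mechanical_watts = electrical_watts / ALT_EFFICIENCY
    return mechanical_watts / WATTS_PER_HP

for amps in (10, 50, 150):
    print(f"{amps:>3} A load -> {alternator_drag_hp(amps):.1f} hp")
```

Under these assumptions a 10A load costs about 0.3 hp and a full 150A about 5 hp, and a 300A alternator loafing at 150A output would cost the same 5 hp as a maxed-out 150A unit.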
That is why electric oil pumps, water (coolant) pumps, and power steering pumps are showing up in newer vehicles. Because they run at variable load, they are more efficient than the constant, mechanically driven units (belt or chain driven) they replace.
In particular, the electric motors use much less engine power, partly because the alternator feeding them is load dependent (via the voltage regulator) rather than engine RPM driven. The parasitic loss of a mechanical water pump or engine fan climbs with RPM and can reach many HP (60, 80 in a modern V8), versus a much lower load through the alternator, which can drop to near zero when the electrical demand is light.
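The RPM effect above is steep because fan and pump power follow the affinity laws, scaling roughly with the cube of shaft speed. A small sketch, with an assumed baseline of 5 hp at 2000 RPM (illustrative, not a figure from the post):

```python
# Sketch: why a crank-driven fan/pump's parasitic loss soars with RPM.
# Fan affinity laws: power scales with roughly the cube of shaft speed.
# Assumed baseline (illustrative, not from the post): 5 hp at 2000 RPM.
BASE_HP = 5.0
BASE_RPM = 2000.0

def mechanical_fan_hp(rpm: float) -> float:
    """Power consumed by a mechanically driven fan at a given engine speed."""
    return BASE_HP * (rpm / BASE_RPM) ** 3

for rpm in (1000, 2000, 5000):
    print(f"{rpm:>4} RPM -> {mechanical_fan_hp(rpm):.1f} hp")
```

With that baseline, the drag is under 1 hp at idle speeds but balloons to nearly 80 hp at 5000 RPM, which is why an electric unit that only draws what the cooling demand requires comes out so far ahead.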