Why are most electrical sources voltage sources?

AI Thread Summary
Most electrical sources are modeled as voltage sources because they supply a nearly constant voltage regardless of the current drawn, which is essential for devices that require stable voltage levels, like those in cars and homes. Constant voltage source (CVS) operation minimizes losses compared to constant current source (CCS) operation: with CVS, the current scales with the load, so conduction losses fall at light load, while with CCS full current flows at all times and the voltage varies with the load. CVS also allows flexibility in connections, since devices can be disconnected without affecting the voltage output, whereas a CCS requires a load to function safely. Generators and batteries typically operate more efficiently in CVS mode, while specialized applications such as nuclear batteries may use CCS. Overall, CVS is favored in electrical systems for its efficiency and stability.
sighman
On Wikipedia:

http://en.wikipedia.org/wiki/Current_source

Under the heading, "Current and voltage source comparison"

It says that most sources of electrical energy are best modeled as voltage sources, that is, as supplying a constant voltage. But why do most sources provide constant voltage rather than constant current (i.e., act as current sources)?
 
An ideal voltage source produces a constant voltage regardless of the current drawn from it.

If all the devices in a car need 12 volts, for example, then they need 12 volts regardless of what other devices are drawing current.

This is the same for devices in the house. They need the mains voltage to stay the same. Otherwise, the lights would dim every time the refrigerator turned on.
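To put a number on that dimming: a real source has some internal resistance, and its terminal voltage sags by I*R_internal when the load current jumps. A minimal Python sketch, with made-up resistance and current values:

```python
# Sketch of why a "stiff" (low internal resistance) voltage source matters.
# A real source sags by I * R_internal as load current rises.
# All values below are illustrative assumptions.
V_nominal = 120.0
R_internal_stiff = 0.05   # a well-designed mains feed
R_internal_weak = 1.0     # a poor feed

def terminal_voltage(r_internal, i_load):
    """Voltage actually seen at the outlet under load current i_load."""
    return V_nominal - i_load * r_internal

# Lights alone draw 2 A; the refrigerator start-up surge adds 15 A.
print(terminal_voltage(R_internal_stiff, 17.0))  # 119.15 V -- barely dims
print(terminal_voltage(R_internal_weak, 17.0))   # 103.0 V -- visible dimming
```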

Constant current sources have their uses, but these are pretty specialized.

If you charge a capacitor with a constant current, the voltage across the capacitor increases linearly instead of exponentially.
If you send a constant current through a resistor, then the voltage across the resistor depends on the size of the resistor.
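A quick Python sketch of those two charging behaviors (the component values are just illustrative):

```python
# Capacitor charged by a constant current source (linear ramp) versus
# through a resistor from a constant voltage source (exponential approach).
# All component values are illustrative assumptions.
import math

C = 1e-6      # 1 uF capacitor
I = 1e-3      # 1 mA constant current
V_s = 10.0    # 10 V source for the RC case
R = 10e3      # 10 kOhm series resistor (time constant R*C = 10 ms)

def v_constant_current(t):
    """V = (I/C) * t -- rises linearly with time."""
    return (I / C) * t

def v_rc_charging(t):
    """V = Vs * (1 - exp(-t/(R*C))) -- rises exponentially toward Vs."""
    return V_s * (1.0 - math.exp(-t / (R * C)))

# After one time constant (10 ms):
t = R * C
print(v_constant_current(t))   # 10.0 V (1 mA * 10 ms / 1 uF)
print(v_rc_charging(t))        # ~6.32 V (63% of the way to 10 V)
```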

Current transformers used to measure current in mains power sources are current sources and must have a load on them or they will produce very high voltages across their secondaries.

A truly constant current source could be a dangerous thing. A 1 amp current source would produce a million volts across a 1 megohm resistor.
Fortunately, most current sources are just a voltage source with something in series to regulate the current, so they can't generate more voltage than the voltage source can supply.
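The arithmetic in that example, plus the clamping behavior of a practical current source, in a short Python sketch (the 12 V supply limit is an assumed value):

```python
# Why an ideal constant current source is dangerous, and why a real one
# is not. Values are illustrative.
I = 1.0        # 1 A ideal current source
R_load = 1e6   # 1 megohm load

v_ideal = I * R_load       # Ohm's law: a full megavolt across the load
print(v_ideal)             # 1000000.0

# A practical current source is a voltage source with series regulation.
# It can never drive the load above its supply ("compliance") voltage.
V_supply = 12.0
v_practical = min(I * R_load, V_supply)
print(v_practical)         # 12.0 -- clamped; the current drops below 1 A
```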
 
In theory, many circuits have what is called a "dual", which would use current sources rather than voltage sources.

In practice, the most obvious reason is that you would need to always have something connected to the source. With voltage sources, you can pull out all the wires and it maintains voltage - with a current source, pulling out all the wires would be analogous to shorting a voltage source.
 
Electrical sources are commonly constant voltage sources (CVS) by design, not by nature. The AC generators that provide power to our homes and businesses could just as easily be configured for constant current source (CCS) operation. CVS simply works better.

With CVS, the insulation losses are computed as V^2*G, where G is the conductance (the parallel leakage path across the conductors). This is very small in comparison to the conduction losses, computed as I^2*R, where R is the conductor resistance.

With CVS, the generator always outputs full voltage, but the current varies with loading. This keeps losses lower than they would be with CCS. With CCS, full current is always present and the voltage varies with loading, so losses are much greater in CCS operation.
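A rough numerical comparison of the two modes, with assumed line and rating values, shows how CVS losses fall off at light load while CCS losses stay at full value:

```python
# Line-loss comparison for CVS vs CCS operation as delivered load power
# varies. All ratings and the line resistance are illustrative assumptions.
V_rated = 240.0    # CVS holds this voltage
I_rated = 100.0    # CCS holds this current
R_line = 0.1       # conductor resistance in ohms

def loss_cvs(p_load):
    """CVS: current drawn scales with load, so I^2*R loss falls at light load."""
    i = p_load / V_rated
    return i * i * R_line

def loss_ccs(p_load):
    """CCS: full rated current flows regardless of load."""
    return I_rated ** 2 * R_line

for p in (24000.0, 12000.0, 2400.0):   # full, half, and 10% load
    print(p, loss_cvs(p), loss_ccs(p))
# At 10% load the CVS line loss is 10 W versus 1000 W for CCS.
```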

In addition, the power system uses transformers to step up to a very high voltage for transmission, then step-down transformers for local distribution.
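Why stepping the voltage up helps: for the same delivered power, 100x the voltage means 1/100th the current and 1/10,000th the I^2*R loss. A sketch with assumed numbers:

```python
# Why power systems transmit at high voltage: same delivered power,
# same line resistance, only the voltage differs. Values are illustrative.
P = 1e6         # 1 MW delivered
R_line = 1.0    # line resistance in ohms

def line_loss(v):
    """I^2 * R conduction loss at transmission voltage v."""
    i = P / v               # current needed to carry P at this voltage
    return i * i * R_line

print(line_loss(2.4e3))     # 2.4 kV: ~173.6 kW lost in the line
print(line_loss(240e3))     # 240 kV: ~17.4 W lost -- 10,000x less
```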

With batteries, a similar case exists. Batteries could be designed for CCS operation, but the losses are too high and the regulation is not as good as with CVS. Cells such as NiMH, Li-ion, alkaline, carbon-zinc, NiCd, etc., work best in CVS mode. Nuclear batteries, aka "nucells", on the other hand, work best in CCS mode. Search under "nucell" for details.

A car alternator can be a CCS or a CVS, and likewise for a power plant generator. For CCS mode, spin it with constant torque. As the loading changes, so does the speed. Torque is related to current, and speed to voltage, so constant torque produces CCS operation, with voltage and speed varying with the load.

Spinning the generator/alternator at constant speed produces CVS mode. Constant speed also yields constant frequency; on the power grid the frequency is held very precisely, which lets synchronous motors run at steady speed.
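The torque/current and speed/voltage analogy can be checked with an idealized, lossless machine model, where a single machine constant k relates both pairs, so mechanical power tau*omega equals electrical power V*I. The constant and operating point below are assumed for illustration:

```python
# Idealized (lossless) machine model for the analogy in the post:
# EMF is proportional to shaft speed, torque to armature current, with the
# same constant k for both. k and the operating point are assumed values.
k = 0.5              # machine constant: V per rad/s, also N*m per A

def emf(omega):
    """Generated voltage, proportional to shaft speed (rad/s)."""
    return k * omega

def torque(current):
    """Shaft torque, proportional to armature current (A)."""
    return k * current

omega = 377.0        # rad/s (~3600 rpm, a 60 Hz two-pole machine)
i = 40.0             # amperes delivered

v = emf(omega)                  # 188.5 V
p_elec = v * i                  # electrical power out
p_mech = torque(i) * omega      # mechanical power in -- equal, as expected
print(v, p_elec, p_mech)
```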

Did I help?

Claude
 
I see, that makes sense, thanks :smile:
 