Electronic devices: operating voltage and current

  • Thread starter fisico30
  • #1
fisico30
Hello Forum,

Some electronic devices specify a specific voltage that they need to work properly. Other devices specify a specific current...
Why? Why don't they all specify either the voltage or the current?

If the current is specified, the voltage is automatically fixed by the internal resistance R: V = IR... At the end of the day, all devices require power, which is the product P = IV...

A power supply usually has an output voltage and a current rating. What does the current rating represent? The max current the power supply can output?

Why does a power supply provide a fixed voltage but varying currents? Does the current depend on the load?

thanks,
fisico 30
 

Answers and Replies

  • #2
vk6kro
Science Advisor
A majority of electrical devices are designed to operate on a particular voltage.

For example, a light bulb will operate below the voltage stamped on it, but the light will be at reduced intensity and may be orange-tinted.
At the right voltage it should operate correctly.

So, if it receives this voltage, the lamp itself will cause a certain current to flow and a certain amount of power to be consumed.
In this case, the voltage and power are usually given on the lamp, but the current is available from a simple calculation: power = voltage times current, so current = power divided by voltage.
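
As a quick numeric sketch of that calculation (the 60 W / 240 V rating below is just an assumed example, not a figure from this thread):

```python
# Current drawn by a lamp from its rated power and voltage: I = P / V.
# The 60 W / 240 V rating below is an assumed example.
power_w = 60.0     # power stamped on the lamp
voltage_v = 240.0  # voltage stamped on the lamp

current_a = power_w / voltage_v
print(f"Lamp current: {current_a:.2f} A")  # 0.25 A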

There are a few cases where the current is more important than the exact voltage.

Electrolysis is one. Arc welding is another.

In arc welding, there is very little voltage across the actual arc, but you could measure it. However, the current is controlled from within the welder case and the current may be shown on a switch setting or a meter. The current is important, so this is maintained even if the resistance of the load changes.

Power supplies are designed to give a constant voltage out (although good ones can also be set to give a constant current) and they supply this voltage to a load which then takes current according to Ohm's Law.
There is a limit to the current a power supply can give and this is usually printed on the power supply somewhere.
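
A small sketch contrasting the two behaviours discussed in this post - a constant-voltage supply versus a constant-current source such as a welder. All the numbers (12 V supply, 2 A rating, 100 A welder setting, the load values) are assumed for illustration:

```python
# Contrast a constant-voltage supply (V fixed, I set by the load) with a
# constant-current source (I fixed, V set by the load). All values assumed.
V_SUPPLY = 12.0   # hypothetical constant-voltage supply, volts
I_LIMIT = 2.0     # its printed maximum current rating, amps
I_WELD = 100.0    # hypothetical constant-current welder setting, amps

for r_load in [0.1, 1.0, 5.0, 10.0]:
    i = V_SUPPLY / r_load          # Ohm's law: the load sets the current
    flag = " (exceeds the supply's current rating!)" if i > I_LIMIT else ""
    v_weld = I_WELD * r_load       # welder: the load sets the voltage
    print(f"R={r_load:5.1f} ohm | CV supply: I={i:6.2f} A{flag} | "
          f"CC source: V={v_weld:6.1f} V")
```

(The constant-current voltages here are for an ideal source; a real welder's open-circuit voltage limit would cap them.)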
 
  • #3
fisico30
Hello vk6kro,

I was just reviewing this old thread and thinking about your answer.

So, electronic devices function with a very specific optimal voltage. The current is automatically set once the correct voltage is provided.

If we feed too much or too little power to an electronic device, things can go wrong. Can all electronic devices function with lower-than-recommended input power? Energy is still being provided, just at a slower rate.

I still don't see why, for some devices, it is more important to know the current rather than the voltage...
Can you elaborate on that again? Both voltage and current matter in delivering suitable power.

Thanks
fisico30
 
  • #4
NascentOxygen
Staff Emeritus
Science Advisor
Generalizations are always dangerous. :smile: :smile:

Generalizing to "ALL" is something no one would dare. Not even someone as bold and brave as vk6kro.
 
  • #5
fisico30
I think you are right....

Maybe I need a few more specific examples to see where the distinction is important...

thanks
fisico30
 
  • #6
vk6kro
Science Advisor
Yes, it is too difficult to give a general answer to this.

You probably know what happens as the batteries in a transistor radio go flat.
Output sound decreases and may be distorted if you try to increase the volume.
When you remove the old batteries, you can still measure a voltage on them, but it wasn't enough to keep the radio working well.

The important point, though, is that the radio will still work well if you put new batteries in it.

If you had somehow tried to run the radio from 12 volts instead of its normal 6 or 9 volts, you may have damaged it and restoring the correct voltage would not fix it.

With rare exceptions, most electronic devices are designed to work with the correct voltage on them, provided they have adequate cooling.
The current drawn is a consequence of the voltage and the circuit and is usually not controlled by the user.
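
A toy model of the flat-battery effect (the resistance figures below are assumed): as a battery ages, its internal resistance rises, so the terminal voltage sags under load even though the open-circuit voltage still reads close to full.

```python
# Toy model of a "flat" battery: a 9 V pack driving a radio modelled as a
# fixed 50-ohm resistive load. Both figures are assumed; as the battery
# ages its internal resistance rises and the terminal voltage sags.
EMF = 9.0       # open-circuit voltage (still measurable on flat cells)
R_LOAD = 50.0   # crude resistive stand-in for the radio

for r_internal in [0.5, 5.0, 20.0, 50.0]:
    v_terminal = EMF * R_LOAD / (R_LOAD + r_internal)  # voltage divider
    print(f"R_internal={r_internal:5.1f} ohm -> {v_terminal:.2f} V at the radio")
```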
 
  • #7
fisico30
Thanks vk6kro.

As far as voltage conversion goes, let's say my device needs 20 V but my source is only able to provide 10 V. What can I do?
I can surely change the source and get one that outputs 20 V.

Or I can find what I call a "voltage converter". What is this voltage converter? Is it a passive device? Is it a small electronic component?

thanks,
fisico30
 
  • #8
vk6kro
Science Advisor
A voltage converter could be a switched mode power supply.
This is a small circuit on a printed circuit board which uses rapid switching of current through an inductor to produce a required voltage.

Sometimes you need to do this, but it should be a last resort. You have to be careful not to spend more on converting the voltage than it would cost to replace the high-voltage device with one that works on the lower voltage.

One example would be where you need to use a 24 volt device (say a GPS) in a 12 volt car.
It won't work on 12 volts, so you have to produce 24 volts somehow.

It isn't a pleasant situation, but it is time to get clever and get the job done.
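
For an ideal boost (step-up) switched-mode converter, the textbook relation is Vout = Vin / (1 − D), where D is the switching duty cycle. A minimal sketch for the 12 V to 24 V case mentioned above (ideal converter assumed; a real one needs a slightly higher D to cover losses):

```python
# Ideal boost-converter relation: Vout = Vin / (1 - D), D = duty cycle.
# A real converter has losses, so it runs at a slightly higher D.
def boost_duty_cycle(v_in: float, v_out: float) -> float:
    """Duty cycle an ideal boost converter needs to step v_in up to v_out."""
    if v_out <= v_in:
        raise ValueError("a boost converter can only step voltage up")
    return 1.0 - v_in / v_out

print(f"12 V -> 24 V needs D = {boost_duty_cycle(12.0, 24.0):.2f}")  # 0.50
print(f" 5 V -> 15 V needs D = {boost_duty_cycle(5.0, 15.0):.2f}")   # 0.67
```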
 
  • #9
fisico30
Thanks!

This is my situation: I have two solar cells connected in series to give me a certain output voltage, say 5 V (in good illumination conditions).
I cannot add more cells to get the right voltage because of space requirements. I am limited to that max output voltage of 5 V.

I want to power an electronic device that requires 15 V. I feel like I need a voltage converter to change 5 V to 15 V.
Also, the 5 V could be a fluctuating voltage. How do I ensure that, whatever the output voltage of my solar cell system is, the voltage to the device is 15 V?

I need a voltage converter/regulator, right? It needs to take up little space. Can I buy some RadioShack components?

thanks
fisico30
 
  • #10
The 5 V you get, is that an open-circuit voltage? Or is it measured when the solar cells have a load on?
 
  • #11
russ_watters
Mentor
Right -- do you know what the wattage of the device and solar cells is?
 
  • #12
fisico30
Well, that is the open-circuit voltage of the solar cells.

Of course, with a resistive load attached to the cells, the voltage would go down and the current go up.
There is a value for V and I at which the cell outputs the most power....


But I guess that 5 V cannot be counted on as the voltage actually delivered to a connected device. The voltage will be smaller and will depend on the impedance of the device, correct?

Surely, if 5 V is the open-circuit voltage and the electronic device needs 10 V, then we absolutely need a voltage converter...

thanks
fisico30
 
  • #13
But I guess that 5 V cannot be considered as the max voltage that can be provided to a connected device.
You guess? Look at the I-V curve of your solar cell. What's the output power at the open-circuit voltage of 5 V?
 
  • #15
vk6kro
Science Advisor
There are DC-to-DC voltage boost modules available on eBay and elsewhere. They are quite cheap, sometimes under $10, and very efficient.

If your 10 V device used 10 watts, it would need 1 amp at 10 volts.
It would need an input of at least 2 amps at 5 volts (also 10 watts), plus whatever losses the converter introduces.

So you can see that there is much more current needed at 5 volts than you use at 10 volts.

These converters produce a steady output with varying input, but they do have minimum input voltage requirements. Also, the current will need to increase if the input voltage drops.
10 watts at 3 volts is about 3.3 amps, for example.
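
A sketch of that arithmetic, with an assumed converter efficiency of 85% (real modules vary):

```python
# Input current a DC-DC boost module must draw to deliver a fixed output
# power, as the input voltage falls. The 85% efficiency is assumed.
P_OUT = 10.0       # watts the load needs
EFFICIENCY = 0.85  # assumed converter efficiency

for v_in in [5.0, 4.0, 3.0]:
    p_in = P_OUT / EFFICIENCY    # more than 10 W has to go in
    i_in = p_in / v_in           # I = P / V on the input side
    print(f"V_in={v_in:.1f} V -> I_in={i_in:.2f} A (P_in={p_in:.1f} W)")
```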

If you only wanted 10 volts for a short time, you could charge a battery while you had good sunlight, although this adds extra costs and reduces efficiency.
 
  • #17
fisico30
Hi vk6kro,

you mention that

It would need an input of at least 2 amps at 5 volts (also 10 watts), plus whatever losses the converter introduced.

Sure, let's say the device needs 10 W to function properly. These 10 W could be
2 A at 5 V, 1 A at 10 V, or even 10 A at 1 V.

But didn't you say that we are constrained by both the 10 W figure and the voltage figure? That is, the device may specifically tell us that it needs 10 V. With 1 V it may not work, or the 10 A current may be too high and damage the electronics.

So recommended power and voltage are the two numbers we need to match when connecting to a source....

Some sources are constant-current or constant-voltage sources. A set of solar cells in series is neither: its output voltage and current depend on what load is attached to it.

Best,
fisico30
 
  • #18
vk6kro
Science Advisor
An inescapable fact is that you can't get more power out of a converter (or anything else) than you put into it.
So, if you have 5 volts as your input voltage, as you said you did, then you need at least 2 amps in to get 10 watts out.
And the converters have losses, so more than 10 watts input would be needed.

DC-to-DC converters are sold on eBay, and the sellers give very precise information about what input and output voltages and currents are possible with each converter.

So, if you want to use some other voltages, you would read the specifications to make a decision about each converter.
 
  • #19
sophiecentaur
Science Advisor
Gold Member
2020 Award
Why does a power supply provide a fixed voltage but varying currents? Does the current depend on the load?

thanks,
fisico 30

To return to the OP and to answer this particular question: the reason that power supplies are usually sources of voltage with low source (internal) resistance is that they dissipate very little power internally over a massive range of supply currents. This makes them a very efficient source. Batteries tend to behave like this inherently - at least, it's relatively easy to make a battery that will give a fairly steady voltage over a range of currents. The same goes for rotary dynamos and generators.

Consider this: if all electrical appliances / devices were designed to work from a particular current, and power supplies were required to feed them with that current, then HOW would you connect multiple devices to the same power supply? They would need to be connected in SERIES, for a start. You would need to specify that they all take the same amount of current - so the more you connected in series, the greater the voltage you would need to supply, and they would all have different voltage drops (depending upon the power of each one). Also, if you didn't want any power out of your supply, you would need to SHORT CIRCUIT it, or the volts would go up and up and up as the device 'tried' to deliver its nominal current. On removing an appliance from its socket, you would need to replace it with a shorting link so the other devices could still be powered. A proverbial nightmare and a truly upside-down world.

The highest-power devices would need to have the highest resistances (having the highest voltage) - compared with voltage-driven devices, where the highest-power device has the lowest resistance (taking the highest current). The sketch below makes this concrete.

There are some components that need to be supplied with a particular current, of course, but they are exceptions and need specially designed circuitry.
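
Here is that contrast as numbers (the supply figures are assumed): at a fixed supply current I, a device of power P needs a resistance R = P / I², so more power means more resistance; at a fixed supply voltage V, it needs R = V² / P, so more power means less resistance.

```python
# Resistance a device of rated power P would need under a hypothetical
# fixed-current mains (R = P / I^2) versus the usual fixed-voltage mains
# (R = V^2 / P). Supply figures are assumed for illustration.
I_SUPPLY = 10.0    # amps, imaginary constant-current mains
V_SUPPLY = 230.0   # volts, conventional constant-voltage mains

for p in [10.0, 100.0, 1000.0]:
    r_fixed_i = p / I_SUPPLY**2    # grows with device power
    r_fixed_v = V_SUPPLY**2 / p    # shrinks with device power
    print(f"P={p:6.0f} W | fixed-I world: R={r_fixed_i:6.2f} ohm | "
          f"fixed-V world: R={r_fixed_v:8.1f} ohm")
```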
 
  • #20
fisico30
Hello Sophiecentaur

The reason that power supplies are usually sources of voltage with low source (internal) resistance is because they dissipate very little power internally over a massive range of supply currents.

Ok, I see that: the power source does not consume much power if the internal resistance is near zero. But that does not ensure that the most power possible goes to the load (maximum power transfer / impedance matching theorem, correct?).


In the case of a battery, the voltage is rather constant and the current can change depending on the load resistance. That is why a battery is a constant voltage source and not a constant current source.

Electrical devices need electrical power to function. Power is simply energy per unit time.
Power is delivered when both a nonzero current and a nonzero voltage are present at the device. The current seems to be determined by the powered device, while the voltage is set by the power source... is that a correct general statement?

For instance, in the case of a single solar cell at max illumination and connected to a certain load, both the voltage and current coming out of the solar cell depend on the cell's internal resistance and the device's impedance, correct?

A solar cell, under ideal illumination, reaches a max open-circuit voltage (like a battery has a fixed voltage). The current can increase depending on how much sunlight is incident on the cell. The more light, the higher the max drawable current can be... correct?
But all depends on the load, correct?

thanks
fisico30
 
  • #21
sophiecentaur
Science Advisor
Gold Member
2020 Award
Most of this is along the right lines BUTTTTTT the maximum power theorem definitely is not applied to batteries or the mains supply. If you did match load to source, you'd be wasting half the power you generated (melting batteries and cables). Ideally, the power source has a source impedance as near to zero as possible.

Solar cells are interesting because they definitely don't have a low internal resistance and, to get the most energy out of them for any particular light flux, you would really need to be able to vary the load dynamically. I don't know if they use switch-mode control in practice, but I'd bet it could be made to give the optimum power output.
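
A sketch of that idea with a toy single-diode cell model (all parameter values below are assumed): sweeping the load resistance shows the delivered power peaking at an intermediate load, which is the operating point a maximum-power-point tracker would hunt for.

```python
import math

# Toy single-diode solar-cell model: I(V) = Isc - I0 * (exp(V / nVt) - 1).
# All parameter values are assumed, for illustration only.
ISC = 0.5            # short-circuit current, A (assumed)
I0 = 1e-9            # diode saturation current, A (assumed)
N_VT = 1.2 * 0.026   # ideality factor times thermal voltage, V (assumed)

def cell_current(v: float) -> float:
    return ISC - I0 * (math.exp(v / N_VT) - 1.0)

# For each candidate load, find where the cell's I-V curve crosses the
# load line I = V / R (bisection on V), then compute delivered power.
best_p, best_r = 0.0, None
for r_load in [0.2, 0.5, 1.0, 2.0, 5.0, 20.0]:
    lo, hi = 0.0, 0.7    # Voc of this model is ~0.63 V, so 0.7 brackets it
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if cell_current(mid) > mid / r_load:
            lo = mid     # cell can still push more current: move right
        else:
            hi = mid
    v = 0.5 * (lo + hi)
    p = v * cell_current(v)
    print(f"R = {r_load:5.1f} ohm -> P = {p * 1000:6.1f} mW")
    if p > best_p:
        best_p, best_r = p, r_load
print(f"Best swept load: {best_r} ohm ({best_p * 1000:.1f} mW)")
```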
 
  • #22
fisico30
Hi Sophiecentaur,

"you'd be wasting half the power you generated".

I think the correct statement would be that we would be wasting half of the maximum amount of power we can generate.

"Solar cells are interesting because they definitely don't have a low internal resistance and, to get the most energy our of them for any particular light flux, you would really need to be able to vary the load dynamically."

Why do you say that the internal resistance is not low?
Why would we need to change the load dynamically? Because the internal resistance varies continuously?

I have a small device (http://store.solio.com/Solio-Store/Solio-Bolt-Solar-Charger-S620-AH1RW) that can charge, via USB, any sort of device. What do you think is inside it in terms of electronic components, and how does it manage to charge all these different devices?

thanks,
fisico30
 
  • #23
sophiecentaur
Science Advisor
Gold Member
2020 Award
No. When the load is matched, an equal amount of power is dissipated in the source as in the load. It doesn't relate to the power you 'could' generate - it relates to the actual power being delivered and dissipated at the time. For instance, the mains supply to your house will have a source resistance of perhaps 1 Ω. The protection fuse would have to blow long before you approached a load resistance of 1 Ω.

Yes. It's a non-linear device and the optimum value of load will depend upon the flux hitting it. Any device that will not supply much current cannot be said to have a low source resistance. Of course, if you're talking in terms of an array of several square metres, then the resistance is somewhat lower.

Your little solar charger will be based on 5 V - the standard USB supply voltage - and will have a regulator to limit the output to 5 V. Nothing more, I'd bet. That 5 W figure is a bit hopeful for that apparent cell area, I think. Have you ever measured the output?
It will rely on the individual devices being charged to make the best use of what it gives them - they will self protect against any overcharging.
 
  • #24
fisico30
Well, the max power transfer theorem implies impedance matching. When we impedance-match load and source, we are surely splitting the power 50/50, but that 50% is a ratio.

Example: with no impedance matching, most of the power (99%) goes to the load, but that is 99% of a small total power. If the total power is 10 W, the load gets 9.9 W.

With impedance matching: 50% to the load and 50% to the source. But now the total power is, maybe, 100 W, so the load gets 50% of 100 W, which is 50 W. In this situation it gets the most power it can, even though that is a smaller percentage of the total.
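
The same point as numbers, for an assumed 10 V source with 1 Ω internal resistance: delivered power peaks at the matched load, but efficiency keeps climbing as the load resistance rises above the match.

```python
# Maximum power transfer: sweep the load on a source with fixed EMF and
# internal resistance (both values assumed for illustration).
EMF = 10.0   # source voltage, volts
R_S = 1.0    # source (internal) resistance, ohms

for r_load in [0.1, 0.5, 1.0, 2.0, 10.0, 100.0]:
    i = EMF / (R_S + r_load)
    p_load = i ** 2 * r_load               # power delivered to the load
    eff = r_load / (R_S + r_load)          # load's share of total power
    print(f"R_load={r_load:6.1f} ohm -> P_load={p_load:5.2f} W, "
          f"efficiency={eff:6.1%}")
```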


No, I have not measured the solar cell output.
Regardless of size, a typical silicon PV cell produces about 0.5 – 0.6 volt DC under open-circuit, no-load conditions. If a load with finite impedance is attached, the voltage goes down but the current goes up (current is zero in open-circuit configuration).
So, a specific load, based on its impedance, will receive its own particular voltage and current (i.e. power). The load may have a minimum power requirement to work properly. Hopefully the cell can give the right current and voltage to obtain that power.
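
The series-stacking arithmetic, as a sketch (the 0.5 V per cell under load is a typical, assumed figure):

```python
import math

# How many silicon cells in series reach a target voltage, assuming roughly
# 0.5 V per cell at the operating point (a typical, assumed figure).
V_PER_CELL = 0.5

for v_target in [5.0, 15.0]:
    n_cells = math.ceil(v_target / V_PER_CELL)
    print(f"{v_target:.0f} V target -> about {n_cells} cells in series")
```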

5 V? I did not know that was the typical USB port voltage. If the device needs more or less, voltage converters apply, I guess.

thanks
fisico30
 
  • #25
sophiecentaur
Science Advisor
Gold Member
2020 Award
I think we're talking a bit at cross purposes about the implications of the maximum power theorem. Suffice to say that an efficient power supply system will not be aiming at a match. If you want more power into your load, you use a bigger generator / battery that will still have a low enough source resistance. There is no advantage in any way to matching the load - at the very least, you are doubling the cost of supplying the power to the customer - and you also need a way of getting rid of all that wasted energy at the power station; an additional massive cost.
It is not even common to match a source to a load in RF transmitting installations when high power is involved, because this immediately reduces your efficiency and requires higher-power devices (valves and transistors - ££££). Matching is more important at the antenna end of a feeder, because you do not want reflections, which cause echoes and high standing-wave voltages (but this is not relevant to supplying circuit power).

You need to pick a PV array with sufficient current-delivering capacity to suit your load at its operating voltage. If you are really smart, you can possibly get around this problem (if the PV provides too many volts, perhaps) by DC-DC transformation with a switch-mode circuit. This can avoid simply wasting power in a dropper resistor. Usually, the cheaper solution is to use more PV area.

Look up the USB spec. It's all online. The phone manufacturer builds to that requirement. 5 V was the chosen rail voltage way back when TTL technology was all the rage. It has stuck for most standard applications.
 
