Seniority of Wattage vs. Amperage

  • Thread starter: mearvk
  • Tags: Amperage, Wattage
AI Thread Summary
The discussion centers on the relationship between voltage, amperage, and wattage in light bulbs, specifically questioning if a 100-watt bulb can function normally when voltage is varied. It is clarified that a bulb draws current based on the voltage applied, and power is calculated as the product of voltage and current. Safety concerns are raised regarding the dangers of connecting a 120-volt bulb to a 240-volt supply, emphasizing the need for proper resistance adjustments and the risks of electrical shock and fire hazards. Participants stress the importance of understanding basic electrical concepts, such as series and parallel circuits, before experimenting with high-voltage systems. The thread concludes with a warning about the potential dangers of misapplying electrical principles.
mearvk
Was wondering if I kept the wattage for say a 100 watt bulb constant but varied the voltage would the light be able to continue functioning normally?

Also, assuming this is true does this hold for more complicated circuitry?

Thanks.
 
You don't have that freedom. The bulb looks at the voltage and decides what current it will draw. The power is then VxI.
 
Antiphon said:
You don't have that freedom. The bulb looks at the voltage and decides what current it will draw. The power is then VxI.

The bulb sees 120 volts. We know it would draw about 0.83 amps, right? 120 x 0.83 ≈ 100

The bulb sees 240 volts. Would it then draw half as many amps?

The bulb sees 60 volts. Would it then draw twice as many amps?

More generally, does varying the voltage affect the wattage of a light bulb?
 
mearvk said:
The bulb sees 120 volts. We know it would draw about 0.83 amps, right? 120 x 0.83 ≈ 100

The bulb sees 240 volts. Would it then draw half as many amps?

The bulb sees 60 volts. Would it then draw twice as many amps?

More generally, does varying the voltage affect the wattage of a light bulb?

Are you familiar with Ohm's Law? If you increase the voltage across a resistor, what does that do to the current through the resistor?

I = V/R

P = V^2/R = I^2 * R
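Those two formulas answer the scaling questions directly. A minimal numeric sketch, treating the bulb as an ideal fixed resistor sized for 100 W at 120 V (a real filament's resistance rises with temperature, so this is only an idealization):

```python
# Ideal fixed resistance of a 100 W, 120 V bulb: R = V^2 / P
R = 120**2 / 100  # 144 ohms

for V in (60, 120, 240):
    I = V / R      # Ohm's law: current scales linearly with voltage
    P = V**2 / R   # power scales with the SQUARE of voltage
    print(f"{V:>3} V -> {I:.3f} A, {P:.0f} W")
```

So halving the voltage halves the current (25 W, not 100 W), and doubling it doubles the current (400 W): the wattage is not a fixed property of the bulb.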
 
Antiphon said:
You don't have that freedom. The bulb looks at the voltage and decides what current it will draw. The power is then VxI.

Never anthropomorphise lightbulbs. They hate it when you do that. :-p
 
So if I wanted to keep 100 watts going to a bulb with 144 ohms of resistance and I doubled the voltage to 240 I'd need to add 432 ohms (576-144) worth of resistance to the circuit?

So in theory I could run a 120v 100 watt light bulb on a 240v line if this were added: http://goo.gl/vy2Ig

Thanks for fielding my questions guys.
 
mearvk said:
So if I wanted to keep 100 watts going to a bulb with 144 ohms of resistance and I doubled the voltage to 240 I'd need to add 432 ohms (576-144) worth of resistance to the circuit?

So in theory I could run a 120v 100 watt light bulb on a 240v line if this were added: http://goo.gl/vy2Ig

Don't, Don't and Don't again.

Apart from the fact that what you propose will not work, this is a serious safety issue: 240-volt mains is considerably more dangerous than 120-volt mains.

If you must run a 120-volt bulb from 240, then run two in series; this will work safely.
Do not connect them in parallel: all you will achieve is two blown bulbs.

Do you understand what series means?
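The arithmetic behind this advice can be sketched numerically, again idealizing each 120 V, 100 W bulb as a fixed 144-ohm resistor (real filament resistance varies with temperature):

```python
R_bulb = 144.0   # idealized resistance of one 120 V, 100 W bulb
V_supply = 240.0

# Two bulbs in SERIES: resistances add, so the supply voltage
# divides equally between two identical bulbs.
I_series = V_supply / (2 * R_bulb)      # 0.833 A, same as one bulb on 120 V
V_each = I_series * R_bulb              # 120 V across each bulb
P_each = I_series**2 * R_bulb           # 100 W per bulb: rated operation

# Two bulbs in PARALLEL: each bulb sees the full 240 V and tries to
# dissipate V^2 / R = four times its rating -- hence the blown bulbs.
P_parallel_each = V_supply**2 / R_bulb  # 400 W per bulb

# The proposed 432-ohm dropping resistor (576 ohms total): the bulb
# then carries far less current and runs well below its rating.
I_drop = V_supply / (R_bulb + 432)      # ~0.417 A
P_bulb_drop = I_drop**2 * R_bulb        # ~25 W reaching the bulb, not 100 W

print(V_each, P_each, P_parallel_each, P_bulb_drop)
```

Under this idealized model, the 432-ohm proposal makes the circuit as a whole dissipate 100 W, but only a quarter of that reaches the bulb, which is why it will not work as intended.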
 
Well is my math at least right?

If so, why would it be dangerous aside from the extra voltage?

Could you explain simply the diff between series and parallel?
 
Instead of trying to tell the experts here (and there are quite a few more expert than I am) how to do something, how about just explaining your goal, i.e. what you want to achieve, and asking for help.
 
mearvk said:
Well is my math at least right?

If so, why would it be dangerous aside from the extra voltage?

Could you explain simply the diff between series and parallel?

You don't understand the difference between series and parallel circuits, and you want to start off working with AC Mains circuits? That's not a good thing to do. Please learn the basics of electricity and electronics first, and then find a good local mentor who can help you safely learn about working with AC Mains circuits. The shock and fire hazards are very real when working with those kinds of voltages and that much available power.

Here is your starter on series and parallel circuits:

http://en.wikipedia.org/wiki/Series_and_parallel_circuits
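As a numeric companion to that article, the two combination rules can be sketched like this (the function names are illustrative, not from any library):

```python
def series(*resistances):
    # Series: resistances simply add; one current flows through all of them.
    return sum(resistances)

def parallel(*resistances):
    # Parallel: reciprocals add; every element sees the same voltage.
    return 1.0 / sum(1.0 / r for r in resistances)

print(series(144, 144))    # two bulbs in series: 288 ohms
print(parallel(144, 144))  # two bulbs in parallel: 72.0 ohms
```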

 
Berkeman, you seem to have a short circuit between 'theoretical' and 'actual'. I can ask questions all day about 240 V circuits without being silly enough to try to power a 100 W light bulb off of it.
 
mearvk said:
Berkeman, you seem to have a short circuit between 'theoretical' and 'actual'. I can ask questions all day about 240 V circuits without being silly enough to try to power a 100 W light bulb off of it.

Two problems...

First, you have not been clear that you do not intend to try any of this. We are genuinely concerned for your safety and that of those around you.

Second, when you post stuff like that in the forums, other newbies can see it and think that they can do it safely. Not a good idea.

We take safety seriously here at the PF. This thread is closed.
 