Can increasing voltage or current make a bulb brighter?

  • Thread starter: waver

Discussion Overview

The discussion revolves around the relationship between voltage, current, and the brightness of light bulbs, particularly incandescent ones. Participants explore how varying voltage and current affects brightness, the implications of Ohm's law, and the potential risks of exceeding electrical ratings in appliances.

Discussion Character

  • Exploratory
  • Technical explanation
  • Conceptual clarification
  • Debate/contested

Main Points Raised

  • Some participants suggest that increasing the voltage across an incandescent bulb generally increases its brightness by changing the current flowing through it.
  • Others clarify that according to Ohm's law, voltage and current are related through resistance, and that increasing current does not automatically mean an increase in voltage unless the resistance is constant.
  • A participant questions whether different combinations of voltage and current can yield the same power through a wire, leading to further discussion about the behavior of electrical devices under varying conditions.
  • Concerns are raised about the potential for overheating and damage to appliances if the voltage or current exceeds their rated limits, with analogies used to illustrate the concept of stress on materials.
  • Some participants emphasize that appliances are typically labeled with their electrical ratings, which should guide safe usage.
  • There is a discussion about the nature of charge flow and how increasing voltage is necessary to push more charge through a device, with one participant questioning if charge can be increased independently of voltage.
  • Responses indicate that resistance inherently limits the flow of charge, and thus more voltage is required to push more charge through a resistive element.

Areas of Agreement / Disagreement

Participants generally agree that voltage and current are interrelated, but there is no consensus on the nuances of how they affect brightness and the implications of exceeding electrical ratings. Multiple competing views remain regarding the specifics of these relationships.

Contextual Notes

Some statements depend on the assumptions about resistance and the characteristics of specific electrical devices. The discussion does not resolve the complexities of how different devices respond to changes in voltage and current.

waver
Hi,

I have developed a sudden interest in electricity. I never would have thought that all the physics I learned in school would actually turn out to be so useful.

I am thinking of creating some custom-made lights for my house. I took out my old physics book and started reading it, but there is one thing I don't quite understand.

Does increasing the voltage or the current make a bulb brighter? Does increasing the current automatically mean an increase in voltage, and vice versa?

Thank you.
 
Well, for an incandescent light bulb, resistance is a (temperature-dependent) physical property of the filament, and the voltage is a property of the source of electricity (though it can be made to vary by using a dimmer). So the amperage depends on the voltage and resistance via i = v/r.

Yes, you can generally increase/decrease the brightness of an incandescent bulb by varying the voltage you put across it, which changes the amperage pushed through it.
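The i = v/r relationship above can be sketched numerically. A minimal illustration, assuming a fixed 100-ohm resistance (a real filament's resistance rises with temperature, as noted above, so these numbers are purely illustrative):

```python
# Ohm's law for a resistive element. The 100-ohm value is an assumed,
# constant resistance for illustration; a real incandescent filament's
# resistance is temperature-dependent.

def current(voltage, resistance):
    """i = v / r"""
    return voltage / resistance

def power(voltage, resistance):
    """p = v * i = v^2 / r -- the dissipated power, roughly tracking brightness."""
    return voltage ** 2 / resistance

R = 100.0  # ohms (assumed constant for simplicity)
for v in (60.0, 120.0):
    print(f"{v:5.1f} V -> {current(v, R):.2f} A, {power(v, R):.1f} W")
```

Note that because power goes as v²/r, halving the voltage quarters the dissipated power, which is why dimming a bulb reduces brightness faster than the voltage drop alone might suggest.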
 
"Does increasing the current automatically mean an increase in voltage, and vice versa?"

Yes, according to Ohm's law: V = IR, or R = V/I.
As long as your power source can supply enough power, this will hold.
 
Thanks Russ, and thanks for the quick replies.

So am I correct to say that (passing 1 volt at 1 amp) = (passing 0.5 volts at 2 amps) = (passing 2 volts at 0.5 amps) through a wire?

Sorry, but is there an analogy for how passing too much current through an electrical device destroys it? And how do you know how much current/voltage you can pass through an appliance without destroying it, assuming it is not labeled?

Thank you.
 
Look at the equation I posted and use that, waver. If you take a light bulb that passes 1 amp at 1 volt, it won't pass 2 amps at 0.5 volts; it'll pass 0.5 amps at 0.5 volts.

All appliances are labeled, waver - it is required by law. Going much above the rated voltage isn't a good idea, but the reality is you generally can't for household devices anyway. Where are you going to find a 140 volt source to put a light bulb on?
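The point about the 1-amp-at-1-volt bulb can be checked with a quick sketch: the three voltage/current pairs from the question all give the same power (1 W), but a fixed resistance only produces one current at each voltage. The 1-ohm value below is the resistance implied by "1 amp at 1 volt":

```python
# Same claimed power in every pair, but only the first is consistent
# with a fixed 1-ohm resistance.
R = 1.0  # ohms -- a bulb that passes 1 A at 1 V

pairs = [(1.0, 1.0), (0.5, 2.0), (2.0, 0.5)]  # (volts, claimed amps)
for v, i_claimed in pairs:
    i_actual = v / R  # what Ohm's law actually gives at this voltage
    print(f"{v} V: claimed {i_claimed} A (P = {v * i_claimed} W), "
          f"actual {i_actual} A (P = {v * i_actual} W)")
```

At 0.5 V the bulb actually passes 0.5 A (0.25 W, dimmer), and at 2 V it passes 2 A (4 W, brighter): the voltage and current can't be traded off independently through a fixed resistance.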
 
Consider it this way: charge is 'freefalling' through your device due to the electric field within it - the applied voltage. There's a fixed rate at which charge will flow through the device at a fixed voltage (unless you have a deficiency of electrons, which given the nature of household power supplies isn't realistic). To get more charge to flow, you have to push it harder - a higher voltage.

Now this is not exclusive - some devices won't accept any more current whatever you do to them.

In general, driving a device - like a light bulb - beyond its rating will cause it to overheat. Defects in the metal (or whatever the conductor is) bleed kinetic energy off the electrons, which is converted into thermal energy. Stuffing more electrons down the wire liberates more heat, which causes the resistance of the wire to increase, which bleeds more energy off the electron flow, which generates more heat... you get the idea.
 
waver said:
Sorry, but is there an analogy for how passing too much current through an electrical device destroys it? And how do you know how much current/voltage you can pass through an appliance without destroying it, assuming it is not labeled?

You don't need one; the actual problem is close enough to common sense as it is. Put too much through it and it overheats and catches fire. If you really wanted to, you could think of it as bending a spoon back and forth at the neck. Do it slowly and it will bend with no problems. Do it too fast and it will overheat and break. Apparently this is how Uri Geller does his spoon-bending trick - he pre-heats/stresses them. Psychic or not, he's an amazing magician!

I can't remember the exact ratings, but in the United Kingdom at 240 V we use 1.0 mm cable for 5 A circuits (lights). I think it will take 10 A, but we never load it that much, and that's single core too. Flex can take more, as it dissipates heat better.
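As a rough cross-check on those figures (assuming UK nominal mains of 240 V), the power a rated circuit can carry is just P = V × I:

```python
# Power available on a rated circuit: P = V * I.
# 240 V is the nominal UK mains voltage mentioned above; the ampere
# ratings are the lighting-circuit figures from that post.
V_mains = 240.0
for amps in (5.0, 10.0):
    print(f"{amps:.0f} A at {V_mains:.0f} V -> {V_mains * amps:.0f} W")
```

So a 5 A lighting circuit at 240 V supports about 1200 W of bulbs, which is why such circuits are rarely loaded anywhere near the cable's limit.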
 
Thanks Russ, Sojourner and Adder_Noir. All your explanations have been very helpful.

Sojourner01 said:
Consider it this way: charge is 'freefalling' through your device due to the electric field within it - the applied voltage. There's a fixed rate at which charge will flow through the device at a fixed voltage (unless you have a deficiency of electrons, which given the nature of household power supplies isn't realistic). To get more charge to flow, you have to push it harder - a higher voltage.
Hi Sojourner01,

Sorry if this sounds stupid, but since a certain amount of charge flows at a certain voltage, increasing the voltage will push more charge through, like you said. But could we actually increase the amount of charge flowing instead of increasing the voltage?
 
No - resistance is just what it sounds like: resistance. To push more charge through requires more force, and that force is voltage.
 
Thanks Russ, I get it...Been of great help.
 
