Can increasing voltage or current make a bulb brighter?

AI Thread Summary
Increasing the voltage supplied to an incandescent bulb makes it brighter by driving more current through the filament, per Ohm's law (V = IR). Voltage, current, and resistance are linked, so adjusting one affects the others; for a fixed resistance, you cannot set voltage and current independently. Appliances are labeled with their voltage and current ratings, which helps prevent damage from overloading. Excessive current causes overheating and can destroy a device, much as bending a spoon back and forth too quickly makes it overheat and snap. Understanding these principles is essential for safely building custom lighting.
waver
Hi,

I have developed a sudden interest in electricity. I never would have thought that all the physics I learned in school would actually be so useful.

I am thinking of creating some custom-made lights for my house. I took out my old physics book and started reading it. There is one thing I don't quite understand.

Does increasing the voltage or the current make a bulb brighter? Does increasing the current automatically mean an increase in voltage, and vice versa?

Thank you.
 
Well, for an incandescent light bulb, resistance is a [temperature-dependent] physical property of the filament and the voltage is a property of the source of electricity (though it can be made to vary by using a dimmer). So the amperage will be dependent on the voltage and resistance via I = V/R.

Yes, you can generally increase/decrease the brightness of an incandescent bulb by varying the voltage you put across it, which changes the amperage pushed through it.
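
For example, here's a minimal Python sketch of that relationship (the 240-ohm figure is just an illustrative hot-filament resistance, and it treats the resistance as fixed even though, as noted above, it really varies with temperature):

```python
# Ohm's law for a resistive filament: choosing the voltage fixes the current.
# An incandescent bulb's brightness roughly tracks the power it dissipates.

def bulb_operating_point(voltage, resistance):
    """Return (current, power) for a filament of fixed resistance."""
    current = voltage / resistance   # I = V / R
    power = voltage * current        # P = V * I
    return current, power

R = 240.0  # ohms -- illustrative hot-filament resistance, not a measured value
for v in (120.0, 180.0, 240.0):
    i, p = bulb_operating_point(v, R)
    print(f"{v:5.0f} V -> {i:.2f} A, {p:.0f} W")
# Higher voltage -> more current -> more power dissipated -> brighter bulb.
```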
 
"Does increasing the current automatically means an increase in voltage and vice versa?"

Yes according to ohm's law V = IR or R = V/I
as long as you power source can supply enough power this will hold.
 
Thanks Russ and waht for the quick replies.

So am I correct to say that (passing 1 volt at 1 amp) = (passing 0.5 volts at 2 amps) = (passing 2 volts at 0.5 amps) through a wire?

Sorry, but is there an analogy for how passing too much current through an electrical device destroys it? And how do you know how much current/voltage you can pass through an appliance without destroying it, assuming it is not labeled?

Thank you.
 
Look at the equation I posted and use that, waver. If you take a light bulb that passes 1 amp at 1 volt, its resistance is R = V/I = 1 ohm, so it won't pass 2 amps at 0.5 volts; it'll pass 0.5 amps at 0.5 volts.

All appliances are labeled, waver - it is required by law. Going much above the rated voltage isn't a good idea, but the reality is you generally can't for household devices anyway. Where are you going to find a 140 volt source to put a light bulb on?
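
To put numbers on that (a quick Python sketch using the 1 V / 1 A bulb from above, so R = V/I = 1 ohm):

```python
# The filament's resistance is fixed: R = V / I = 1 V / 1 A = 1 ohm.
# You can't pick voltage and current independently; choosing V determines I.
R = 1.0  # ohms
for v in (0.5, 1.0, 2.0):
    i = v / R   # the only current this bulb will pass at this voltage
    p = v * i   # power dissipated (what actually sets the brightness)
    print(f"{v} V -> {i} A, {p} W")
# At 0.5 V the bulb passes 0.5 A (0.25 W), not the 2 A from the
# (0.5 V, 2 A) guess -- so those three combinations are not equivalent.
```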
 
Consider it this way: charge is 'freefalling' through your device due to the electric field within it - the applied voltage. There's a fixed rate at which charge will flow through the device at a fixed voltage (unless you have a deficiency of electrons, which given the nature of household power supplies isn't realistic). To get more charge to flow, you have to push it harder - a higher voltage.

Now this is not universal - some devices won't accept any more current whatever you do to them.

In general, driving a device - like a light bulb - beyond its rating will cause it to overheat. Defects in the metal (or whatever the conductor is) bleed kinetic energy off the electrons, which is converted into thermal energy. Stuffing more electrons down the wire liberates more heat, which causes the resistance of the wire to increase, which bleeds more energy off the electron flow, which generates more heat... you get the idea.
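
Here's a toy model of that feedback loop in Python - just a sketch, not a real filament model: the 12 V supply and the cooling constant are invented for illustration, though the temperature coefficient is roughly right for tungsten:

```python
# Toy thermal feedback: current heats the filament, the hotter filament has
# higher resistance, which changes the power draw, and so on until the power
# dissipated balances the heat carried away.
V = 12.0            # applied volts (illustrative)
R0, T0 = 1.0, 20.0  # cold resistance (ohms) at room temperature (deg C)
ALPHA = 0.0045      # tungsten's temperature coefficient, per deg C
K = 0.005           # invented cooling constant: watts shed per deg C above ambient

T = T0
for _ in range(2000):
    R = R0 * (1 + ALPHA * (T - T0))   # hotter filament -> higher resistance
    P = V * V / R                     # electrical power being dissipated
    T += 0.1 * ((T0 + P / K) - T)     # relax toward the balancing temperature
print(f"settles at R = {R:.1f} ohm, I = {V / R:.2f} A, T = {T:.0f} deg C")
```

The feedback is self-limiting here because the rising resistance chokes off the current; a device without that property (or one cooled too poorly) just keeps getting hotter until something fails.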
 
waver said:
Sorry, but is there an analogy for how passing too much current through an electrical device destroys it? And how do you know how much current/voltage you can pass through an appliance without destroying it, assuming it is not labeled?

You don't need one - the actual problem is close enough to common sense as it is. Put too much through it and it overheats and catches fire. You could, if you really wanted to, think of it as bending a spoon back and forth at the neck. Do it slowly and it will bend with no problems. Do it too fast and it will overheat and break. Apparently this is how Uri Geller does his spoon-bending trick: he pre-heats/stresses them. Psychic or not, he's an amazing magician!

I can't remember the exact ratings, but in the United Kingdom at 240 V we use 1.0 mm² cable for 5 A circuits (lights). I think it will take 10 A, but we never load it that much, and that's for single core too. Flex can take more as it dissipates heat better.
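
As a rough loading check in Python (using the 240 V / 5 A figures above; the 60 W bulb is just an example wattage):

```python
# How many bulbs can a 5 A lighting circuit carry at UK mains voltage?
SUPPLY_V = 240.0   # UK mains
CIRCUIT_A = 5.0    # lighting-circuit rating quoted above
BULB_W = 60.0      # example incandescent bulb

amps_per_bulb = BULB_W / SUPPLY_V           # I = P / V
max_bulbs = int(CIRCUIT_A / amps_per_bulb)  # stay under the circuit rating
print(f"each bulb draws {amps_per_bulb:.2f} A -> up to {max_bulbs} bulbs")
# each bulb draws 0.25 A -> up to 20 bulbs before hitting the 5 A limit
```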
 
Thanks Russ, Sojourner and Adder_Noir. All your explanations have been very helpful.

Sojourner01 said:
Consider it this way: charge is 'freefalling' through your device due to the electric field within it - the applied voltage. There's a fixed rate at which charge will flow through the device at a fixed voltage (unless you have a deficiency of electrons, which given the nature of household power supplies isn't realistic). To get more charge to flow, you have to push it harder - a higher voltage.


Hi Sojourner01,

Sorry if this sounds stupid, but since a certain amount of charge flows at a certain voltage, increasing the voltage will push more charge through, like you said. But could we actually increase the amount of charge flowing instead of increasing the voltage?
 
No - resistance is just what it sounds like: resistance. To push more charge requires more force and that force is voltage.
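
Put differently, the charge moved per second isn't a free knob - V and R pin it down. A minimal sketch (the 2-ohm resistance is illustrative):

```python
# Charge delivered in time t is Q = I * t, and I = V / R.
# At fixed resistance, moving more charge per second requires more voltage.
R = 2.0  # ohms (illustrative)
t = 1.0  # seconds
for v in (1.0, 2.0, 4.0):
    q = (v / R) * t  # coulombs pushed through in one second
    print(f"{v} V moves {q} C in {t:.0f} s")
```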
 
Thanks Russ, I get it... you've been a great help.
 
