What really makes the lamp come on: tension, current, or potency?

Homework Help Overview

The discussion revolves around the factors that influence whether a lamp turns on, specifically focusing on the roles of voltage, current, and power dissipation. Participants explore the relationship between these electrical concepts in the context of a lamp rated at 4W and 1V.

Discussion Character

  • Conceptual clarification, Assumption checking

Approaches and Questions Raised

  • The original poster questions whether it is the current, voltage, or power that causes the lamp to light up. They express uncertainty about the effects of varying current levels on the lamp's operation. Other participants discuss the implications of applying different voltages and how it affects brightness and functionality.

Discussion Status

Participants are actively engaging with the concepts, with some providing clarifications on terminology and the physics involved. There is an exploration of how different voltage levels affect the lamp's brightness and operational state, but no consensus has been reached on the original poster's questions.

Contextual Notes

There is a noted confusion regarding terminology, particularly between "tension" and "potential difference," as well as "potency" and "power." Additionally, participants mention varying familiarity with different physics notations based on their geographical backgrounds.

jaumzaum
Sorry for the stupid question, but what actually makes the lamp come on: the tension applied to it, the electric current that passes through it, or the potency dissipated by it?
I think it's the current, but I'm not sure, and I don't know why.

But if it were the current, what makes the lamp burn out, and what makes it not come on?

I mean, I know that if it were the current, a big current would make it burn out, but how much? A lamp of 4 W and 1 V, for example, has a current of 4/1 = 4 A, so would 4.00001 A burn it?
And would 3.999999 A make it not come on?
 
jaumzaum said:
Sorry for the stupid question, but what actually makes the lamp come on: the tension applied to it, the electric current that passes through it, or the potency dissipated by it?
I think it's the current, but I'm not sure, and I don't know why.

But if it were the current, what makes the lamp burn out, and what makes it not come on?

I mean, I know that if it were the current, a big current would make it burn out, but how much? A lamp of 4 W and 1 V, for example, has a current of 4/1 = 4 A, so would 4.00001 A burn it?
And would 3.999999 A make it not come on?

A lamp labelled "4W , 1V" is designed to be operated with a 1V potential difference applied to it, at which stage it will dissipate power at a rate of 4 Watts.

If you apply a slightly smaller potential difference, it will dissipate slightly less than 4 W - but the relationship is not proportional [meaning 0.5 V does not automatically mean 2 W].

If you apply slightly more than 1 V, the lamp will try to dissipate slightly more than 4 W.

Given "engineering design tolerances" for actual products in the real world, the lamp will "succeed" for a while as the voltage is increased. The lamp will be brighter than usual - the filament will be hotter than the designers planned. Increase the voltage gradually and eventually the filament will get too hot and melt - the lamp will burn out.

NOTE: I'm not all that comfortable with your original terms, tension and potency.
What you called potency dissipated should have been power dissipated.
What you called tension applied should have been potential difference applied - or, at a pinch, voltage applied.
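The arithmetic in the post above (rated current from P = VI, and why power doesn't scale proportionally with voltage) can be sketched in a few lines. This is an illustration, not from the thread itself: it assumes a constant filament resistance, which real lamps do not have (filament resistance rises with temperature - exactly why the scaling is "not proportional"), so the constant-R numbers overestimate real power at higher voltages.

```python
# Lamp rated "4 W, 1 V" as in the example above.
P_rated = 4.0   # rated power in watts
V_rated = 1.0   # rated potential difference in volts

I_rated = P_rated / V_rated    # P = V * I  ->  4 A at the rated voltage
R_nominal = V_rated / I_rated  # R = V / I  ->  0.25 ohm at the rated point

# Power at other voltages under the (unrealistic) constant-R assumption.
for V in (0.5, 1.0, 1.1):
    P = V**2 / R_nominal       # P = V^2 / R
    print(f"{V:.1f} V -> {P:.2f} W (constant-R model)")
```

Note that even this crude model gives 1 W at 0.5 V, not 2 W - halving the voltage quarters the power, since P scales with V squared at fixed resistance.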
 
Sorry, I am not used to American physics notation (I'm Brazilian, by the way).

Thanks for the answer.
There is still one thing I don't understand:
If a small potential difference is applied to the lamp, say 0.15 V in the example, would it be less bright than usual, or would it not light at all?
 
jaumzaum said:
Sorry, I am not used to American physics notation (I'm Brazilian, by the way).

Thanks for the answer.
There is still one thing I don't understand:
If a small potential difference is applied to the lamp, say 0.15 V in the example, would it be less bright than usual, or would it not light at all?

Actually I'm Australian, but,

If you apply less than the specified/design voltage it will be less bright. The limit for most of the globes I have used is about half - so if it is a 1V globe, you need at least 0.5V to notice anything.

The way a globe works is, the higher the applied Voltage, the hotter the filament gets.
With a very small voltage it might be only a little warm, and you wouldn't notice anything.
Increase the voltage and the filament will get "red hot", and appear red.
Increase further and the filament gets white hot and starts to glow like a lamp is supposed to.
Increase far enough, and the filament will get so hot, it melts.

So 0.15 V and your globe would probably NOT light up at all.

The specifications on the globe tell you what voltage to apply to have it in a good operating state - glowing nice and bright but not melting.

Note: I reverted to the term globe: when our "lamps" burn out, we get a new globe. We use the term light globe: A lamp is the device that has the light globe in it, as in "a bedside lamp may have a 40W globe fitted".
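The progression described above - dark below roughly half the rated voltage, then a red glow, then white-hot, then burnout - can be sketched as a simple lookup. The cutoff fractions here are illustrative assumptions, not measured values; only the "about half" threshold comes from the post itself.

```python
def globe_state(v_applied, v_rated=1.0):
    """Qualitative filament state vs applied voltage.

    Thresholds are made up for illustration, except the ~0.5 * rated
    'nothing visible' cutoff mentioned in the post above.
    """
    frac = v_applied / v_rated
    if frac < 0.5:
        return "dark (filament only slightly warm)"
    elif frac < 0.8:
        return "dim red glow"
    elif frac <= 1.1:
        return "bright white (normal operation)"
    else:
        return "burned out (filament melts)"

print(globe_state(0.15))  # 0.15 V on a 1 V globe: well under half -> dark
print(globe_state(1.0))   # at the rated voltage -> normal brightness
```

So, in this sketch, the 0.15 V case from the question falls well below the visibility threshold, matching the answer that the globe would probably not light up at all.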
 
PeterO said:
Note: I reverted to the term globe: when our "lamps" burn out, we get a new globe. We use the term light globe: A lamp is the device that has the light globe in it, as in "a bedside lamp may have a 40W globe fitted".

Do you call it globe in Australia? I know it as light bulb.

ehild
 

Similar threads

  • · Replies 9 ·
Replies
9
Views
2K
Replies
13
Views
2K
  • · Replies 6 ·
Replies
6
Views
10K
  • · Replies 2 ·
Replies
2
Views
3K
Replies
12
Views
7K
  • · Replies 6 ·
Replies
6
Views
4K
  • · Replies 8 ·
Replies
8
Views
2K
Replies
2
Views
1K
  • · Replies 5 ·
Replies
5
Views
2K
  • · Replies 7 ·
Replies
7
Views
2K