# Brightness of a bulb

1. Aug 26, 2014

### Vibhor

1. The problem statement, all variables and given/known data

Q. What will happen when

a) A bulb rated 240 V 100W is connected to a 120 V supply .
b) A bulb rated 120 V 100W is connected to a 240 V supply

2. Relevant equations

P = V²/R

3. The attempt at a solution

a) R = 240²/100 = 576 Ω

Power dissipated when the bulb is put across a 120 V supply = 120²/576 = 25 W.

The bulb glows dimmer than normal.

b) R = 120²/100 = 144 Ω

Power dissipated when the bulb is put across a 240 V supply = 240²/144 = 400 W.

Since the power dissipated exceeds the rated power, the bulb fuses.
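The arithmetic above can be checked with a short script. It follows the same constant-resistance assumption the problem uses (the function names are just illustrative):

```python
# Check the rated-resistance arithmetic for both bulbs,
# assuming (as in the problem) that resistance stays constant.

def rated_resistance(v_rated, p_rated):
    """R = V^2 / P, computed from the nameplate rating."""
    return v_rated ** 2 / p_rated

def power_at(v_supply, resistance):
    """P = V^2 / R at the actual supply voltage."""
    return v_supply ** 2 / resistance

# a) 240 V 100 W bulb on a 120 V supply
r_a = rated_resistance(240, 100)   # 576 ohms
p_a = power_at(120, r_a)           # 25 W -> much dimmer than rated

# b) 120 V 100 W bulb on a 240 V supply
r_b = rated_resistance(120, 100)   # 144 ohms
p_b = power_at(240, r_b)           # 400 W -> four times the rating

print(r_a, p_a)  # 576.0 25.0
print(r_b, p_b)  # 144.0 400.0
```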

Is this reasoning correct?

Many thanks!

2. Aug 26, 2014

### ehild

It is correct. The actual powers are not exactly what you calculated, because the filament's resistance changes considerably with temperature, but the outcome is right.

ehild

3. Aug 27, 2014

### Vibhor

Thank you very much :)

Could you explain how we would calculate the power in the two cases? Or what the correct reasoning for this question should have been?

By the way, the answer given for part a) is that the bulb fuses. The answer to part b) is not given. I think the answer key has got it wrong.

4. Aug 27, 2014

### ehild

You cannot calculate it without knowing the relation between power and temperature, and between resistance and temperature. It would be too complicated. So all you can do is what you did. Your reasoning is correct.
For fun, try measuring the resistance of a light bulb when it is cold. Use a simple multimeter, which you surely have at home or at school. You will see that it is only a few ohms, while the resistance calculated from the nominal power rating and voltage is a few hundred ohms.
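This effect can be roughly illustrated with a linear resistance-temperature model. The coefficient α ≈ 4.5×10⁻³ K⁻¹ for tungsten and the operating temperature of about 2700 K are assumed illustrative values, and a linear law is only a crude approximation over such a wide temperature range:

```python
# Rough estimate of cold vs. hot filament resistance, assuming a
# linear model R_hot = R_cold * (1 + alpha * (T_hot - T_cold)).
# alpha and T_HOT are assumed illustrative values, not measured data.

ALPHA = 4.5e-3   # per kelvin, approximate coefficient for tungsten
T_HOT = 2700.0   # assumed filament operating temperature, K
T_COLD = 300.0   # room temperature, K

# Hot resistance from the nameplate rating of the 240 V 100 W bulb
r_hot = 240 ** 2 / 100   # 576 ohms

# Invert the linear law to estimate the cold resistance
r_cold = r_hot / (1 + ALPHA * (T_HOT - T_COLD))

print(round(r_hot), round(r_cold))  # 576 49
```

Even this crude model puts the cold resistance at roughly a tenth of the hot value; measured cold resistances of real bulbs can be lower still, since α itself varies with temperature.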

No, the bulb will not fuse if you connect it to a voltage less than nominal! Your answer was correct.

ehild

5. Aug 27, 2014

### phinds

In the real world, the answer to b) is that the bulb would burn out pretty much immediately, but I assume the question means an "ideal bulb" for which power dissipation is not an issue.