Understanding Lightbulbs: Solving the Confusing Voltage Question in Physics


Homework Help Overview

The discussion revolves around understanding the voltage requirements for lightbulbs, specifically how the voltage drop affects the power output of a 60-watt bulb compared to a 40-watt bulb. The original poster expresses confusion regarding the relationship between voltage, current, and resistance, particularly in the context of a physics test question.

Discussion Character

  • Exploratory, Conceptual clarification, Mathematical reasoning, Assumption checking

Approaches and Questions Raised

  • The original poster attempts to calculate the voltage drop needed for the bulbs based on power ratings but struggles with the lack of current information. Some participants suggest using power relationships to find resistance and question the assumptions regarding efficiency and filament resistance.

Discussion Status

Participants are exploring various approaches to the problem, with some suggesting that assumptions about constant resistance and efficiency may not hold true. The original poster reflects on their calculations and expresses uncertainty about their reasoning, indicating a lack of consensus but a productive exploration of the topic.

Contextual Notes

The original poster notes that the problem lacks specific information about current and resistance, which complicates their calculations. There is also a mention of the relation P = VI and its implications for the problem at hand.

Jarfi
So I got this wrong on my physics test. The question goes: the original voltage is 230 V, and the lightbulbs are rated assuming that voltage. How much does the voltage need to drop for a 60 W bulb to start glowing like a 40 W bulb?

I first tried to find the resistance of each bulb but couldn't, since no current is given. I tried to find a current but realized that both the current and the resistance could vary and give the same voltage, V = RI. So there is no fixed current or resistance to work with; for all I know it could be 1000 A and 0.230 Ω. I ended up doing the only thing I could think of:

40 W / 60 W × 230 V = 153... 230 − 153 = 77 V drop.

My test was flawless apart from this, and all the teacher wrote was "no" on my answer -_-

Can anybody explain lightbulbs and how this works to me, or I'll start losing sleep over this ;)
 
You should know the power relationships; use the one that relates power, resistance, and voltage to find the resistance of each bulb, and go from there.
 
Theoretically, you should take into account the fact that efficiency (light power output/electrical power input) varies with electrical power. And that current = V/R but R varies with voltage (current) also. So the total picture is pretty complicated.

I'd say a few assumptions need to be made. Depending on how rigorous your course is, the simplest assumption is constant filament resistance and constant efficiency.

You just took the ratio of voltages = ratio of watts. But if I drop the voltage in half, do I really drop the power in half? I don't think so. So I would look at those ratios again.
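The ratio point above can be checked numerically. A minimal sketch, assuming constant filament resistance (so P = V²/R and power scales with the *square* of voltage, not linearly):

```python
from math import sqrt

V_RATED = 230.0  # rated voltage from the problem (V)

# Mistaken approach: assume power scales linearly with voltage.
v_linear = (40 / 60) * V_RATED       # ratio of watts = ratio of volts

# Constant-resistance approach: P = V^2 / R, so P scales with V^2,
# and the voltage ratio is the square root of the power ratio.
v_quadratic = sqrt(40 / 60) * V_RATED

print(f"linear ratio:    {v_linear:.1f} V")   # ~153.3 V (the test answer marked wrong)
print(f"quadratic ratio: {v_quadratic:.1f} V")  # ~187.8 V
```

Halving the voltage under constant resistance quarters the power, which is why the linear ratio overestimates how far the voltage must fall.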
 
rude man said:
Theoretically, you should take into account the fact that efficiency (light power output/electrical power input) varies with electrical power. And that current = V/R but R varies with voltage (current) also. So the total picture is pretty complicated.

I'd say a few assumptions need to be made. Depending on how rigorous your course is, the simplest assumption is constant filament resistance and constant efficiency.

You just took the ratio of voltages = ratio of watts. But if I drop the voltage in half, do I really drop the power in half? I don't think so. So I would look at those ratios again.

There is no account of efficiency or anything like that. We are given the relation P = VI, so I can simply use that to find the current: 60 W / 230 V = 0.26 A. Then R = 230 V / 0.26 A ≈ 885 Ω. Then I know the power has dropped to 40 W, so I find the corresponding voltage: 40 W = 0.26 A × x → x = 154 V; that is, the voltage drop is 230 − 154 = 76 V.

Then I think: of course, the current must have dropped along with the voltage, so let's use an equation with only watts, voltage, and resistance. I = V/R, so P = V × V/R, that is, P = V²/R. So 40 W = V²/885 Ω → V² = 0.0452, so V = 0.21? Totally wrong and a different result.

I mean, this totally depends on current and resistance, and neither is given; you could have a high current and low resistance, or the opposite, and get the same result. What am I doing wrong?

Never mind, I found it... lol. 40 W = V²/885 Ω → V² = 35400 (I accidentally divided instead of multiplying), so V = 188 V. Case closed.
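The corrected calculation above can be reproduced end to end. A sketch assuming constant filament resistance, using the exact values rather than the rounded 0.26 A intermediate (which is why it gives 882 Ω instead of the thread's 885 Ω, with the same final answer to the nearest volt):

```python
from math import sqrt

V_RATED = 230.0  # rated voltage (V)
P_60 = 60.0      # rated power of the bulb (W)
P_40 = 40.0      # target power (W)

# At the rated operating point, P = V^2 / R gives the filament resistance.
R = V_RATED**2 / P_60        # ~881.7 ohms

# Holding R constant, the voltage that dissipates 40 W is V = sqrt(P * R).
V_new = sqrt(P_40 * R)       # ~187.8 V

drop = V_RATED - V_new       # ~42.2 V
print(f"R = {R:.1f} ohms, new voltage = {V_new:.1f} V, drop = {drop:.1f} V")
```

So the required drop is about 42 V, not the 76–77 V the linear-ratio approach gives. (As noted earlier in the thread, a real filament's resistance rises with temperature, so this is the simplest textbook model, not an exact one.)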
 
