1. The problem statement, all variables and given/known data

A light bulb is rated 60 W at 230 V. By how much does the voltage need to drop for the bulb to dissipate only 40 W?

2. Relevant equations

V = IR, P = IV, P = I²R, P = V²/R

3. The attempt at a solution

I tried 60 W / 230 V = 0.26 A, then 40 W = 0.26 A × V, so V = 154 V and the drop would be 230 − 154 = 76 V. But that assumes the current stays constant, and I know the current drops along with the voltage; this is all I can think of.

Never mind, I found it. Treat the filament resistance as constant: R = V²/P = 230²/60 ≈ 882 Ω. Then 40 W = V²/882 Ω gives V² ≈ 35270 (I had accidentally divided instead of multiplied before), so V ≈ 188 V and the voltage drop is about 230 − 188 = 42 V.
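The constant-resistance approach above can be checked with a short script. This is only a sketch of the textbook model: it assumes the filament resistance at 40 W is the same as at the 60 W rating, which a real incandescent filament does not quite satisfy (its resistance falls as it cools).

```python
import math

# Rated operating point and target power (from the problem statement).
P_rated, V_rated = 60.0, 230.0   # watts, volts
P_target = 40.0                  # watts

# Filament resistance from P = V^2 / R, assumed constant (simplification).
R = V_rated**2 / P_rated

# Invert P = V^2 / R to find the voltage that gives the target power.
V_target = math.sqrt(P_target * R)
drop = V_rated - V_target

print(round(R, 1))        # ~881.7 ohms
print(round(V_target, 1)) # ~187.8 V
print(round(drop, 1))     # ~42.2 V
```

Note that the naive first attempt (holding the current fixed at 0.26 A) overstates the drop, because at lower voltage the current through a fixed resistance falls too, so the power falls with V² rather than with V.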