Your task is to design a "dummy" indicator light for your car. The cheapest bulb available is a #47 incandescent lamp rated at 6.3 V, 150 mA. The problem is, your car produces 13.5 V when it's on. Choose the nearest 10% resistor that will reduce the voltage across the bulb to 6.3 V.
V = IR
The Attempt at a Solution
I modelled the bulb as a resistor:
6.3 V = 0.150 A * R
Which gives R = 42 [itex]\Omega[/itex].
From there, I plugged this value into a new circuit with a 13.5 V source instead of a 6.3 V source, and added the unknown resistor to the equation:
13.5 V = 0.150 A * R + 6.3 V
7.2 V = 0.150 A * R
R = 48 [itex]\Omega[/itex].
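As a quick sanity check of the arithmetic above, here's a short script (variable names are my own) that reproduces both steps under the same bulb-as-fixed-resistor assumption:

```python
# Sanity check: treat the #47 bulb as a fixed resistor at its rated
# operating point. (A real filament's resistance rises with temperature,
# so this is only an approximation.)
V_SUPPLY = 13.5   # car's supply voltage, V
V_BULB = 6.3      # bulb's rated voltage, V
I_BULB = 0.150    # bulb's rated current, A

r_bulb = V_BULB / I_BULB                  # effective bulb resistance
r_series = (V_SUPPLY - V_BULB) / I_BULB   # series resistor needed

print(f"bulb: {r_bulb:.0f} ohm, series resistor: {r_series:.0f} ohm")
```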
The nearest 10% resistor value is 47 [itex]\Omega[/itex], but this wouldn't reduce the voltage across the bulb to 6.3 V. The next closest 10% resistor is 56 [itex]\Omega[/itex]. It just seems that this value is too high ... Did I follow this process correctly? What would you choose?
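To see how much either standard value actually matters, you can work the voltage divider both ways (still treating the bulb as a fixed 42 [itex]\Omega[/itex] resistor, which is a simplification):

```python
V_SUPPLY = 13.5   # car supply, V (from the problem)
R_BULB = 42.0     # bulb modeled as a fixed resistor, ohms

# Voltage divider: V_bulb = V_supply * R_bulb / (R_bulb + R_series)
for r_series in (47.0, 56.0):
    v_bulb = V_SUPPLY * R_BULB / (R_BULB + r_series)
    print(f"{r_series:.0f} ohm -> bulb sees {v_bulb:.2f} V")
```

With 47 [itex]\Omega[/itex] the bulb sees about 6.37 V, just over its rating; with 56 [itex]\Omega[/itex] it sees about 5.79 V, which under-runs the bulb and only dims it. Since the filament's resistance isn't really constant, either answer is within the accuracy of the model, so the choice comes down to whether you'd rather slightly over- or under-drive the lamp.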