Say I have two light bulbs: Bulb A is designed for a 120 V system (a larger built-in resistance) and Bulb B is designed for a 3 V flashlight (a smaller built-in resistance). Assuming I can hook the bulbs up to a 120 V series circuit without blowing them out, it seems counterintuitive that Bulb A would use more energy, but is that accurate? Would Bulb A therefore be brighter? I am interested in what happens in a series circuit, and I understand that hooking the bulbs up in a parallel circuit would give a different result. Thanks
In series the current I through both bulbs will be the same, so the energy dissipated, I²R, will be greater for the higher-resistance bulb. Which bulb is actually brighter will depend on the details of their construction, as these details control how much of the energy is dissipated as heat and how much as visible light. If you really want to understand this, try estimating reasonable resistances for each bulb (assume the larger bulb consumes a few tens of watts at 120 V and the smaller one a few watts at 3 V; then P = IV and Ohm's law will see you home) and calculate what happens in the series circuit under various voltages.
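To make that estimate concrete, here is a quick sketch of the suggested calculation. The rated wattages (40 W for the mains bulb, 1 W for the flashlight bulb) are assumed round numbers, not figures from the question, and real filament resistance varies with temperature, which this ignores:

```python
# Assumed ratings (hypothetical round numbers for illustration)
V_A, P_A_rated = 120.0, 40.0   # bulb A: 120 V mains bulb
V_B, P_B_rated = 3.0, 1.0      # bulb B: 3 V flashlight bulb

# From P = V^2/R, estimate each bulb's cold-circuit resistance
R_A = V_A**2 / P_A_rated   # 360 ohms
R_B = V_B**2 / P_B_rated   # 9 ohms

# Put them in series across 120 V: one current through both
V = 120.0
I = V / (R_A + R_B)        # about 0.33 A

# Power dissipated in each: P = I^2 R
P_A = I**2 * R_A           # about 38 W
P_B = I**2 * R_B           # about 1 W

print(f"I = {I:.3f} A, P_A = {P_A:.1f} W, P_B = {P_B:.2f} W")
```

With these assumed numbers, the high-resistance bulb gets nearly all of the power, which is the counterintuitive result the question is asking about.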
Replace the bulbs with basic resistors and see what happens. Let's say R1 is 1,000 ohms, R2 is 100 ohms, and we have 110 volts.

Total resistance in the circuit: 1,100 ohms
Total current: 110 V / 1,100 ohms = 0.1 A

Voltage drop across each resistor:
R1: 0.1 A × 1,000 ohms = 100 V
R2: 0.1 A × 100 ohms = 10 V

Power dissipated:
Total: 110 V × 0.1 A = 11 W
R1: 100 V × 0.1 A = 10 W
R2: 10 V × 0.1 A = 1 W

What if we just looked at the power dissipated by each resistor/bulb by itself, across the full 110 V?

R1 current: 110 V / 1,000 ohms = 0.11 A, so R1 power: 110 V × 0.11 A = 12.1 W
R2 current: 110 V / 100 ohms = 1.1 A, so R2 power: 110 V × 1.1 A = 121 W

Well, that's quite a difference! By itself, R2, which has a much smaller resistance, dissipates MUCH more power than R1. This is because the reduced resistance lets the applied voltage move far more charge. These moving charges have to lose their energy somewhere, and in our ideal circuit the only place possible is the resistor. In a real circuit some would be lost in the power source and the conductors as well.

Now think about what power is. It is the work performed by the circuit over time. What work are we performing? We are moving charges. That's it. That's where all the work is going.

Imagine that the circuit is a water circuit for a moment. If we had no resistance to water flow but could keep applying pressure, the water would accelerate forever. Obviously this doesn't happen, because there is resistance and our pumps can only supply so much pressure. The same thing happens in an electric circuit: the charges cannot keep speeding up forever, since there is resistance in the circuit and our power supply cannot deliver infinite current. The work performed on the charges gives them energy, and this energy must be lost somewhere.
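The arithmetic above can be reproduced in a few lines, which also makes it easy to swap in other resistances or voltages and re-run the comparison:

```python
V = 110.0                 # supply voltage
R1, R2 = 1000.0, 100.0    # the two resistors standing in for the bulbs

# In series: one current, and the voltage divides in proportion to R
I = V / (R1 + R2)         # 0.1 A
V1, V2 = I * R1, I * R2   # 100 V and 10 V
P1, P2 = V1 * I, V2 * I   # 10 W and 1 W

# Each resistor alone across the full 110 V
I1_alone = V / R1         # 0.11 A
I2_alone = V / R2         # 1.1 A
P1_alone = V * I1_alone   # 12.1 W
P2_alone = V * I2_alone   # 121 W

print(f"series: P1 = {P1:.0f} W, P2 = {P2:.0f} W")
print(f"alone:  P1 = {P1_alone:.1f} W, P2 = {P2_alone:.0f} W")
```

Note how the roles flip: alone, the small resistor dissipates 121 W versus 12.1 W, but in series the large resistor dissipates ten times more than the small one.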
When you put R1 and R2 in series, their combined resistance drastically reduces the overall current. This means fewer charges moved and less total power to dissipate. R1 dissipates more power than R2 because of its higher resistance. This is analogous to a water circuit with one very narrow section of pipe: it is at that constriction that most of the water's power is lost, not in the much larger pipe farther down. Does all that make sense?