Hello everyone. I have been working on my IB extended essay in physics these past few months. My topic is the current surge that occurs when a light bulb is first switched on, caused by the temperature dependence of the filament's resistance. I have finished the theoretical part, and long story short, as I expected, the current-vs-time graph should show a hyperbola-like decay that settles at an equilibrium value.

However, once I started my experiments (DC power supply, light bulb with socket, and a voltmeter), the effect I predicted only appeared under certain odd conditions. When the circuit was already complete and I then plugged the power supply's cable into the wall outlet, I recorded the current surge, which then settled at equilibrium (recorded with Vernier equipment, FYI). But when the power supply was already plugged into the outlet and I completed the circuit afterwards, the current just jumped straight to a stable value with no surge.

I find this rather strange and don't see a reason why it would happen. I tried fiddling with the connections, and I waited long enough for the bulbs to cool between trials. Is it possible that there is a surge only when the DC power supply itself is powered up? But shouldn't a light bulb always show a current surge, since it isn't ohmic? And which of the two conditions better represents a regular light bulb connected to a building's mains supply? Thank you for your help.
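For context, here is a minimal numerical sketch of the model I have in mind: a lumped filament with linear R(T) heated by Joule dissipation and cooled by radiation, integrated with a simple Euler step. All parameter values (V, R0, alpha, heat capacity, radiation constant) are illustrative guesses, not my measured numbers.

```python
# Lumped-parameter filament warm-up model (illustrative parameters only).
V = 12.0         # supply voltage [V] (assumed)
R0 = 1.5         # cold filament resistance at T0 [ohm] (assumed)
ALPHA = 4.5e-3   # temperature coefficient of resistance for tungsten [1/K]
T0 = 300.0       # ambient temperature [K]
C_TH = 5e-3      # filament heat capacity [J/K] (assumed)
K_RAD = 2.2e-13  # radiative-loss constant [W/K^4] (assumed)

def resistance(T):
    """Linear R(T) model for a metal filament."""
    return R0 * (1.0 + ALPHA * (T - T0))

def simulate(dt=1e-4, t_end=2.0):
    """Euler-integrate C_th * dT/dt = V^2/R(T) - k*(T^4 - T0^4),
    returning the current at each time step."""
    T = T0
    currents = []
    for _ in range(int(t_end / dt)):
        R = resistance(T)
        p_in = V * V / R                # Joule heating
        p_out = K_RAD * (T**4 - T0**4)  # radiative loss
        T += (p_in - p_out) * dt / C_TH
        currents.append(V / R)
    return currents

currents = simulate()
print(f"inrush current : {currents[0]:.2f} A")
print(f"steady current : {currents[-1]:.2f} A")
```

With these made-up numbers the current starts high (cold, low-resistance filament) and decays monotonically to a much smaller equilibrium value, which is the shape my theory predicts for a clean turn-on.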