I recently learned that Ohm's law is not always obeyed. I found this surprising, because I thought ohms were defined in terms of volts and amperes by the equation R = V/I. I did a little googling and found the following definition...

This clarified things a little, in that the definition of an ohm specifies exactly 1 ampere, 1 volt, and 1 ohm. So if I have a wire with one ohm of resistance, I know that if I apply 1 volt of potential, 1 ampere of current will flow. If, however, I apply 10 volts, more current will flow, the wire will heat up, and its resistance will increase, so less than 10 amperes will flow.

My question is: if other conditions (such as temperature) are artificially held constant as voltage and current vary, does Ohm's law hold more closely, or exactly?

Consider a spark plug. If I try to measure its resistance using Ohm's law while applying 100,000 volts, will I get the same answer as if I ionize the air between the electrodes with another heat source and measure the resistance with 1 volt?
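To make the self-heating effect concrete, here is a rough numerical sketch. It assumes the textbook linear temperature coefficient R = R0(1 + α ΔT), and a made-up thermal model (temperature rise proportional to dissipated power, with an arbitrary constant k) purely for illustration; real wires are more complicated.

```python
def steady_state_current(V, R0=1.0, alpha=0.004, k=5.0, iters=100):
    """Iterate to a self-consistent current for a self-heating resistor.

    V     -- applied voltage in volts
    R0    -- cold resistance in ohms
    alpha -- temperature coefficient per degree C (copper-like value)
    k     -- assumed degrees C of temperature rise per watt (illustrative)
    """
    R = R0
    for _ in range(iters):
        I = V / R
        dT = k * V * I            # crude model: temperature rise ~ power
        R = R0 * (1 + alpha * dT) # resistance rises linearly with temperature
    return I

# At 1 V the wire barely heats, so the current is close to V/R0 = 1 A.
# At 10 V the dissipated power raises the resistance substantially,
# so the steady-state current is well under the naive 10 A.
print(steady_state_current(1.0))
print(steady_state_current(10.0))
```

With these invented numbers the 10 V case settles at 5 A rather than 10 A, while the 1 V case stays within a couple of percent of 1 A, which is exactly the asymmetry described above: the "resistance" V/I you measure depends on how much the measurement itself heats the conductor.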