Can Ohm's Law be applied to all types of devices and power sources?

  • Thread starter: jeff davis
  • Tags: Wires

Discussion Overview

The discussion revolves around the applicability of Ohm's Law to various devices and power sources, particularly in the context of current draw, voltage, and resistance. Participants explore scenarios where devices may draw more current than expected and the implications of using undersized wires or incorrect voltage levels.

Discussion Character

  • Exploratory
  • Technical explanation
  • Debate/contested

Main Points Raised

  • Some participants express confusion about how devices can draw more current than expected, particularly in relation to Ohm's Law.
  • It is suggested that using a smaller wire increases resistance, which could lead to the wire burning out before the connected device.
  • One participant proposes that if a constant power source is used, decreasing voltage will increase current, potentially causing device failure.
  • Another participant notes that devices may draw more current if supplied with a higher voltage than designed for or if there is a short circuit fault.
  • Concerns are raised about the fire hazards associated with undersized wires and their impact on voltage drop at appliances under load.
  • Some participants clarify that while resistive loads follow Ohm's Law, motors and other devices may behave differently under low voltage conditions, potentially drawing excessive current.

Areas of Agreement / Disagreement

Participants generally agree on the basic principles of Ohm's Law and the effects of resistance and voltage on current draw. However, there is disagreement regarding the behavior of different types of devices (e.g., motors vs. resistive loads) under varying voltage conditions, indicating that the discussion remains unresolved.

Contextual Notes

Participants highlight that Ohm's Law applies under specific conditions, such as constant temperature, and that not all devices behave according to Ohm's Law, particularly when considering factors like back EMF in motors or the temperature-dependent resistance of filament lamps.

jeff davis
Hello,
I am pondering a topic and I can't quite find the answer I am looking for. It concerns amp draw. I understand Ohm's law and the relationships it entails. I am confused by the idea that people say a device will draw more amps than it is supposed to and cause things to burn up. I understand the idea of putting too much power into something and burning it up, but I just want to make sure my thinking is correct in the context of "drawing too much current".
If you have X amount of voltage and X amount of resistance, then you can only draw a certain amount of current, correct? Just because you have a smaller wire (which will increase the resistance) does not mean that you are pulling any more amps or volts; rather, you are decreasing them, right? So the wire will burn up, and not the device it is connected to?
That came out pretty mish-mashed because I am having a hard time explaining what I want to know.

I guess I could just ask directly:
1.) Can you burn up a device by using too small a power cord? (Other than a motor.)
2.) When considering power (P = IV), how can you possibly put less voltage on something and have it draw way more current? If you had something rated 120 V, 400 W, would it burn up if you hooked only 2 V to it? Would the current then be 200 A instead of 10/3 A? This idea goes against my understanding of Ohm's law.
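The arithmetic behind question 2 can be sketched as follows, treating the device as a plain resistor (the 120 V / 400 W figures are from the post; the resistive model is an assumption that the later replies qualify):

```python
# Sketch of the question's numbers, assuming a purely resistive load.
def resistance_from_rating(v_rated, p_rated):
    """R = V^2 / P for a resistive load at its rated operating point."""
    return v_rated ** 2 / p_rated

R = resistance_from_rating(120.0, 400.0)  # 36 ohms

# Ohm's law: the current is set by V and R, not by the power rating,
# so lowering the voltage lowers the current.
i_rated = 120.0 / R     # 10/3 A, the rated current from the question
i_low   = 2.0 / R       # ~0.056 A at 2 V -- it does not jump to 200 A
p_low   = 2.0 ** 2 / R  # ~0.11 W actually delivered at 2 V

print(R, i_rated, i_low, p_low)
```

The 200 A figure would only follow if something forced the full 400 W through the device at 2 V, which an ordinary supply does not do.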
 
I am going to take a crack at your questions, but hopefully someone else can give a better answer or even correct mine.

1) Using a smaller wire will increase the resistance of that wire (modeling the wire as a simple resistor), as you said. This will cause the wire to burn out before the device it is connected to.

2) If you are using a constant power source (meaning P is constant), then by the equation you have provided (P = IV), decreasing the voltage will increase the current. This may cause the device to burn out.
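The distinction in point 2 can be made concrete. A quick sketch, using the 36-ohm load implied by the 120 V / 400 W rating (an assumption, and note that ordinary mains is close to a constant-voltage source, so the constant-power case only applies to a supply regulated to hold P fixed):

```python
# Current drawn by a fixed resistance from two idealized source types.
R = 36.0  # ohms, from the 120 V / 400 W rating in the question

def i_constant_voltage(v, r):
    """Constant-voltage source: Ohm's law, I falls as V falls."""
    return v / r

def i_constant_power(p, v):
    """Constant-power source: P = IV, I rises as V falls."""
    return p / v

i_cv = i_constant_voltage(2.0, R)     # tiny current at 2 V
i_cp = i_constant_power(400.0, 2.0)   # 200 A -- the figure from the question
```

The 200 A in the original question comes from assuming the constant-power case; a typical wall outlet behaves like the constant-voltage case instead.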
 
Hi there Jeff

jeff davis said:
Hello,
I am pondering a topic and I can't quite find the answer I am looking for. It concerns amp draw. I understand Ohm's law and the relationships it entails. I am confused by the idea that people say a device will draw more amps than it is supposed to and cause things to burn up. I understand the idea of putting too much power into something and burning it up, but I just want to make sure my thinking is correct in the context of "drawing too much current".

There are two main reasons why a device will draw more current than it's supposed to:

1) You supply a higher voltage than what it was designed for, giving more current draw (Ohm's law).
2) There is a low-resistance / short-circuit fault in the device. This will result in a higher current being drawn, even though the device is receiving the correct voltage.
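Both cases can be put in numbers. A minimal sketch with assumed values (a device whose normal resistance is 36 ohms, a doubled supply voltage for case 1, and a 0.5-ohm internal fault for case 2 -- all illustrative, not from the thread):

```python
# Two ways a device draws more current than intended, per Ohm's law.
R_NORMAL = 36.0  # ohms, assumed normal resistance of the device

def current(v, r):
    return v / r

i_normal   = current(120.0, R_NORMAL)  # ~3.3 A, normal operation
i_overvolt = current(240.0, R_NORMAL)  # case 1: higher voltage -> ~6.7 A
i_short    = current(120.0, 0.5)       # case 2: internal short -> 240 A
```

In both cases Ohm's law still holds; what changed is the voltage applied or the resistance presented, not the law itself.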
If you have X amount of voltage and X amount of resistance, then you can only draw a certain amount of current, correct?

Correct
Just because you have a smaller wire (which will increase the resistance) does not mean that you are pulling any more amps or volts; rather, you are decreasing them, right? So the wire will burn up, and not the device it is connected to?

Correct. You will get to a point where the wire cannot carry the current required by the device, and it will burn up (fuse).

This is how a fuse protects something: its diameter and makeup are designed to carry only a certain current.


I guess I could just ask directly:
1.) Can you burn up a device by using too small a power cord? (Other than a motor.)

No, see the comments above.
2.) When considering power (P = IV), how can you possibly put less voltage on something and have it draw way more current? If you had something rated 120 V, 400 W, would it burn up if you hooked only 2 V to it? Would the current then be 200 A instead of 10/3 A? This idea goes against my understanding of Ohm's law.

No, not for purely resistive loads (a light globe, a heater element, etc.), but motors are an exception. When the voltage on a motor is lowered while it is under load, it will draw more current, heat up, and burn out the wire windings.
This is because motors present a much more complex load to the power supply.

cheers
Dave
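The motor exception comes down to back EMF: the winding current is set by (supply voltage − back EMF) / winding resistance, and back EMF is roughly proportional to shaft speed. A minimal sketch with made-up motor constants (both values are assumptions for illustration):

```python
# Simplified DC-motor model: low voltage under load -> low speed ->
# little back EMF -> large winding current.
R_WINDING = 0.5  # ohms, assumed winding resistance
K_EMF = 0.02     # volts per rpm, assumed back-EMF constant

def motor_current(v_supply, rpm):
    back_emf = K_EMF * rpm
    return (v_supply - back_emf) / R_WINDING

i_running = motor_current(12.0, 550)  # near full speed: back EMF ~11 V, ~2 A
i_stalled = motor_current(12.0, 0)    # stalled/bogged down: 24 A
```

The stalled case is why an undervolted motor under load overheats: the windings see nearly the full supply voltage across their small resistance.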
 
Undersized wires cause a voltage drop at the appliance under load. You could design a circuit that would burn up if the input voltage gets too low. One would hope that engineers do the opposite and consider what will happen to their design if the voltage drops.

Undersized wires are a problem unto themselves. They don't need any reason to be avoided other than the fact that they are a fire hazard.
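The voltage drop at the appliance is just a voltage divider: the wire resistance and the load resistance split the supply voltage between them. A sketch with assumed numbers (the 36-ohm load and the two wire resistances are illustrative):

```python
# Undersized wire as a series resistor: the appliance sees less voltage.
def v_at_appliance(v_supply, r_wire, r_load):
    return v_supply * r_load / (r_wire + r_load)

R_LOAD = 36.0  # ohms, assumed appliance resistance

v_good = v_at_appliance(120.0, 0.1, R_LOAD)  # adequate wire: ~119.7 V
v_bad  = v_at_appliance(120.0, 5.0, R_LOAD)  # undersized wire: ~105 V
```

The power dissipated in the wire itself (I²R_wire) is what makes the undersized wire a fire hazard, independent of what it does to the appliance.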
 
Undersized wires cause a voltage drop at the appliance under load.

Yes, that is correct, particularly where high currents are involved, as with my transceiver gear, where I'm hauling >20 A at 13.8 V.
BUT the voltage drop will not damage the transmitter; it will just mean the transmitter cannot produce full output power.

As a result, I use a suitably sized conductor that is still easily flexible, and I keep the lead as short as practical.

The DC power lead that comes with the radio is usually ~6-8 ft long. Most of us cut that down to ~4 ft, and this can remove up to a 2 V drop during transmit.

Dave
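The effect of shortening the lead is linear in length, since the lead's resistance is. A sketch using the 20 A figure from the post (the per-foot resistance is an assumed round-trip value for illustration, not a quoted spec):

```python
# Voltage lost in a DC power lead: V_drop = I * R_lead, R_lead ~ length.
OHMS_PER_FT = 0.006  # assumed round-trip lead resistance per foot

def lead_drop(current_a, length_ft, ohms_per_ft=OHMS_PER_FT):
    return current_a * length_ft * ohms_per_ft

drop_8ft = lead_drop(20.0, 8.0)  # ~0.96 V lost in an 8 ft lead
drop_4ft = lead_drop(20.0, 4.0)  # halving the length halves the drop
```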
 
Thanks guys for your help. The idea is a lot more clear now. I suppose that when something is labeled 120 V, 400 W, that is more a ratio (and a rating, I am sure) than anything, right? It means that when you put 120 V on it you get 400 W of work? So if I connected 2 V to it instead, and if it even worked at all, I would only get about 1/9 W (since the current would also drop, to 1/18 A)?

Thanks again!
 
jeff davis said:
Thanks guys for your help. The idea is a lot more clear now. I suppose that when something is labeled 120 V, 400 W, that is more a ratio (and a rating, I am sure) than anything, right? It means that when you put 120 V on it you get 400 W of work? So if I connected 2 V to it instead, and if it even worked at all, I would only get about 1/9 W (since the current would also drop, to 1/18 A)?

Thanks again!

This is only correct if the device happens to be a heater (i.e. resistive and not operating at high temperature). If a motor is supplied with very low volts, it will not manage to turn at all and will therefore produce no back EMF. It will consequently take much more current than the simple formula would suggest. Likewise, for a filament lamp, the low-temperature resistance may be 1/10 of the normal resistance when it is glowing. There are other devices that will take virtually no current from a low-voltage supply (very high resistance).
Basically, only resistive devices follow Ohm's Law, and you should remember that Ohm's Law describes behaviour at constant temperature. Strictly, Ohm's Law is not the 'definition' of resistance; it states that resistance is constant at constant temperature (a subtle difference).
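The filament-lamp point translates directly into an inrush-current calculation. A sketch using the 1/10 cold-to-hot resistance ratio from the post (the 60 W lamp wattage is an assumed example):

```python
# Filament lamp: cold resistance ~1/10 of hot, so switch-on current is ~10x.
V = 120.0
R_HOT = V ** 2 / 60.0   # a 60 W lamp when glowing: 240 ohms
R_COLD = R_HOT / 10.0   # ~24 ohms at room temperature (ratio from the post)

i_steady = V / R_HOT    # 0.5 A once the filament is hot
i_inrush = V / R_COLD   # ~5 A at the instant of switch-on
```

This temperature dependence is exactly why the lamp is not an "ohmic" device in the strict constant-temperature sense described above.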
 
