Hi guys and gals, please forgive the lame-o title, first poster here! I recently picked up a pretty fierce interest in electronics, and I've been doing plenty of reading and experimentation, but I have a few questions that I'm hoping will land on generous ears.
What follows is a "brain dump" of what I have stored in my head; I would really appreciate any corrections!
So I believe I understand V = IR to a "workable" degree.
What I think I know:
-Voltage V equals current in amps times resistance in ohms.
-Voltage is a difference in electric potential between two points; it's the driver for electron flow.
-Amperage is a "simple" count of charge moving past any point per second (coulombs/sec).
-Wattage is a bona fide rate of work (joules/sec).
-Electron flow and the energy carried by that flow are two interrelated but very different things.
-Wattage can be calculated by multiplying volts and amps. Lacking any voltage info, one can substitute IR for V, meaning wattage is also I²R (quick sanity check after this list).
-Amps alone describe electron flow in quantity, but without voltage you don't know if it's a "fast mover" with a "narrow pipe" or a "slow mover" with a "fat pipe".
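Here's the quick Python sanity check I ran to convince myself the identities line up (the resistance is a made-up number, just for illustration):

# Pick a voltage and a made-up load resistance, derive the rest
V = 19.5            # volts (borrowing the laptop supply's rating)
R = 5.0             # ohms, an arbitrary load for illustration
I = V / R           # Ohm's law gives the current
print(I)            # 3.9 A
print(V * I)        # 76.05 W
print(I**2 * R)     # 76.05 W -- same answer via substituting V = IR
print(V**2 / R)     # 76.05 W -- same answer via substituting I = V/R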
Here's the lead up question:
-If a laptop power supply rated at, for example, 19.5 V @ 3.4 A is plugged in but not connected to anything, a voltage reading across its leads would give me 19.5 V, but there would be no amps. That is, the potential difference is there, but there is no current.
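Translating that into a toy check (treating "nothing connected" as infinite resistance):

# Open circuit: no complete path, effectively infinite resistance
V = 19.5
R_open = float("inf")   # leads not connected to anything
I = V / R_open
print(I)                # 0.0 A -- the potential is there, but no current flows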
If I were to plug a 19.5 V fan into this power supply, it would run. If I were to add a second fan in series, would the second fan run at all? Why or why not?
I would think that if there is no voltage left after the first fan, there would be no "power" to drive it; in my head, 0 V x any amps = 0 Watts no matter what. In this case, would both fans share the voltage evenly? If not evenly, then what dictates the voltage split?
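Trying to answer my own question, I modeled each fan as a plain resistor (a big simplification, since real fans aren't simple resistive loads, but it shows what dictates the split):

# Two loads in series share one current; voltage splits in
# proportion to resistance (fans modeled as plain resistors,
# which is a rough assumption)
V_supply = 19.5
R1 = 10.0                # made-up resistance for fan 1
R2 = 10.0                # made-up resistance for fan 2
I = V_supply / (R1 + R2) # one loop, so one shared current
print(I * R1, I * R2)    # 9.75 9.75 -- identical loads split evenly

So at least in this resistor model, the second fan doesn't see zero volts; identical fans each see about half the supply voltage.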
If I were to put them in parallel, then both fans would get 19.5 V but would have to share amperage, so they would run, given enough source amps.
Here's the real question:
Let's say I connected a 16.5 volt fan to the same power supply; it draws 4.2 A at its rated voltage. The power supply provides 19.5 volts @ 3.4 amps. So I have a surplus of voltage and a shortfall of amps. What happens in this case?
From my actual experiments, I've had to put a bank of paralleled resistors in series with the fan in order to not overdrive it, and yet I don't understand the relationship...
I knew I had to drop those 3 extra volts, regardless of the lack of amps. I ended up doing this:
3 V = 3.4 A x R
R ≈ 0.88 ohms
But then I thought, 3.4A can't be right, can it?
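Redoing the math with the fan's own rated draw instead (this assumes the fan really pulls its rated 4.2 A, which is questionable since my supply tops out at 3.4 A):

# Dropping resistor sized from the load's current, not the
# supply's maximum rating
V_supply = 19.5
V_fan = 16.5
I_fan = 4.2                           # the fan's rated draw
R_drop = (V_supply - V_fan) / I_fan   # resistance needed to drop 3 V
P_drop = (V_supply - V_fan) * I_fan   # heat dissipated in the resistor
print(R_drop)   # ~0.71 ohms
print(P_drop)   # 12.6 W -- would need a serious power resistor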
So then I went with watts to see it from another angle:
Power supply is 19.5 V x 3.4 A = 66.3 Watts.
Fan is 16.5 V x 4.2 A = 69.3 Watts.
From the resistor calculation: 3 V x 3.4 A = 10.2 Watts.
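Or, the same arithmetic in one place:

# The three power figures above, side by side
P_supply_max = 19.5 * 3.4   # 66.3 W -- the most the supply is rated for
P_fan_rated  = 16.5 * 4.2   # 69.3 W -- what the fan wants at its rating
P_resistor   = 3.0 * 3.4    # 10.2 W -- my (suspect) resistor estimate
print(P_supply_max, P_fan_rated, P_resistor)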
If my fan wants/can handle ~69 W and the power source provides only ~66, why would I need to "burn off" about 10 Watts of power via the resistor, even though my fan wants more amperage than the power supply can provide?
If component "draw" is met on demand by my power supply, up to its max amperage, what happens when the supply doesn't have enough amperage?
It seems that even though W = VA, volts can't be turned into amps, or vice versa?
It feels like voltage is somehow "more important" than amperage? I have a feeling I'm missing something here. Am I crazy? Help!
Thanks!