Electrical Information on Items

Newtons Apple

Hi everyone. I'm currently just beginning my studying for my ham radio license, and had a couple of questions...

1. When looking at different items (a battery, a lightbulb, my Nintendo DS, etc.), the electrical information printed on each is different. For example, this battery I have here shows that it's 1.2 volts and 850 mAh. So why does it show these two bits of information? Why not show watts, or just amps (or milliamps, which is more practical in this case)? The same line of questioning goes for the lightbulb: the lightbulb has the watts printed on it, and it has the milliamps, but NOT mAh, like the battery does. So the main question is: why do certain devices/items use certain information? What determines this?

2. My other question is: how do I know if the information on the device/item is its input requirement to use the item, or the output rating that it gives off? Is it one of those things where you just have to use logic? Like, a fan obviously isn't going to give off electrical current, so what's printed on it MUST be what it requires to operate. What about devices that do both, like chargers? I'm looking at a charger I have here, and its "INPUT" is 4.6 V and 900 mA. So is the term INPUT referring to the amount that it takes from the wall socket in order to charge the device, or is it the amount that it "INPUTS" to the device from the wall? How do I know?

Thanks everyone in advance, hopefully these questions aren't too remedial or silly..

The battery has a chemistry that fixes the voltage at about 1.2 V. The chemical capacity of the battery is such that it could deliver 850 mA for an hour, or 425 mA for two hours, etc. After that the battery will be chemically exhausted and the voltage will drop. The energy content of a battery is important: it will not work forever.
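The battery arithmetic above can be sketched in a few lines. This is a minimal illustration assuming an ideal, constant-voltage cell (real batteries sag under load and lose capacity at high currents); the 1.2 V / 850 mAh figures come from the post.

```python
# Minimal sketch of battery capacity arithmetic for an ideal cell.
# Assumes constant voltage and linear capacity, which real batteries only
# approximate at modest discharge rates.

def runtime_hours(capacity_mah: float, load_ma: float) -> float:
    """Hours the battery can sustain a given load current."""
    return capacity_mah / load_ma

def energy_wh(voltage_v: float, capacity_mah: float) -> float:
    """Stored energy in watt-hours: Wh = V * Ah."""
    return voltage_v * capacity_mah / 1000.0

print(runtime_hours(850, 850))   # 1.0 hour at 850 mA
print(runtime_hours(850, 425))   # 2.0 hours at 425 mA
print(energy_wh(1.2, 850))       # 1.02 Wh of stored energy
```

This also shows why the label carries volts and mAh rather than watts: the voltage is fixed by the chemistry, the capacity by the cell size, while the power depends on whatever load you connect.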

The light globe converts electric energy to heat and light. The wattage is the rate it converts energy. The mA rating for the globe specifies the current needed to keep it at operating temperature. For a light globe the rate of energy conversion is important, the life is determined by accumulated damage to the filament. If you exceed the rated current it will fail sooner.
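The wattage and current on a globe are tied together by P = V × I. A quick sketch, using illustrative figures (a 60 W bulb on a 120 V supply, which is not from the original post):

```python
# Relating a bulb's rated power to its operating current via P = V * I.
# The 60 W / 120 V values are illustrative assumptions.

def current_amps(power_w: float, voltage_v: float) -> float:
    """Operating current drawn by a resistive load: I = P / V."""
    return power_w / voltage_v

print(current_amps(60, 120))  # 0.5 A, i.e. 500 mA
```

So a bulb needs only two of the three quantities printed on it; the third can always be derived.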

A charger will have a high voltage AC input from the wall socket. The output is likely to be 4.6V DC at up to 900mA.

1. When looking at different items (a battery, a lightbulb, my Nintendo DS, etc.), the electrical information printed on each is different. For example, this battery I have here shows that it's 1.2 volts and 850 mAh. So why does it show these two bits of information? Why not show watts, or just amps (or milliamps, which is more practical in this case)?

mAh is a measure of the battery's capacity. As Baluncore said, it tells you how long the battery can deliver a current, but that current is usually determined by the load, not the battery.

There will be a maximum power and current that the battery can deliver, but (with the exception of car starter batteries) it's rare to operate a battery under those conditions, because battery life will be short or the battery might be degraded. That info can usually be found in, or derived from, the data sheet.

The same line of questioning goes for the lightbulb: the lightbulb has the watts printed on it, and it has the milliamps, but NOT mAh, like the battery does.

Watts are the unit of power. Wattage is/was printed on light bulbs for two reasons: to indicate how much light the bulb produces, but also to indicate how much heat it gives off. These days the packaging (e.g. for LED light bulbs) is marked in lumens, which is the appropriate unit for light output. The wattage information can still be useful.

mA is a measure of current; mAh is a measure of capacity. The former is more useful for a load/bulb, the latter for a battery.

So the main question is: why do certain devices/items use certain information? What determines this?

The information provided is usually either:

a) The most useful or most important limiting parameter.
b) Required to meet regulations.

2. My other question is: how do I know if the information on the device/item is its input requirement to use the item, or the output rating that it gives off? Is it one of those things where you just have to use logic...

Yes.

Usually on an input, the voltage should never be exceeded, and the current is the amount the equipment might draw in the event of a fault (e.g. the fuse rating).
Usually on an output, the voltage is what the equipment will deliver, and the current is the maximum the load should draw from the equipment.
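The input-vs-output rule above can be captured as a simple compatibility check. This is a sketch with invented function and parameter names, assuming DC ratings and the usual convention that an output current rating is a ceiling for the load:

```python
# Sketch of matching a supply's OUTPUT rating to a device's INPUT needs.
# Convention assumed: voltages must match, and the device must not draw
# more current than the supply is rated to deliver.

def load_is_safe(supply_out_v: float, supply_out_ma: float,
                 device_in_v: float, device_in_ma: float) -> bool:
    """True if the supply can safely power the device."""
    return supply_out_v == device_in_v and device_in_ma <= supply_out_ma

# A 4.6 V / 900 mA output can feed a device needing 4.6 V at 700 mA:
print(load_is_safe(4.6, 900, 4.6, 700))   # True
print(load_is_safe(4.6, 900, 4.6, 1200))  # False: device would over-draw
```

Note the asymmetry: a supply rated for 900 mA happily feeds a 700 mA load, but a 900 mA load must not be hung on a 700 mA supply.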

I'm looking at a charger I have here, and its "INPUT" is 4.6 V and 900 mA. So is the term INPUT referring to the amount that it takes from the wall socket in order to charge the device, or is it the amount that it "INPUTS" to the device from the wall? How do I know?

Got a link to that device?

If the "input" on your charger is marked "4.6V" then the "input" should NOT be connected directly to a wall socket which is usually 110 or 230V depending on your country.

Thanks CWatters! Appreciate the break down. Just a few other things that came up...

1. I have pretty old wiring in my house, and my air conditioner is plugged into a wall outlet on the same line as my APC battery backup system. Whenever the AC turns on (it cycles off and on because of power-saver mode), the voltage of the circuit drops from 120 volts to a drastic 33 volts for about a second, then levels off at 111-113 volts. It stays at 111-113 V until the AC shuts down for a while. Then, once it starts back up again, it dips to about 33 volts at the moment of start-up and again levels off at 111-113. These drops and sags trigger my battery backup to kick in every time, which is obviously annoying, and I'm assuming it's creating a drain on my APC battery. Is there anything I can do about this? I believe the AC takes 15 amps, and I believe the circuit is rated for 15 amps. Is this the cause?

2. So if the circuit is rated for 15 amps and the AC takes 15 amps, shouldn't that blow a fuse automatically? But it doesn't, and I can have many other devices on the same circuit still on and drawing amps of their own. So how can they all fit on this circuit at the same time, if the AC seemingly takes up all 15 amps?

3. Is there any device or tool I can plug into a wall socket to see the circuit's maximum amps, how many amps are currently being drawn, and the remaining amps?

Thanks again guys, I think this is the last series of questions!

1) I'm not an expert on AC systems, but I've heard that many heat pumps (of which an AC is a type) have a high initial start-up current (aka start-up surge or inrush). In other words, when they start they briefly draw more than their rated current until they are up to speed. I'm pretty sure that's what is happening with yours. Some heat pumps are designed to "soft start" and either don't have such a high start-up surge or it's greatly reduced.
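The sag-during-inrush effect can be sketched with a crude series-resistance model of the house wiring (V_sag = I × R). The wiring resistance below is an assumption for illustration, not a measurement, and real sags also involve the supply impedance upstream of the house:

```python
# Crude model: the voltage lost during AC start-up appears across the
# wiring resistance, so the inrush current can be estimated as I = V / R.
# WIRING_R_OHMS is an assumed value, not a measurement.

WIRING_R_OHMS = 0.5    # assumed round-trip resistance of old house wiring
NOMINAL_V = 120.0      # normal outlet voltage
SAG_V = 33.0           # voltage observed during AC start-up (from the post)

drop = NOMINAL_V - SAG_V              # 87 V lost across the wiring
inrush_amps = drop / WIRING_R_OHMS    # estimated inrush current
print(round(inrush_amps, 1))          # ~174 A in this crude model
```

The exact number depends entirely on the assumed resistance; the point is that a brief, very large inrush current through imperfect wiring is enough to explain a deep momentary sag without tripping a slow-acting breaker.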

2) I live in the UK. Here individual sockets are rated at 13A and circuits at perhaps 32A. I thought in the USA individual plugs/sockets were rated at 15A so 15A for the whole circuit sounds on the low side.

3) There isn't one device that does all of that. The maximum allowed current depends on things like the wire gauge, the length of wire, and how it's installed. The fuse (usually a breaker in the UK) is intended to protect the wire, e.g. it stops the wire overheating/burning in the event of a fault in an appliance that causes it to draw too much current. Some breakers are slow-acting: they are designed not to trip during brief overloads, to avoid nuisance trips.
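While no single plug-in tool reports all three numbers, the bookkeeping itself is trivial once you know the breaker rating and each appliance's steady-state draw. A sketch, with invented appliance currents for illustration:

```python
# Sketch: comparing a circuit's total steady-state draw to its breaker
# rating. The individual load currents here are invented examples.

def circuit_headroom(breaker_amps: float, loads_amps: list[float]) -> float:
    """Remaining current before the breaker's rating is reached
    (negative means the circuit is nominally overloaded)."""
    return breaker_amps - sum(loads_amps)

print(circuit_headroom(15, [12.0, 1.5, 0.5]))  # 1.0 A of headroom left
print(circuit_headroom(15, [15.0, 2.0]))       # -2.0: overloaded
```

This also answers the earlier "why doesn't the fuse blow" puzzle in part: a 15 A appliance rating is often a worst-case figure, the actual running current is lower, and breakers tolerate brief excursions anyway.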

It does sound a bit like this circuit is overloaded. Perhaps worth getting a local electrician to take a look.

johnnyrev
I recommend an electrician for the wall work mentioned below.

Is the ground on your A/C plug bad or missing, or are you using an adapter to convert a three-prong plug to a two-prong wall socket? Some A/Cs were meant to be plugged into a lone special socket (I think they were 20 or 25 amps) with a special plug. Does your A/C have a different plug on it than what came from the manufacturer?

Questions about the wall socket:
Is it a grounded socket, and do you have a power reading between the black wire (electricians call it the "hot") and ground?
Do you have a three-hole socket with no ground?
Is the socket wired correctly, as in, are the black and white wires switched?
Is the socket a GFCI, which you can test and reset at the push of a button, to see if that helps?
Is it remotely GFCI'd elsewhere, by another socket in the house or by a GFCI breaker?

And some general questions:
Did your breaker almost trip but not quite?
Do you have, or have you had, chewing pets?
Are you using an extension cord or power strip, especially an old one (they can't take surges forever)?
What is the rating of your A/C? How old is it?
What is its required starting power, and your APC's, together?

I've seen iterations of these problems in my own home, some due to my own desire to save pennies (cheap appliances) and some due to faulty GFCIs, even right after installation. Sigh. Once, after a storm, our ground broke off at the service entrance, and power flickered through the whole house when anything came on. We couldn't run the coffee maker and the fridge at the same time.

Based on the questions I read in some of your other posts on the Forums, I don't recommend you tackle the wall work yourself. Use good judgment.

lol... in other words JohnnyRev "You're definitely not smart enough to tackle this yourself buddy!" :p
