Power supply voltage homework

In summary, two 1000W toasters rated for a 120V AC circuit would require a DC power supply voltage of 240V for normal operation when connected in series. The resistance of each toaster is 14.4 Ohms (28.8 Ohms total in series), and the units used are watts, volts, and amperes.
  • #1
Bradracer18
Need a little help here...getting started. I don't really understand what to use here.


If two 1000W toasters were to be connected in series to a dc power supply, what power supply voltage would be necessary for normal operation? Assume each toaster was designed for a 120V ac circuit.

A. 110V
B. 120V
C. 170V
D. 240V

I'm guessing it is above 120V...so C or D. Just because I know that dc power takes more voltage to get 120V to ac.

Thanks,
Brad
 
  • #2
Find the resistance of the toasters, and solve for the DC voltage necessary to have each dissipate 1000W.

- Warren
 
  • #3
Ok...so here is what I did.

2000W/240V = 25/3...using P=IV

2000W/(25/3)^2 = 28.8Ohm...using P=I^2R

Now, using V = IR...28.8Ohm * (25/3) = 240 V

Is that right?? If so...then my answer would be D.
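The steps above can be checked numerically, following Warren's suggestion of finding each toaster's resistance independently from its 120V / 1000W rating. A minimal sketch (values from the thread):

```python
# Numeric check of the series-toaster problem (values from the thread).
# Each toaster is rated 1000 W at 120 V AC.
P_rated = 1000.0   # W, per toaster
V_rated = 120.0    # V, rated voltage per toaster

# Resistance of one toaster from P = V^2 / R  ->  R = V^2 / P
R_each = V_rated**2 / P_rated   # 14.4 ohms
R_series = 2 * R_each           # 28.8 ohms total in series

# Normal operation means rated current through each toaster: I = P / V
I = P_rated / V_rated           # 25/3 A, about 8.33 A

# DC supply voltage needed to drive that current through both: V = I * R
V_supply = I * R_series         # 240 V, answer D
print(R_each, R_series, I, V_supply)
```

This avoids the slight circularity in the work above (using 240V to find the current): here the current comes from each toaster's own rating, and 240V falls out at the end.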
 
  • #4
I agree with your answer. I would have calculated the resistance for each toaster independently, though.

Also, I strongly suggest using units more often. Intermediate answers like "25/3," without units, make your work hard for anyone to follow, including yourself.

- Warren
 
  • #5
Warren...I do use units a lot, but am unfamiliar with these electric units...I didn't know what my units were on the 25/3...was it still ohms?
 
  • #6
Power / voltage = current (in amperes). The power formula you quoted as "P = IV" has power in watts, potential difference in volts, and current in amperes.

- Warren
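So the "25/3" in the work above is a current. A quick sketch of that units bookkeeping (numbers from the thread):

```python
# Power divided by voltage gives current in amperes.
P = 2000.0   # W, both toasters together
V = 240.0    # V, total across the series pair
I = P / V    # A -- the unitless "25/3" from the thread is a current in amperes
print(f"I = {I:.3f} A")
```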
 

1. What is a power supply voltage?

A power supply voltage is a measure of the electrical potential difference between two points in a circuit. It is typically expressed in volts (V) and is used to power electronic devices.

2. How do I calculate the power supply voltage for a circuit?

The power supply voltage can be calculated using Ohm's law, which states that voltage equals current multiplied by resistance (V = I x R). Alternatively, the voltage can be determined by measuring it with a voltmeter.
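As a small illustration of that calculation, with hypothetical example values (not from the toaster problem):

```python
# Ohm's law: V = I * R, with hypothetical example values.
I = 2.0    # current in amperes
R = 60.0   # resistance in ohms
V = I * R  # required supply voltage in volts
print(f"V = {V} V")
```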

3. Why is it important to use the correct power supply voltage?

Using the correct power supply voltage is important because it ensures that electronic devices operate safely and efficiently. Using a voltage that is too high can damage the device, while using a voltage that is too low can cause it to malfunction.

4. How do I know what power supply voltage to use for a specific device?

The recommended power supply voltage for a device can usually be found on the device itself or in its instruction manual. It is important to use the recommended voltage to ensure proper operation and avoid damage.

5. Can I change the power supply voltage for a device?

In some cases, it is possible to adjust the power supply voltage for a device. This is often done using a voltage regulator or by using a power supply with adjustable output. However, it is important to make sure the device is compatible with the new voltage before making any changes.
