Background: I learned electronic theory 10 years ago in the Army and have worked in the electronics industry as a technician for about the past 9 years. I'm working toward a BSEET (my interest lies in application, not research) and am required to take a few algebra-based physics classes. Below is a problem I thought I could do in my sleep (classes start Monday; I found this while perusing online coursework), but it seems I have forgotten more than I realized.

1. The problem statement, all variables and given/known data

Vt = 8 Vdc
It = 13.7 A
R1 = 7 Ω
R2 = unknown
Find the voltage drop across R1 and the value of R2.

2. Relevant equations

V = IR; single-loop (series) circuit rule.

3. The attempt at a solution

1) Since current is the same everywhere in a series circuit, I THOUGHT I could find the voltage drop across R1 by multiplying the value of R1 by the current, which gives 95.9 V. That's impossible, since the source voltage is only 8 V. Had this worked, I could have used the resulting voltage drop to calculate the resistance of R2, since the voltage drop across R2 would have been Vt − VR1.

2) I then attempted to determine Rt using the equation Vt = It(R1 + R2). Solving for R2 gives −6.4 Ω, which is, again, impossible. To satisfy Ohm's law, Rt would have to be 8 V / 13.7 A ≈ 0.58 Ω, which contradicts the given value of R1, since Rt = R1 + R2 and therefore Rt ≥ 7 Ω.

Where am I going wrong? These numbers are randomly generated from the student's name and ID number. That being said, is it even possible to achieve 13.7 A with a voltage source of 8 V and a total resistance of at least 7 Ω? According to my memory, my calculations, and sim software, it's not. Even with a total resistance of just 7 Ω, the total current in the circuit is about 1.15 A at +8 Vdc. We'd need to raise the voltage to roughly 95.9 V to push 13.7 A through a 7 Ω circuit, and even more if we added a second resistor.
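For what it's worth, here is a quick Python sketch of the same arithmetic, using only Ohm's law and the given values from the problem (variable names Vt, It, R1 are the ones from the statement; nothing else is assumed):

```python
# Sanity check of the given series-circuit numbers using V = I * R.
Vt = 8.0     # source voltage, volts
It = 13.7    # stated total current, amps
R1 = 7.0     # first resistor, ohms

# Step 1: voltage drop across R1 at the stated current.
VR1 = It * R1
print(f"VR1 = {VR1:.1f} V")   # 95.9 V -- already exceeds the 8 V source

# Step 2: total resistance implied by Ohm's law, and the resulting R2.
Rt = Vt / It
R2 = Rt - R1
print(f"Rt = {Rt:.2f} ohm, R2 = {R2:.2f} ohm")   # R2 comes out negative

# A negative R2 confirms the given values are mutually inconsistent
# for a two-resistor series circuit.
```

Both checks fail the same way my hand calculations did, which is what makes me think the generated numbers simply don't describe a physically realizable circuit.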