1. The problem statement, all variables and given/known data

Design a diode voltage regulator to supply 1.5 V to a 150 Ω load. Use two diodes specified to have a 0.7 V drop at a current of 10 mA. The diodes are to be connected to a +5 V battery through a resistor R. Specify the value of R. What is the diode current with the load connected?

3. The attempt at a solution

I'm assuming we determine R using only the voltage source, the resistor R, and the two diodes in series. Thus:

R = (5 V − 1.4 V)/10 mA = 360 Ω

However, this puts the output voltage at 1.4 V.

After this point, we attach the load resistor in parallel with the diodes, and the voltage across it will be 1.5 V as desired. The load resistor will draw a current I = 1.5 V/150 Ω = 10 mA, so the current through the diodes is decreased by that same amount, and thus I = 0 when the load is attached.

Does that seem right?
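A quick numeric restatement of the steps in the attempt, to check the arithmetic. This is only a sketch of the constant-drop model used above (each diode treated as a fixed 0.7 V regardless of current, and the load current estimated at the desired 1.5 V output, exactly as done in the attempt); it does not model how the diode drop actually changes as the diode current falls.

```python
# Numeric restatement of the attempt above.
# Assumptions (as in the attempt): each diode is a constant 0.7 V drop,
# so the diode string sits at 2 * 0.7 = 1.4 V; the load current is
# estimated at the desired 1.5 V, as done in the attempt.

V_SUPPLY = 5.0      # V, battery
V_DIODE  = 0.7      # V per diode (given, at 10 mA)
N_DIODES = 2
I_DESIGN = 10e-3    # A, diode current used to size R (no load attached)
R_LOAD   = 150.0    # ohm
V_TARGET = 1.5      # V, desired load voltage

v_diodes = N_DIODES * V_DIODE          # 1.4 V across the diode string
R = (V_SUPPLY - v_diodes) / I_DESIGN   # series resistor, ~360 ohm
i_load  = V_TARGET / R_LOAD            # load current, ~10 mA
i_diode = I_DESIGN - i_load            # diode current with load attached

print(f"R       ~ {R:.0f} ohm")
print(f"I_load  ~ {i_load * 1e3:.1f} mA")
print(f"I_diode ~ {i_diode * 1e3:.1f} mA")  # the attempt's conclusion: ~0
```

Note that the conclusion I_diode = 0 follows only if the diode drop really stays at 0.7 V per diode at zero current, which is the constant-drop idealization rather than real diode behavior.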