formulajoe
Okay, I've got a differential amplifier biased by a constant current source at the sources of the MOSFETs. A resistor R is connected from the drain of each transistor to a +5 V DC supply. The second stage is a common-source amplifier: the drain of transistor 2 feeds the gate of transistor 3, which makes up the CS stage, and the CS amplifier has a resistor Rd connected from its drain to the +5 V supply.

The bias current for the differential amplifier is known to be 0.5 mA, so the drain current of each transistor in the pair is 0.25 mA. The drain current for the CS amplifier is also 0.25 mA. vo2 is the drain voltage of transistor 2, which is connected to the gate of transistor 3. I'm supposed to design the circuit so that vo2 is 2 V and the output voltage of transistor 3 is 3 V. Each input to the differential amplifier is 0 V.

The problem I'm having is that the output voltage, vo2, is supposed to be 2 V. How can that be when the input voltage is zero? If vo2 were a DC voltage it would make perfect sense, but vo2 is not a DC voltage. I'm completely stuck.
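For what it's worth, here's a quick sketch of the DC bias arithmetic implied by the numbers in the post, assuming each drain resistor ties its drain straight to the +5 V rail (the variable names are just my own labels for the quantities stated above):

```python
# DC bias arithmetic for the two stages, assuming the drain resistors
# connect each drain directly to the +5 V supply.
VDD = 5.0          # supply voltage, V
ID_DIFF = 0.25e-3  # drain current per differential-pair transistor, A
ID_CS = 0.25e-3    # drain current of the common-source stage, A

VO2 = 2.0          # target DC drain voltage of transistor 2, V
VO3 = 3.0          # target DC output voltage of transistor 3, V

# Each resistor must drop (VDD - V_drain) while carrying the stated
# drain current, so Ohm's law gives the required values.
R = (VDD - VO2) / ID_DIFF   # differential-pair drain resistor
Rd = (VDD - VO3) / ID_CS    # common-source drain resistor

print(f"R  = {R / 1e3:.0f} kOhm")   # prints: R  = 12 kOhm
print(f"Rd = {Rd / 1e3:.0f} kOhm")  # prints: Rd = 8 kOhm
```

This only pins down the resistor values from the quiescent drain currents and the target drain voltages; it doesn't resolve the DC-vs-signal question about vo2 itself.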