a. If a transistor circuit is used to amplify a 2.5 mA signal to at least
0.1 A, what must be the minimum gain of the transistor circuit?
b. Assume the amplifier consists of a number of transistors in series,
each with a gain of 10. How many transistors are needed in this circuit? (Hint: Two such transistors in series would provide a total gain
of 10 × 10 = 100.)
current gain = output current / input current
The Attempt at a Solution
For part a I got that the current gain should be 40, by dividing 0.1 A by 2.5×10^-3 A (2.5 mA expressed in amperes). I am not sure if this is correct, but if it is, how is it possible to reach at least 40 by multiplying tens? (for question b)
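As a sanity check on the arithmetic, here is a small Python sketch (the variable names are my own, not from the problem). It converts 2.5 mA to amperes before dividing, then counts how many gain-10 stages a cascade needs before its total gain reaches the required minimum:

```python
# Part a: minimum current gain needed to amplify 2.5 mA to 0.1 A.
input_current = 2.5e-3   # 2.5 mA expressed in amperes
output_current = 0.1     # required output current in amperes

gain = output_current / input_current
print(gain)  # 40.0

# Part b: transistors in series multiply their gains, so count how many
# gain-10 stages are needed until the cascaded gain meets the requirement.
stages = 0
total_gain = 1.0
while total_gain < gain:
    total_gain *= 10
    stages += 1

print(stages)  # 2 -> two stages give 10 * 10 = 100, which is >= 40
```

The common slip here is treating 2.5 mA as 2.5×10^-6 A (that would be microamperes), which yields 40,000 instead of 40. With the correct conversion, a single gain-10 transistor is not enough (10 < 40), but two in series (gain 100) are.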