Hello,
This is my first post, so first of all, let me thank all of you for starting and running such a good forum that provides a lot of help and insight into various engineering problems.
As for a brief introduction, I am an IT professional with an educational background in (Mechanical) Engineering.
I have a practical Electrical Engineering problem regarding an amateur project I'm doing at home.
The question is:
I have an AC power input of ~80 V and 0.8 mA.
So the electrical power available is 80 V x 0.0008 A = 0.064 W.
Ignoring conversion/transformation losses for now, is there a way to convert this into an electrical output of, say, 8 V and 8 mA?
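As a quick sanity check on my own numbers (a minimal sketch in Python, assuming an ideal, lossless conversion), the output I'm asking for carries exactly the power that's available:

```python
# Sanity check: does the requested output fit within the available power?
# Assumes an ideal, lossless conversion (no transformer or rectifier losses).
v_in, i_in = 80.0, 0.8e-3    # source: ~80 V AC at 0.8 mA
v_out, i_out = 8.0, 8e-3     # target: 8 V at 8 mA

p_in = v_in * i_in           # available power: 0.064 W
p_out = v_out * i_out        # requested power: 0.064 W

print(f"P_in = {p_in:.3f} W, P_out = {p_out:.3f} W")  # equal, so feasible in the ideal case
```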
I can connect more power sources in series to collect more power, but essentially I need to strengthen the signal current (of course at the cost of a proportionate voltage drop). So I'm looking at current amplification.
Based on my reading, it looks like a (step-down) transformer should do the job of reducing the output voltage and proportionately increasing the output current: V[in]/V[out] = I[out]/I[in] = a (the turns ratio, N[in]/N[out]).
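To check that I have the relation right, here is how I'd work out the turns ratio for my numbers (a minimal sketch in Python, assuming an ideal transformer; I understand a real one has core and copper losses):

```python
# Ideal-transformer relations: V[in]/V[out] = I[out]/I[in] = N[in]/N[out] = a
v_in, v_out = 80.0, 8.0
a = v_in / v_out              # turns ratio a = 10, i.e. N[in]:N[out] = 10:1

i_in = 0.8e-3                 # 0.8 mA on the primary
i_out = a * i_in              # current is stepped UP by the same factor
print(f"a = {a:.0f}:1, I_out = {i_out * 1e3:.1f} mA")  # a = 10:1, I_out = 8.0 mA
```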
Q1. Is this understanding correct?
Q2. If yes,
Q2.1 What should the specification of the transformer be? Are there any available off-the-shelf in the market that I can use here?
Q2.2 Also, eventually I need to convert the signal to DC. So what would be the right approach? Should I rectify the voltage first and then use a DC transformer (does such a thing exist)? Or do I need to rectify the output of the transformer?
Q3. If a transformer is not the solution to my problem, then what is? I have also read about op-amps and other amplifier ICs for current amplification. Is that a viable option for me as well?
I'm discounting the possibility of connecting my power sources in parallel to add current, because that would require a large number of power sources. Please let me know if that's incorrect too.
Please advise.
Thanks.
Shaktidaas.