What a difficult question...
Power or signal amplifiers amplify input energy. Their input impedance is matched to the source impedance, and their output impedance is matched to the load impedance. Impedance here is the ratio of voltage to current at that port.
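As a quick illustration of matching (a sketch only; the 1 V source and 50 Ω source resistance are assumed numbers, not from the thread), the power delivered to a resistive load peaks when the load equals the source resistance:

```python
# Minimal sketch (values assumed): power delivered to a resistive load
# from a source with a fixed internal resistance. The delivered power
# peaks when R_load equals R_source, the "matched" condition.
V_SOURCE = 1.0    # volts (assumed)
R_SOURCE = 50.0   # ohms (assumed)

def load_power(r_load):
    """Power dissipated in r_load for a simple series source/load circuit."""
    current = V_SOURCE / (R_SOURCE + r_load)
    return current ** 2 * r_load

for r_load in (10.0, 25.0, 50.0, 100.0, 250.0):
    print(f"R_load = {r_load:6.1f} ohm -> P_load = {load_power(r_load) * 1e3:.3f} mW")
# The printed power is greatest at R_load = 50 ohm, the matched load.
```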
There are situations where a voltage or current needs to be sensed, but the source must not be disturbed by the connection. The amplifier must then take very little input power, so the product of input voltage and input current must be low. An amplifier is usually identified or named by the input variable it senses.
Voltage amplifiers sense input voltage. They have high input impedance, with very low input current.
Current amplifiers sense input current. They have low input impedance, usually with very little change in input voltage.
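A rough numeric sketch of why those input impedances matter (the 1 V, 1 kΩ source values are assumed for illustration): a high input impedance barely loads the sensed voltage, while a low input impedance captures nearly the full available current.

```python
# Minimal sketch (illustrative values only): how input impedance sets the
# loading error when sensing a source with internal resistance R_S.
R_S = 1_000.0   # ohms, assumed source resistance
V_S = 1.0       # volts, assumed open-circuit source voltage

def sensed_voltage(r_in):
    """Voltage seen by a voltage amplifier with input resistance r_in."""
    return V_S * r_in / (R_S + r_in)

def sensed_current(r_in):
    """Current seen by a current amplifier with input resistance r_in."""
    return V_S / (R_S + r_in)   # short-circuit current is V_S / R_S = 1 mA

# High input impedance -> almost the full source voltage is sensed.
print(f"R_in = 10 Mohm: {sensed_voltage(10e6):.6f} V")    # ~0.9999 V
# Comparable input impedance -> the source is heavily disturbed.
print(f"R_in = 1 kohm : {sensed_voltage(1e3):.6f} V")     # 0.5 V
# Low input impedance -> nearly the full short-circuit current is sensed.
print(f"R_in = 1 ohm  : {sensed_current(1.0) * 1e3:.4f} mA")  # ~0.999 mA
```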
The amplifier output can be a voltage, a current, or a power signal that is some proportional function of the input variable. Defined voltage or current outputs reduce sensitivity to load variation.
If the output is a voltage, it will have a low output impedance, driving a higher impedance load.
If the output is a current, it will have a high output impedance, driving a lower impedance load.
If the output signal is power, it should have a matched output impedance.
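Here is a small sketch of that loading behaviour (all the resistances are assumed illustrative values): a low-impedance voltage output holds its voltage almost independent of the load, while a high-impedance current output holds its current.

```python
# Minimal sketch (assumed values): why a voltage output wants a LOW output
# impedance and a current output wants a HIGH output impedance.
V_IDEAL = 1.0    # volts, ideal open-circuit output voltage
I_IDEAL = 1e-3   # amps, ideal short-circuit output current

def voltage_into_load(r_out, r_load):
    """Voltage actually delivered by a voltage output with resistance r_out."""
    return V_IDEAL * r_load / (r_out + r_load)

def current_into_load(r_out, r_load):
    """Current actually delivered by a current output with resistance r_out."""
    return I_IDEAL * r_out / (r_out + r_load)

for r_load in (500.0, 1_000.0, 2_000.0):
    v_stiff = voltage_into_load(1.0, r_load)       # low output impedance
    v_soft = voltage_into_load(1_000.0, r_load)    # high output impedance
    i_stiff = current_into_load(1e6, r_load)       # high output impedance
    print(f"R_load={r_load:7.0f}: V(stiff)={v_stiff:.4f} V  "
          f"V(soft)={v_soft:.4f} V  I(stiff)={i_stiff * 1e3:.4f} mA")
# With R_out = 1 ohm the delivered voltage barely changes with the load;
# with R_out = 1 kohm it varies strongly. The 1 Mohm current output
# delivers nearly 1 mA regardless of the load.
```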
The sign of the amplifier gain is critically important when the amplifier is used in a feedback loop. But when signal gain is specified in dB, the polarity of the amplification is unspecified; it depends on the phase shift.
An amplifier will contain some device with gain. But the device does not have to increase the signal; it may just change impedance. A signal amplifier may have a power gain of +15 dB, or maybe –15 dB like a passive attenuator. This can lead to "double negative" confusion, where an attenuation of –dB is equivalent to amplification.
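To make the dB point concrete (the gain figures are assumed examples, not from the thread), the decibel value records only the magnitude ratio, so inverting and non-inverting stages of the same magnitude report the same dB figure, and a negative dB figure quoted as "attenuation" reads as a double negative:

```python
import math

# Minimal sketch: decibel gain encodes only the MAGNITUDE ratio, not the
# polarity (phase) of the signal. Values are assumed for illustration.
def power_gain_db(p_out, p_in):
    return 10.0 * math.log10(p_out / p_in)

def voltage_gain_db(v_out, v_in):
    return 20.0 * math.log10(abs(v_out / v_in))   # abs(): the sign is discarded

# An inverting stage (gain -5.62) and a non-inverting one (gain +5.62)
# both report roughly the same +15 dB figure.
print(voltage_gain_db(-5.62, 1.0))   # ~ +15 dB
print(voltage_gain_db(+5.62, 1.0))   # ~ +15 dB

# A passive attenuator with power gain ~ 1/31.6 reports ~ -15 dB.
print(power_gain_db(1.0 / 31.6, 1.0))   # ~ -15 dB
# Quoting that as an "attenuation of -15 dB" would, read literally, mean
# a gain of +15 dB: the "double negative" confusion mentioned above.
```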
fog37 said:
I think that a voltage amplifier has very high input impedance, high voltage gain but a low output current.
High voltage gain is not necessary. Output current is decided by the load.
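For example (illustrative numbers only), a unity-gain buffer is still a voltage amplifier; its output current depends entirely on the load it drives:

```python
# Minimal sketch (assumed values): for a voltage-output amplifier, the
# output CURRENT is set by the load, not by the voltage gain. Here a
# unity-gain buffer (voltage gain = 1) drives several different loads.
V_IN = 1.0          # volts (assumed)
VOLTAGE_GAIN = 1.0  # a buffer still counts as a voltage amplifier

v_out = VOLTAGE_GAIN * V_IN
for r_load in (100.0, 1_000.0, 10_000.0):
    i_out = v_out / r_load   # the load resistance decides the output current
    print(f"R_load = {r_load:8.0f} ohm -> I_out = {i_out * 1e3:.3f} mA")
```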