Difference between a voltage and a current signal?

In summary: a current signal is robust to disturbances along the line and, with a live-zero scheme such as 4 mA .. 20 mA, can even power the transmitter over the signal wires; a voltage signal is the potential difference between two points and is measured with a high-impedance voltmeter.
  • #1
fog37
Hello Forum,

I have heard about voltage signals and current signals in the context of electronic circuits. What is the difference?

Is a voltage signal what is measured by a voltmeter between two points/wires while a current signal runs on a single conductor and is measurable by an ammeter?

Thanks,
fog37
 
  • #2
Both require two wires: a source and a return, so to speak.

A voltage signal would be a switched voltage source, for example 5 V and 0 V to represent ON and OFF. The detector would be a high-impedance load, like a CMOS logic gate.
A current signal would be a switched current source, for example 5 mA and 0 mA to represent ON and OFF. The detector would be a low-impedance load, like an opto-isolator or a transistor
(do a Google search for "current loop" to see some examples).
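A minimal numeric sketch of this difference (component values are assumed for illustration, not from the thread): wire resistance forms a divider with a voltage receiver and eats part of the signal, while an ideal current source forces the same current through the whole loop regardless of wire resistance (within its compliance voltage).

```python
def voltage_at_receiver(v_source, r_wire_ohms, r_receiver_ohms):
    """Voltage signal: what remains after the drop across the wire resistance."""
    return v_source * r_receiver_ohms / (r_wire_ohms + r_receiver_ohms)

def current_at_receiver(i_source, r_wire_ohms, r_receiver_ohms):
    """Current signal: an ideal current source delivers its set current unchanged."""
    return i_source

print(voltage_at_receiver(5.0, 10.0, 100.0))    # noticeably less than 5 V
print(current_at_receiver(0.005, 10.0, 100.0))  # still 5 mA
```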
 
  • #3
Fog37 - Your description is close enough, although for a current to flow you do need a complete circuit, e.g. a return path to the source, so more than one wire is usually required.
 
  • #4
I recall, long ago, working on a system that could output a 300-volt peak-to-peak signal. It was mostly direct-coupled transistor stages, balanced (differential mode) the whole way through, and at one point one of the transistors was configured in common-base mode. Its emitter was directly coupled to the collector of the previous stage. I would troubleshoot this thing and signal-trace with a scope, and all of a sudden the signal was gone. We often described this as: "You can't see it; now it's a current. Go to the output of the common-base stage and you will get the voltage signal back."
 
  • #5
The practical difference is, of course, that a current signal is transmitted as a current level and a voltage signal as a voltage level.
A current signal is driven by a current source, which has a very high (ideally infinite) output impedance. This makes the current signal very robust to disturbances along the transmission line: electric and magnetic fields in the environment will induce voltages, but hardly any current, because of the high impedance of the source.

Another practical point is that current signals are often specified as, e.g., 4 mA .. 20 mA, corresponding to a measured temperature of 0 °C .. 100 °C.
That's because the device measuring the temperature may consume up to 4 mA for its own internal purposes (converting temperature to current).
So within the specified temperature range, the device draws an extra amount of current of 0 mA .. 16 mA on top of that.

In this way the device is powered through the signal conductors themselves.

That's clever, and robust in electrically noisy environments. (See attached.)
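The 4 mA .. 20 mA scaling described above can be sketched as a simple linear mapping (the ranges are taken from the post; the function names are mine):

```python
# 4-20 mA "live zero" scaling for a 0-100 degC temperature sensor.
I_MIN, I_MAX = 4.0, 20.0   # loop current range, mA
T_MIN, T_MAX = 0.0, 100.0  # temperature range, degC

def temp_to_current(t_c):
    """Transmitter side: 0 degC -> 4 mA, 100 degC -> 20 mA."""
    return I_MIN + (t_c - T_MIN) / (T_MAX - T_MIN) * (I_MAX - I_MIN)

def current_to_temp(i_ma):
    """Receiver side: the inverse mapping."""
    return T_MIN + (i_ma - I_MIN) / (I_MAX - I_MIN) * (T_MAX - T_MIN)

print(temp_to_current(50.0))  # mid-scale temperature -> mid-scale current
```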
 

Attachments

  • scan0021.pdf
    130.5 KB
  • #6
Hesch said:
That's because the device, measuring the temperature, may consume 4mA for its own internal purpose ( converting temperature to current ).

and there's an added benefit of 4 to 20 mA (10 to 50 mA is also common):
an open circuit, i.e. the detector has failed or somebody left it unconnected, announces itself as a reading well below zero on the scale (0 mA instead of the 4 mA live zero).
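A sketch of how a receiver might exploit that live zero for fault detection (the threshold value is my assumption; real instruments differ, and the 0-100 °C span follows the example earlier in the thread):

```python
# Hypothetical receiver-side decoding of a 4-20 mA loop with
# open-circuit detection.
FAULT_THRESHOLD_MA = 3.5  # assumed: a healthy loop never drops below 4 mA

def read_loop(i_ma):
    """Return temperature in degC, or None if the loop looks broken."""
    if i_ma < FAULT_THRESHOLD_MA:
        return None  # open circuit: failed detector or unconnected wire
    return (i_ma - 4.0) / 16.0 * 100.0

print(read_loop(0.0))   # broken loop reads as a fault, not as 0 degC
print(read_loop(12.0))  # healthy mid-scale reading
```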

[Attached image: vx252.jpg]
 

What is the difference between a voltage and a current signal?

A voltage signal is a measure of the electrical potential difference between two points, while a current signal is a measure of the flow of electric charge through a circuit. In simpler terms, voltage can be thought of as the pressure pushing the charge, while current is the rate at which charge actually flows.

How are voltage and current related?

Voltage and current are related through Ohm's Law, which states that the current through a conductor between two points is directly proportional to the voltage across those points, for a fixed resistance. So, for a given resistance, as voltage increases, current increases in proportion, and vice versa.
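As a one-line numeric check of Ohm's law (values are illustrative):

```python
# Ohm's law for a fixed resistance: I = V / R.
def current_through(v_volts, r_ohms):
    return v_volts / r_ohms

print(current_through(5.0, 1000.0))  # 5 V across 1 kOhm -> 0.005 A (5 mA)
```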

Can a voltage signal and a current signal be converted into one another?

Yes, voltage and current signals can be converted into one another. For example, a voltage-to-current converter (a transconductance stage) turns a voltage signal into a proportional current signal, and a current-to-voltage converter (often just a precision sense resistor, or a transimpedance amplifier) does the reverse.
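As a sketch, such a conversion between typical industrial ranges might look like this (the 0-10 V / 4-20 mA pairing is an assumed example, not something stated in the thread):

```python
# Hypothetical linear conversion between a 0-10 V signal and a 4-20 mA loop.
def volts_to_milliamps(v):
    """0 V -> 4 mA, 10 V -> 20 mA."""
    return 4.0 + (v / 10.0) * 16.0

def milliamps_to_volts(i_ma):
    """The inverse mapping: 4 mA -> 0 V, 20 mA -> 10 V."""
    return (i_ma - 4.0) / 16.0 * 10.0

print(volts_to_milliamps(5.0))   # mid-scale voltage -> mid-scale current
print(milliamps_to_volts(12.0))  # and back again
```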

Which type of signal is more commonly used in electronic devices?

Voltage signals are more commonly used in electronic devices because they are easy to generate, route, and measure with high-impedance inputs. Current signals are preferred in certain applications, such as industrial sensor loops and long cable runs, where their immunity to induced noise and to voltage drops along the wiring matters.

What is the difference between AC and DC signals in terms of voltage and current?

AC (alternating current) signals have voltage and current that periodically reverse direction, while DC (direct current) signals have voltage and current of constant polarity. AC is commonly used in power transmission, while DC is used in electronic devices and batteries.
