Digital vs. Analogue Signal Definition

  • #1
Dear all

I am slightly confused over the definitions of digital and analogue signals.

Is a digital signal discrete in time AND voltage, or just discrete in voltage? For example, could a square wave that is continuous in time but has discrete voltage levels be considered a digital signal?

Similarly, could a sampled analogue signal (before passing into an ADC) be considered analogue, since its voltage levels are continuous but its time is discrete?

I would really appreciate any clarification anyone can give!

Many thanks

Paul Harris
 

Answers and Replies

  • #2
fss
Is a digital signal discrete in time AND voltage, or just discrete in voltage? For example, could a square wave that is continuous in time but has discrete voltage levels be considered a digital signal?
Yes.

Similarly, could a sampled analogue signal (before passing into an ADC) be considered analogue, since its voltage levels are continuous but its time is discrete?
...this doesn't make much sense. Pretty much any signal can be considered analog before being passed to an ADC; before it reaches the ADC it hasn't been sampled yet.
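The distinction being confirmed here can be shown in a short Python sketch (illustrative, not from the thread): discretising amplitude and discretising time are independent operations, and the square wave in the question applies only the first.

```python
import math

# Illustrative sketch: quantising amplitude and sampling time are
# independent. The square wave in the question applies only the first;
# a sample-and-hold applies only the second.

def quantise(v, step=0.5):
    """Snap a voltage to the nearest multiple of `step` (discrete amplitude)."""
    return round(v / step) * step

# A 1 Hz sine evaluated densely, standing in for "continuous time".
t_dense = [i / 1000 for i in range(1000)]
analog = [math.sin(2 * math.pi * t) for t in t_dense]

# Discrete amplitude, (effectively) continuous time: every instant has a
# value, but only a few voltage levels ever occur.
quantised = [quantise(v) for v in analog]

# Discrete time, continuous amplitude: only certain instants have a value,
# but those values are still arbitrary reals.
t_sampled = [i / 8 for i in range(8)]
sampled = [math.sin(2 * math.pi * t) for t in t_sampled]

print(sorted(set(quantised)))  # just five levels: -1.0 .. 1.0 in 0.5 steps
print(sampled)                 # eight unconstrained amplitudes
```

Only applying both operations gives what an ADC actually outputs.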
 
  • #3
Good points fss!

One further difference that is often overlooked.

Taken by itself, a square wave or pulse train is not a digital signal. It is worthless as such without meaning: you also need a coding system.

An analog signal, however, is complete in itself.
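The need for a coding system can be shown with a hypothetical sketch: the same pulse train, read under two different (made-up) conventions, yields different data.

```python
# The same waveform decoded under two hypothetical coding conventions --
# the voltages alone do not determine the message.
wave = [5.0, 0.0, 5.0, 5.0]  # hypothetical voltage levels, one per bit period

active_high = [1 if v > 2.5 else 0 for v in wave]  # convention A: +5 V means 1
active_low = [0 if v > 2.5 else 1 for v in wave]   # convention B: +5 V means 0

print(active_high)  # [1, 0, 1, 1]
print(active_low)   # [0, 1, 0, 0]
```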
 
  • #4
Thanks for the reply.

Just to clarify, a signal can be considered digital if it's continuous in time but discrete in voltage?

Also I guess I should have explained, I meant after the sample and hold part of the ADC but before the actual conversion takes place.

Thanks!

Paul
 
  • #5
Just to clarify, a signal can be considered digital if it's continuous in time but discrete in voltage?
Yes that is true.

If my coding were that a binary 0 is represented by +5 volts and a binary 1 by 0 volts, then switching my output to a +5 volt supply would give a continuous digital 0, and switching to a continuous 0 volt supply would give a continuous digital 1.

This is not as silly as it sounds, as this sort of inverted (active-low) coding is a very common scheme.
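As a minimal Python sketch of the scheme just described (binary 0 → +5 V, binary 1 → 0 V; the function names are made up for illustration):

```python
# Inverted coding from the post: binary 0 -> +5 V, binary 1 -> 0 V.
LEVELS = {0: 5.0, 1: 0.0}

def encode(bits):
    """Map a bit sequence to its voltage waveform, one level per bit period."""
    return [LEVELS[b] for b in bits]

def decode(voltages, threshold=2.5):
    """Recover the bits: anything above the threshold reads as a binary 0."""
    return [0 if v > threshold else 1 for v in voltages]

bits = [1, 0, 1, 1, 0]
wave = encode(bits)          # [0.0, 5.0, 0.0, 0.0, 5.0]
assert decode(wave) == bits  # round-trips under the agreed coding
```

Without agreeing on the coding, the receiver has no way to tell this waveform from its complement.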
 
  • #6
fss
Just to clarify, a signal can be considered digital if it's continuous in time but discrete in voltage?
Any signal can be considered digital, depending on your definition of "digital." There are several digital encoding schemes that use discrete voltages other than 0 and +5 V. Look up 2B1Q encoding.
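For illustration, a hypothetical sketch of 2B1Q: each pair of bits maps to one of four line levels, so the signal is discrete in amplitude without being binary. The sign/magnitude mapping below follows the usual ISDN U-interface convention, but treat the exact voltage values as illustrative assumptions.

```python
# Hypothetical 2B1Q sketch: first bit gives the sign, second the magnitude.
# Four levels instead of two -- discrete voltages that are not 0 and +5 V.
SYMBOLS = {(1, 0): +3, (1, 1): +1, (0, 1): -1, (0, 0): -3}

def encode_2b1q(bits):
    """Encode an even-length bit sequence as quaternary line symbols."""
    pairs = [(bits[i], bits[i + 1]) for i in range(0, len(bits), 2)]
    return [SYMBOLS[p] for p in pairs]

print(encode_2b1q([1, 0, 1, 1, 0, 1, 0, 0]))  # four levels from eight bits
```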

Also I guess I should have explained, I meant after the sample and hold part of the ADC but before the actual conversion takes place.
Now I'm even more confused. Any signal is analog in the sense that it can be represented as a sum of sine and cosine terms. Sampling is itself an act of "digitizing": it breaks a signal up into voltages that are meaningful to a computer, or to another piece of electronics that doesn't have the resolution needed to process a true analog signal efficiently.
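The "after the sample-and-hold but before conversion" stage Paul asks about can be sketched as the first of two steps (an illustrative Python sketch; all names and parameter values are made up):

```python
import math

# Sketch of the two ADC stages discussed in the thread: sample-and-hold
# makes the signal discrete in time (amplitudes still continuous); the
# quantiser then makes it discrete in amplitude as well.

def sample_and_hold(signal, period, n):
    """Sample signal(t) every `period` seconds: discrete time, continuous amplitude."""
    return [signal(k * period) for k in range(n)]

def quantise(samples, bits=3, vref=1.0):
    """Map each held voltage in [-vref, +vref] to one of 2**bits integer codes."""
    levels = 2 ** bits
    step = 2 * vref / levels
    return [min(levels - 1, max(0, int((v + vref) / step))) for v in samples]

held = sample_and_hold(lambda t: math.sin(2 * math.pi * t), period=0.1, n=10)
codes = quantise(held)

print(held[:3])   # still arbitrary real voltages: the stage Paul asks about
print(codes[:3])  # integer codes: discrete in both time and amplitude
```

In this sense the held signal is still analogue in amplitude; only after quantisation is it digital on both axes.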
 
