Understanding Analog-to-Digital Conversion for Digital Computing and Processing

In summary, a computer engineer who wants to interface ADCs into a system needs to know Kirchhoff's laws and basic DC circuit analysis.
  • #1
Elecomputer
Right now I am a computer engineer. I really love digital and want to focus on it because it uses logic.

Analog is a bit more complex in that signals are not just 1 or 0 but continuous, taking on many possible values.

Anyways, I took digital logic and design courses, but I am unsure if I should take any analog. My concern is taking in real-world signals (via sensors, for example). How much analog would I need to know to convert real signals into digital signals for digital computing/processing?

Sorry if my wording is weird.
Thanks in advance!
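For concreteness, what an ADC does is sample a continuous signal at discrete times and quantize each sample to one of a finite set of codes. Here is a minimal sketch of that idea; the sample rate, resolution, and reference voltage are invented for illustration:

```python
import numpy as np

# All parameters are invented for illustration, not from any real ADC.
fs = 8000      # sample rate, Hz
f = 1000       # input sine frequency, Hz
bits = 3       # converter resolution
vref = 3.3     # full-scale reference voltage

t = np.arange(8) / fs                                   # eight sample instants
analog = (vref / 2) * (1 + np.sin(2 * np.pi * f * t))   # 0..vref sine

# Quantization: map each continuous voltage onto one of 2**bits codes
levels = 2 ** bits
codes = np.clip(np.round(analog / vref * (levels - 1)), 0, levels - 1).astype(int)

for v, c in zip(analog, codes):
    print(f"{v:5.2f} V -> code {c} ({c:0{bits}b})")
```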
 
  • #2
Real-world digital electronics is in no way divorced from analog circuit techniques. You can design the LOGIC of digital circuitry without analog, but if you actually want to make things work, you need to understand what's going on when, for example, edge spiking causes ringing problems that turn gates on and off in a non-digital-logic kind of way.

Also, A-to-D and D-to-A converters require knowledge of analog. I personally hate analog and love digital, but I learned a lot of analog and found that it was all needed in real-world debugging situations.
 
  • #3
I figured the same. I am currently taking signals and circuits, but I really find it boring and question whether I should pay for more analog. And I agree, I am talking about A-to-D and D-to-A converters.

Not sure if you'd understand this, but would it be worth taking a second circuits course, or going into electronic circuits? I want, at minimum, to be able to grab real signals and convert them to digital, and vice versa. That's pretty much all I want from analog.
 
  • #4
Check out these two links and see if any of that stuff looks interesting. :)

http://www.faqs.org/docs/electric/

http://ecee.colorado.edu/~bart/book/book/contents.htm

Regards
 
  • #5
Elecomputer said:
I figured the same. I am currently taking signals and circuits, but I really find it boring and question whether I should pay for more analog. And I agree, I am talking about A-to-D and D-to-A converters.

Not sure if you'd understand this, but would it be worth taking a second circuits course, or going into electronic circuits? I want, at minimum, to be able to grab real signals and convert them to digital, and vice versa. That's pretty much all I want from analog.

It's been almost 50 years since I took EE, so I don't really know what's in the various courses given these days and can't advise you on that.
 
  • #6
Elecomputer said:
My concern is taking in real-world signals (via sensors, for example). How much analog would I need to know to convert real signals into digital signals for digital computing/processing?

Sorry if my wording is weird.
Thanks in advance!

My only computer-interfacing experience is in process control, where 10 Hz was quite a high frequency.

Where I've seen computer folks get into trouble is almost always right at the interface, and by two mechanisms:
they fail to appreciate the concept of "common mode voltage" and grounding;
and the concept of "loading" the analog circuit they are measuring.
The latter is particularly troublesome when their A/D converter loses power, so the analog circuits they tapped into get loaded down and possibly tied together through the multiplexer. Most muxes need power to maintain high input impedance. It's embarrassing when your computer makes the control room indicators go haywire.
The former is troublesome when the analog circuit exceeds the modest common-mode capability of most solid-state multiplexers, typically ~15 volts, and the computer cannot read the signal.
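To make the loading point concrete, here is a small Kirchhoff's-law calculation; the component values are made up, but the effect is the one described above:

```python
# A sensor modeled as a Thevenin source (Vs, Rs) drives an ADC/mux input
# that looks like a resistance Rin to ground. By Kirchhoff's voltage law
# around the single loop, the converter sees a simple voltage divider.
# All values here are invented for illustration.

Vs = 5.0             # sensor open-circuit voltage, volts
Rs = 10e3            # sensor source resistance, ohms
Rin_powered = 10e6   # typical high input impedance while powered
Rin_unpowered = 1e3  # impedance can collapse when the mux loses power

def voltage_at_adc(vs, rs, rin):
    return vs * rin / (rs + rin)

print(f"powered:   {voltage_at_adc(Vs, Rs, Rin_powered):.3f} V")   # ~4.995 V
print(f"unpowered: {voltage_at_adc(Vs, Rs, Rin_unpowered):.3f} V") # ~0.455 V, loaded down
```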

These are simple problems, solved by Kirchhoff's laws.
So you need to know basic DC circuit analysis and work lots of practice problems.
I presume your computer science curriculum teaches about anti-aliasing filters, so you'll need some AC circuit analysis as well.
These are both introductory-level EE courses, or were in my day.
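On the anti-aliasing point, the AC-analysis flavor of the problem looks something like this sketch, assuming a plain first-order RC low-pass ahead of the converter (values are illustrative only, not a design recommendation):

```python
import math

# First-order RC low-pass as a crude anti-aliasing filter.
# R, C, and the sample rate are invented for illustration.
R = 16e3      # ohms
C = 1e-6      # farads
fs = 100.0    # sample rate, Hz

fc = 1 / (2 * math.pi * R * C)   # -3 dB corner frequency, ~10 Hz here

def gain_db(f):
    """Magnitude of H(f) = 1 / (1 + j f/fc), in dB."""
    return -10 * math.log10(1 + (f / fc) ** 2)

nyquist = fs / 2
print(f"corner: {fc:.1f} Hz")
print(f"attenuation at Nyquist ({nyquist:.0f} Hz): {gain_db(nyquist):.1f} dB")
```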

Doubtless there's another whole world of DSP; maybe someone from that community will offer more insight...

old jim
 
  • #7
You really don't have to understand much analog to interface to ADCs and DACs and the like. Usually what you need to know is described in the datasheets. If you need to know more, Analog Devices has some stellar tutorial information in their free text on data conversion. Check it out:
http://www.analog.com/library/analogdialogue/archives/39-06/data_conversion_handbook.html

Pretty much everything you need to know to successfully interface ADCs into your system is in there.

I'm a professional analog circuit design engineer and I highly recommend this book.
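A typical datasheet task is turning raw output codes back into volts. Conventions differ from part to part (unipolar vs. bipolar, 2**N vs. 2**N - 1 scaling), so treat this as a sketch of the common unipolar case, not any specific device:

```python
# Convert a raw ADC code to a voltage for a unipolar N-bit converter,
# using the common V = code * Vref / 2**N convention. Check the actual
# part's datasheet: some devices scale by 2**N - 1 or use offset binary.

def code_to_volts(code: int, vref: float = 3.3, bits: int = 12) -> float:
    if not 0 <= code < 2 ** bits:
        raise ValueError("code out of range for this resolution")
    return code * vref / (2 ** bits)

# Mid-scale code from a hypothetical 12-bit, 3.3 V converter -> 1.65 V
print(code_to_volts(2048))
```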
 
  • #9
@Elecomputer
You really need to appreciate that all so-called digital signals are ANALOG. The digital stuff is the information they carry, and the 'next stage' in any 'digital' circuit always needs to dig the digital information out of the analog signal it has been handed. We are lucky, these days, that so much digital circuitry allows us to be really sloppy about the analog nature of many designs - that is, until you want to stretch your processor speed, do real-time DSP, or even just get your comms link to pay for itself!
Get into some serious analog basics. It will repay you in the future.
 
  • #10
If you are going to be designing computer circuit boards, particularly at higher frequencies, you would do well to study transmission lines, especially with respect to impedance matching.
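As a hedged illustration of why matching matters: the fraction of a signal reflected at the end of a line follows the standard formula gamma = (ZL - Z0)/(ZL + Z0), and those reflections are the ringing mentioned earlier in the thread. The example impedances below are invented:

```python
# Reflection coefficient at a transmission-line termination:
#   gamma = (ZL - Z0) / (ZL + Z0)
# A matched load reflects nothing; a mismatch sends part of each edge
# back toward the driver, which appears as ringing on the trace.

def reflection_coefficient(z_load: float, z0: float = 50.0) -> float:
    return (z_load - z0) / (z_load + z0)

for zl in (50.0, 75.0, 1e6):   # matched, mild mismatch, ~open CMOS input
    print(f"ZL = {zl:>9.0f} ohm -> gamma = {reflection_coefficient(zl):+.3f}")
```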
 

Related to Understanding Analog-to-Digital Conversion for Digital Computing and Processing

1. What is the difference between digital and analog?

Digital and analog are two types of signals or data representation. Digital signals are discrete, meaning they have a finite number of values, while analog signals are continuous and can have an infinite number of values.

2. Which is better, digital or analog?

The answer to this question depends on the specific application. Digital signals are less susceptible to noise interference and can be copied and processed without degradation, while analog signals are continuous and so give an unquantized, more natural representation of real-world phenomena.

3. How does digital and analog affect audio and video quality?

Digital audio and video signals generally maintain higher quality than analog because they are less prone to distortion and can be reproduced without loss. However, some people prefer the warmer, more natural sound of analog audio.

4. Can digital and analog signals be converted to each other?

Yes. Analog signals are converted to digital by analog-to-digital converters (ADCs), and digital signals are converted back to analog by digital-to-analog converters (DACs). Many devices, such as televisions and radios, have such converters built in.

5. How are digital and analog used in modern technology?

Digital and analog signals are used in various technologies, such as computers, cell phones, and televisions. Digital signals are also used in data transmission, storage, and processing, while analog signals are commonly used in audio and video equipment.
