
(Fundamentally) Understanding Circuit Boards

  1. Apr 1, 2015 #1
    So I am not an engineer (FYI); I am currently training as either a physicist or a theoretical mathematician I think (still a freshman). Nonetheless I find the fundamental workings of a computer incredibly complex and would like to get a taste of ground-up design. I was wondering if anyone knew of any books that approached design from a very "electron is going through this wire, which leads to this reaction, which hence allows current to flow through this wire" sort of framework. I'm also interested in how such a seemingly complicated arrangement (as the ones used in actual computers) is pragmatically even possible.

    My assumed understanding of computers is that they use materials that act as if-then clauses (semiconductors). These if-then clauses allow programmers to type in inputs to create new if-then clauses (called a program). Therefore a computer is a combination of materials that is allowed to change the way it works depending on how the buttons that are part of it are pressed (although that wiring arrangement sounds very complicated).

    Thank you for your time in reading this.
     
  3. Apr 1, 2015 #2
    The Art of Electronics is a good place to start.
    It is regularly updated; the first edition was published in 1980.
     
  4. Apr 1, 2015 #3
    I found Albert Malvino’s Digital Computer Electronics (McGraw Hill) to be very informative. I suppose it is a bit dated (1983) but I don’t believe the fundamentals have changed all that much.
     
  5. Apr 1, 2015 #4

    analogdesign

    User Avatar
    Science Advisor

    I second The Art of Electronics. It is a great book for absolute beginners, especially with a physics background.

    The key point in understanding this subject is that the function of a computer is distinct from its implementation. Logic operations that can eventually be built up into software are not inherently tied to any particular implementation. They are typically done using semiconductor switches, as you say, but originally they were done using vacuum tubes. You could also implement computers using mechanical switches and relays (this has been done; Shannon famously showed that relay circuits can implement Boolean logic), or pipes filled with varying amounts of water, or even pen and paper. The logical connections do not depend on any specific technology.

    My point here is that it is most valuable to approach them independently, otherwise you will get needlessly confused. Your "if-then" clauses (more naturally called AND-OR-NOT) are the basis of any digital computer. You can then learn a bit about electronics if you want and see how these functions are implemented. People are typically more interested in one viewpoint than the other, which is why we have the loose distinction between computer engineers (focused on system functionality) and electrical engineering circuit specialists (focused on implementing specific functions).

    To my mind, the key insight is this: complicated arrangements that form computers are collections of VERY simple arrangements that form gates (just a few switches). It's not magic, but it is very interesting.
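    To make that insight concrete, here is a small Python sketch (illustrative only, not from any textbook): each gate is a trivial function, and composing just a few of them already gives you something useful, like one-bit binary addition.

    ```python
    # AND-OR-NOT as plain functions on bits (0 or 1).
    def AND(a, b): return a & b
    def OR(a, b):  return a | b
    def NOT(a):    return 1 - a

    # XOR built only from AND, OR, NOT.
    def XOR(a, b):
        return AND(OR(a, b), NOT(AND(a, b)))

    # A half adder: two composed gates give one-bit addition.
    def half_adder(a, b):
        return XOR(a, b), AND(a, b)   # (sum bit, carry bit)
    ```

    Stack enough of these and you have an adder; stack enough adders, registers, and control logic and you have a CPU. Nothing at any layer is more complicated than the layer below, there is just more of it.
    
    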
     
  6. Apr 1, 2015 #5
    Thanks for the input. I took a look at The Art of Electronics and it is very involved (in a good way). I also hope to take a look at Digital Computer Electronics by the time I am done (just haven't gotten there yet). I really appreciate the responses; this is definitely the direction I was looking for I think (haven't dug through it heavily yet). :)
     
  7. Apr 1, 2015 #6
    Just so I'm clear on this, the difference between system functionality and implementing specific functions is that (this is a little bit of a guess; I hope I didn't miss an important part of your text):

    i) System functionality is more macro. Instead of figuring out how each function of the computer works this subject will assume each part of the computer works correctly and then figure out how each part should work together. Otherwise it might look into the abstract of how one physical AND-OR-NOT clause might work rather than bother with each and every AND-OR-NOT clause the computer actually ends up using (its implementation). It may also focus heavily on abstract logic.

    ii) Function implementation is concerned with figuring out the specifics of how each function is carried out by the hardware of the computer, which involves a much more in-depth look at every AND-OR-NOT clause used by the computer.

    Your overall point is that you can think about them separately to avoid confusion.

    Just checking whether I got it right.

    Or you may just be distinguishing between abstract logic and hardware. Possibly if I focus on the hardware in its most simple form, I can then look at how it fulfills the purposes of abstract logic afterwards to avoid confusion.
     
    Last edited: Apr 1, 2015
  8. Apr 1, 2015 #7

    dlgoff

    User Avatar
    Science Advisor
    Gold Member

    It could work the other way too. If you are just interested in the logic, just consider the logic gates:

    gate.gif

    This image is from http://hyperphysics.phy-astr.gsu.edu/hbase/electronic/gate.html.

    But if you want to know how the gates work, then consider the circuits. e.g. the AND gate

    and4.gif

    This image is from http://hyperphysics.phy-astr.gsu.edu/hbase/electronic/and.html#c1.
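    One way to see exactly what the gate symbols in the first figure compute is to tabulate them. This is a plain-Python sketch of the truth tables, not anything from the linked pages:

    ```python
    # Truth tables for AND, OR, and NAND over all input pairs.
    rows = []
    for a in (0, 1):
        for b in (0, 1):
            rows.append((a, b, a & b, a | b, 1 - (a & b)))
            print(f"a={a} b={b}  AND={a & b}  OR={a | b}  NAND={1 - (a & b)}")
    ```
    
    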
     
  9. Apr 2, 2015 #8

    jim hardy

    User Avatar
    Science Advisor
    Gold Member
    2016 Award

    Hmmmmm. Where would one start today?
    I'm old enough to have started on vacuum tubes. In high school we built individual logic circuits from discrete transistors, on palm-sized circuit boards, then connected them together to do more functions. Our text was the GE Transistor Manual, 1964.
    But nowadays you'd need a microscope to see a flip-flop, and I don't even know how teaching of the subject is approached anymore.
    The books I started with are all obsolete now.

    Here's a fascinating site to peruse

    http://www.computerhistory.org/semiconductor/timeline.html

    Oh my goodness, look what somebody has made available!
    http://www.introni.it/pdf/GE - Transistor Manual 1964.pdf
    It starts at the level you want, electrons and doping. Logic circuits start on page 175.

    But you'll want a hard copy. See ebay or amazon.
     
  10. Apr 2, 2015 #9

    analogdesign

    User Avatar
    Science Advisor

    I think you got that right. A "computer" is an abstract thing that is primarily defined by its instruction set. How it is implemented is another subject entirely. Both fascinating but one might be more your speed at first. In my experience most people prefer to take a "software" approach and think of a computer logically and then investigate how it is really made later.

    This is a very nice illustration of my point. The figure dlgoff posted is a perfectly valid AND gate, but AND gates haven't been implemented that way in practice for more than 30 years! It just goes to show that even though the underlying technology changes, the logical basis remains the same. If looking at this particular implementation is helpful for understanding, though, it is useful.

    In case you're interested, here is what a modern "practical" AND gate looks like:

    figure14.jpg

    This is from: https://courseware.ee.calpoly.edu/~dbraun/courses/ee307/F05/index.html

    The funny thing is, usually a circuit design engineer would "design" an AND gate like this:

    assign out = in1 & in2; // this synthesizes to an AND gate

    This is a snip of code in the Verilog Hardware Description Language. The vast majority of gate-level digital design (either in ASICs or FPGAs) is now done by writing hardware descriptions using Verilog or a similar language called VHDL. A program called a "Synthesis Tool" parses your Verilog code and creates a gate-level description of your circuit. A gate-level description means your design is now implemented in gates such as AND gates and Flip-Flops instead of computer code. Then another program called a "Place-and-Route Tool" uses already designed and characterized gates (called standard cells) to implement and hook up your circuit for you.

    The subject is still approached in a similar way. I was an EE undergrad in the mid-90s and we built flip-flops with cross-coupled NAND gates on the bench. We also simulated them in SPICE. That is still the way it is done, even if we are more removed from practice now.
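    The cross-coupled NAND flip-flop mentioned above can be sketched in a few lines of Python (an illustration of the feedback idea, not how anyone actually simulates circuits). The inputs are active-low, and the loop iterates the two cross-coupled gates until their outputs settle:

    ```python
    def NAND(a, b):
        return 1 - (a & b)

    # SR latch from two cross-coupled NAND gates.
    # set_n / reset_n are active-low; q and q_bar carry the stored state.
    def sr_latch(set_n, reset_n, q, q_bar):
        for _ in range(4):  # iterate the feedback loop until it settles
            q, q_bar = NAND(set_n, q_bar), NAND(reset_n, q)
        return q, q_bar
    ```

    Pulling set_n low drives Q to 1; pulling reset_n low drives Q to 0; with both inputs high the latch holds its previous state. That memory-from-feedback trick is the seed of every register and RAM cell.
    
    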

    The Computer History Museum also has a physical location in Mountain View (in the old HQ of Silicon Graphics) that is truly amazing. If you ever visit the Bay Area you really need to go. They have almost every major computer technology in history to see there, from old IBM tabulating machines, to the German Enigma, to the Apollo Guidance Computer, to a Cray I, to an original Apple I signed by Woz. It's a magical place and I've been a member for years.
     
    Last edited: Apr 2, 2015
  11. Apr 2, 2015 #10

    dlgoff

    User Avatar
    Science Advisor
    Gold Member

    Nice find. The text I had for Electronics I (An Introduction to Electronics by W.G. Oldham & S.E. Schwarz) has a lot of similar transistor and IC fabrication images. Here are a couple of scans from it.

    mytext1.jpg

    mytext2.jpg
     
  12. Apr 2, 2015 #11

    donpacino

    User Avatar
    Gold Member

    I started college in '09.
    In general, the 'intro to digital electronics' path was:

    electronics: understanding how electricity works (op-amps, Ohm's law, etc.)
    digital logic (Boolean algebra, electrical aspects of gates, decoders, encoders, K-maps, flip-flops, sequential logic, intro to VHDL, etc.)
    microprocessor systems: expanding on the digital logic class to using more complicated hardware systems and interfacing with software (microprocessors, ALUs, RAM, EEPROM, ADC, DAC, assembly language, etc.)
    VHDL/Verilog: hardware description languages.

    It seems like the same basic process that jim hardy and analogdesign touched upon.
     
  13. Apr 7, 2015 #12
    I was hoping not to bump this but I do want to thank you for the info. There is a lot of stuff here for me to look through. Thanks :).
     
  14. Apr 11, 2015 #13

    meBigGuy

    User Avatar
    Gold Member

    I'm a little late to the game.

    The trek from beginning electronics to how and what a computer does is a long one (it covers a lot of ground, in a good way) if you really want to start from the basics.

    donpacino summarized it well:

    electronics: understanding how electricity works (op-amps, Ohm's law, etc.)
    digital logic (Boolean algebra, electrical aspects of gates, decoders, encoders, K-maps, flip-flops, sequential logic, intro to VHDL, etc.)
    microprocessor systems: expanding on the digital logic class to using more complicated hardware systems and interfacing with software (microprocessors, ALUs, RAM, EEPROM, ADC, DAC, assembly language, etc.)

    Understand electricity, understand logic, understand how logic can create a processor (basic CPU architecture) controlled by software, understand the peripherals a processor needs to become a computer.

    If you want to jump ahead, this is a basic CPU. These basic elements are at the root of any computer; just add I/O (input/output capability) to the external world.
    CPU_block_diagram.png
    (nice picture from http://web.sfc.keio.ac.jp/~rdv/keio/sfc/teaching/architecture/architecture-2009/lec02.html)
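    The fetch-decode-execute loop at the heart of that block diagram can be sketched in a few lines of Python. The opcode names (LOAD/ADD/STORE/HALT) and the one-accumulator design here are made up for illustration, not any real instruction set:

    ```python
    # A toy accumulator machine: memory holds (opcode, operand) pairs,
    # and data words are stored as ("DATA", value).
    def run(memory):
        acc, pc = 0, 0              # accumulator register and program counter
        while True:
            op, arg = memory[pc]    # fetch
            pc += 1
            if op == "LOAD":        # decode + execute
                acc = memory[arg][1]
            elif op == "ADD":
                acc += memory[arg][1]
            elif op == "STORE":
                memory[arg] = ("DATA", acc)
            elif op == "HALT":
                return acc

    program = [
        ("LOAD", 4), ("ADD", 5), ("STORE", 6), ("HALT", 0),  # instructions
        ("DATA", 2), ("DATA", 3), ("DATA", 0),               # data words
    ]
    ```

    Running this computes 2 + 3 and writes the result back to memory before halting. Everything a real CPU adds (more registers, pipelining, caches) is elaboration on this same loop.
    
    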
     