Basic Hardware Architecture Questions

  • Thread starter AdamF
Ah, ok. See below.



Yes, but the "wiring" is a lot more complicated, at least in all modern computers (where "modern" here means "since about 1970 or so" :wink:).

As far as "translating" things to and from bits is concerned, though, that happens very close to your typing and seeing things on your screen. Your keyboard generates a sequence of bits for each key you type and sends them to the computer; and your computer's monitor translates sequences of bits into what appears on your screen. Everything else in between is bits, and you can think about them abstractly without even having to know the details of how they are physically represented in things like transistors and capacitors and currents in wires.

And it's virtually impossible for a single human mind to grasp what's going on in a modern computer without thinking about the bits abstractly; things at the level of transistors and capacitors and currents in wires are way, way too complicated to be able to comprehend at that level while also comprehending how all those things connect to the keystrokes you type and the things you see on your screen. You have to abstract away the actual physical hardware and focus on the bits (and indeed on things at even higher levels of abstraction than that) if you want to understand what's going on at the level of keystrokes and images on screens.
Okay, this makes a bit more sense (no pun intended):
The specific sequences of bits that are generated by the keyboard are somehow related to the ASCII and Unicode stuff that I've been reading about in my Intro to Computer Science text? These are the specific standardizations for each symbol or something, right?

The bits, which are represented by voltage and subsequent current through the wiring, somehow reach the monitor in such a way that utilizes the design of all of this to interact with materials which display light in the appropriate place when the electricity hits them, or something of this nature?
 
Well, that depends on how deep a scratch you want to dig. :oldwink:
You don't really need to go quantum-mechanics deep if you can accept that:
  • a transistor conducts current in proportion to its input (current or voltage, depending on whether it is a bipolar or field-effect transistor; usually field-effect these days)
  • a capacitor can hold a charge (as evidenced by the fact that you can measure a voltage on it)
A rough concept of a resistor and a diode would be useful if you get down to the transistor circuit level.

Cheers,
Tom
Well, my definition of understanding something is really "What I cannot create, I do not understand."

My objective is to reach the point where I can de-construct today's PC and understand exactly why every decision was made the way it was on the level of exactly what the creator was thinking when they made that decision, and then understand how to construct it myself if I needed to, from sourcing the appropriate materials, to being able to look at any line in the code of the Operating System and understand what it's doing, to understanding why the circuits are built the way they are, etc...

 
The specific sequences of bits that are generated by the keyboard are somehow related to the ASCII and Unicode stuff that I've been reading about in my Intro to Computer Science text?
Not directly, no. Keyboards generate scan codes--sequences of bits that tell the computer what keys were pressed and in what order. ASCII and Unicode are interpretations of sequences of bits as text. It so happens that, if you press an ordinary alphabetic key on the keyboard, the scan code (sequence of bits) that gets sent to your computer corresponds to the ASCII code (sequence of bits) for the letter that's on the key. But there's nothing that requires that to be the case; it was just a convenient way to simplify keyboard processing programs in early computers where ASCII was the only kind of text that was going to be used (this was way before Unicode was even invented).
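To make the distinction concrete, here is a small sketch. The scan code value hex 1E for the A key comes from IBM PC scan code set 1; the tiny mapping table and `decode` helper are illustrative only, not a real driver:

```python
# Sketch: the keyboard sends a scan code identifying the *key*;
# software maps it to a character code such as ASCII.
# Values follow IBM PC scan code set 1 (0x1E is the make code
# for the A key); this three-entry table is illustrative only.

SCAN_TO_ASCII = {
    0x1E: 'a',   # A key, unshifted
    0x30: 'b',   # B key, unshifted
    0x10: 'q',   # Q key, unshifted
}

def decode(scan_code, shift=False):
    """Translate one make code into a character, honoring Shift."""
    ch = SCAN_TO_ASCII.get(scan_code)
    if ch is None:
        return None              # not a printable key in this tiny table
    return ch.upper() if shift else ch

print(decode(0x1E))              # -> a  (scan code 0x1E, ASCII 0x61)
print(decode(0x1E, shift=True))  # -> A  (same key, ASCII 0x41)
```

Note that the same scan code produces different ASCII codes depending on the Shift state: the keyboard reports the key, and software decides the character.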

The bits, which are represented by voltage and subsequent current through the wiring, somehow reach the monitor in such a way that utilizes the design of all of this to interact with materials which display light in the appropriate place when the electricity hits them, or something of this nature?
Eventually, but not necessarily the same bits that get sent by the keyboard as you type, and not without a lot happening in between.

My objective is to reach the point where I can de-construct today's PC and understand exactly why every decision was made the way it was on the level of exactly what the creator was thinking when they made that decision, and then understand how to construct it myself if I needed to, from sourcing the appropriate materials, to being able to look at any line in the code of the Operating System and understand what it's doing, to understanding why the circuits are built the way they are, etc...
This will take the rest of your lifetime, and you will need to take very good care of yourself so that you live to be 150 or so. :wink:

However, a good starting point might be to start with a simpler PC and operating system than today's. For example, you could try to de-construct the original IBM PC running DOS. These will give you an overview of the architecture and some references to dig deeper:


 

Tom.G

Science Advisor
Or for an even easier start in computing, look at the Intel 8080 processor, circa 1974. I haven't looked recently, but there used to be both on-line and downloadable simulators that showed the internal workings at the register level (that's about two levels up from logic gates).

The 8080 was the basis of the original Altair 8800 computer from the company MITS. That came out in 1975 (yes, it predated the Apple), and since it was a kit, there was a wealth of detailed documentation. You may be able to find some of it on-line with a bit of digging.

In a PM, the OP asked the following question:

Thank you for your help.

I mean, am I right in assuming that this whole thing would be a hell of a lot easier for me to understand if I actually knew how voltage behaved and how charge moves in various environments, etc...?


Here is my response:

Probably. Trying to do that here is not practical though.

I did an on-line search and managed to find a free download of "Basic Radio", all six volumes! The whole series runs 800 pages but most of what you need is in the first half of the first volume, with about 30 pages about "CAPACITORS AND CAPACITANCE" in the second volume.


It's a big file, about 27MB, so be patient if you are on a slow connection.

Cheers,
Tom
 
Downloaded, ty.
 
Focusing on startup: the power is switched on, so all the chips get energized. A computer has a boot program installed in a ROM. The boot program is preprogrammed for the CPU and is retained even when the computer is off.

Once power is switched on, the CPU initializes itself (i.e., zeros out its registers), then fetches a memory address from a predetermined location in the ROM and begins the arduous journey of loading your OS and writing to the screen, or beeping at you when things go wrong, like bad memory (aka beep codes).
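As a concrete sketch of that "predetermined location": the 6502 (a real example, relevant to the machines discussed below) reads its start address from the reset vector at $FFFC/$FFFD, low byte first. The ROM contents here are invented for illustration:

```python
# Sketch: what "fetch an address from a predetermined ROM location"
# means. On reset, a 6502 reads the two bytes at $FFFC/$FFFD
# (low byte first) and starts executing at that address.
# The ROM contents below are invented for illustration.

ROM = {0xFFFC: 0x00, 0xFFFD: 0xE0}   # reset vector -> $E000

def reset_vector(rom):
    """Return the address the CPU jumps to after power-on."""
    lo = rom[0xFFFC]                 # low byte of the start address
    hi = rom[0xFFFD]                 # high byte
    return (hi << 8) | lo

pc = reset_vector(ROM)
print(hex(pc))                       # -> 0xe000 : boot code begins here
```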

Here's a more detailed description:
 
Quite a problem that 'computers' as they are now have roughly 5-10 abstraction layers (I did not care to try counting them) between any useful work and the transistors down there. Without methodically building up your knowledge of the specific layers, it is easy to miss the important parts.

The 6502 is a good CPU, but I would rather recommend an available one instead - even if you get an old Commodore or the like, it'll still have a few abstraction layers, but you will have to fight the missing documentation too...
 
My objective is to reach the point where I can de-construct today's PC and understand exactly why every decision was made the way it was on the level of exactly what the creator was thinking when they made that decision, and then understand how to construct it myself if I needed to, from sourcing the appropriate materials, to being able to look at any line in the code of the Operating System and understand what it's doing, to understanding why the circuits are built the way they are, etc...
As @PeterDonis implied, this isn't a practical goal, as it would take more than a lifetime to obtain this knowledge. A better strategy, IMO, would be to focus on a much simpler processor (someone suggested an Intel 8080) and delve into things at either the hardware level (electrical engineering) or at the software level (computer or software engineering). For myself, I have only a little interest in what goes on at the level of transistors, but am much more interested in what happens in the CPU, from both assembly language and higher-level language perspectives.
 
Hence, my initial interest in models which do not even involve electricity.
 
The 6502 is a good CPU, but I would rather recommend an available one instead - even if you get an old Commodore or the like, it'll still have a few abstraction layers, but you will have to fight the missing documentation too...
Documentation for the C64 isn't missing. You can still find everything online:
http://www.classiccmp.org/cini/pdf/Commodore/C64 Programmer's Reference Guide.pdf

Stackoverflow works as well. There is a very active community.

The Commodore 64 is a very good machine for this, since it is relatively easy to do fun stuff with sprites, sound, etc. by writing to the chips directly. You will need some experience with programming, however, and you might not want to start with Commodore BASIC.
 

rcgldr

Homework Helper
These are all one question, because the computer does not store numbers, sound, or images; it just stores bits.
In the short-lived era of CRT-based memories, it could be considered that the computer did store "images", and the CRT's scanning was used to interpret the "pixels" on the monitor screen as bits.

An actual example of an imaging device would be the Tektronix 4010 vector graphics monitor. It literally stored the image produced by the vector commands until it was commanded to erase the entire screen. There are no bits on the screen, just persistent phosphors painted by the vector-controlled electron beam.

https://en.wikipedia.org/wiki/Tektronix_4010

QR or bar codes are examples where an "image" can be printed on a piece of paper and later scanned to convert the image back into a string of bits.
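The bare concept can be sketched as a round trip (real bar codes add quiet zones, framing, and error correction; this keeps only the bits-to-marks idea):

```python
# Sketch of the bar-code idea: a string of bits is "printed" as
# marks and spaces, then "scanned" back into bits.

def print_code(bits):
    """Render bits as ink: '#' for 1, '.' for 0."""
    return "".join("#" if b else "." for b in bits)

def scan_code(image):
    """Recover the bits from the printed marks."""
    return [1 if ch == "#" else 0 for ch in image]

bits = [1, 0, 1, 1, 0, 0, 1]
image = print_code(bits)
print(image)                      # -> #.##..#
assert scan_code(image) == bits   # round trip: marks back to bits
```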
 

rcgldr

Homework Helper
Keyboards generate scan codes--sequences of bits that tell the computer what keys were pressed and in what order. ASCII and Unicode are interpretations of sequences of bits as text. It so happens that, if you press an ordinary alphabetic key on the keyboard, the scan code (sequence of bits) that gets sent to your computer corresponds to the ASCII code (sequence of bits) for the letter that's on the key.
I'm not aware of any standard PC keyboard where the scan code equals the ASCII code. For example, pushing down on the letter A on a keyboard produces a scan code of hex 1E or hex 1C (depending on the scan code set), which doesn't correspond to hex 41 (ASCII uppercase A) or hex 61 (ASCII lowercase A).

https://en.wikipedia.org/wiki/Scancode


The 6502 is a good CPU, but I would rather recommend an available one instead - even if you get an old Commodore or the like, it'll still have a few abstraction layers, but you will have to fight the missing documentation too.
I don't know about the Commodore 64, but the Atari 8-bit family of computers - the 400 / 800 / 65XE / 130XE - is also 6502-based (they run at about 2 MHz), and there are forums at web sites like AtariAge that have a lot of information on the Atari 8-bit systems and their peripherals, plus access to toolsets (assemblers, BASIC interpreters, ...). There are now Atari peripheral-bus-to-USB interfaces that allow a PC to act as an Atari peripheral (usually as a floppy disk).
 
I'm not aware of any standard PC keyboard where the scan code equals the ASCII code.
Hm, I must have been misremembering. Or perhaps I was mixing up PC scan codes with codes from earlier keyboards.
 
Check out the Turing Tumble toy at turingtumble.com. It's a marble-based computer.

Each marble represents a digital pulse. The marble gates represent electronic ones, which activate when a digital pulse is input.

The toy can do binary arithmetic as well as solve logic puzzles and play logic games.

This illustrates some aspects of how a CPU works.
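A rough sketch of the counting behavior (a simplified software model of the toy's bit pieces, not its actual mechanics): each marble toggles a bit, and a bit that was already set resets and passes the marble on, which is exactly a ripple-carry increment.

```python
# Each "bit" piece toggles when a marble passes. If the bit was 1,
# it resets to 0 and routes the marble onward to the next bit --
# a ripple-carry increment, one marble per count.

def drop_marble(bits):
    """Advance a column of bit pieces by one marble (one increment)."""
    for i in range(len(bits)):
        if bits[i] == 0:
            bits[i] = 1      # marble flips the bit and is absorbed below
            return
        bits[i] = 0          # bit was 1: reset it and pass the marble on (carry)

def value(bits):
    """Read the column as a binary number, least significant bit first."""
    return sum(b << i for i, b in enumerate(bits))

bits = [0, 0, 0]             # three bit pieces, least significant first
for _ in range(5):
    drop_marble(bits)
print(value(bits))           # -> 5 : five marbles counted in binary
```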
 
In general, things are very abstracted in computers nowadays. Keyboard scan codes identify the key that is pressed. A device driver maps each one to the locale-specific code point to be stored in an input buffer.

Some examples of locale-specific code sets are ASCII, extended ASCII, and Unicode.

Later, at display time, the locale code point selects a character from the specified font table, and the font data is rendered on screen or printed.
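The three stages can be sketched like this; all the tables are made up for illustration (a one-entry scan-code map and a one-glyph "font"):

```python
# Sketch of the pipeline: (1) the driver maps a scan code to a
# code point, (2) the code point selects a glyph from a font table,
# (3) the glyph is "rendered". All data here is illustrative.

SCAN_TO_CODEPOINT = {0x1E: ord('A')}      # driver's locale mapping (made up)

FONT = {                                  # tiny 5-row bitmap font, 'A' only
    ord('A'): ["  #  ",
               " # # ",
               "#####",
               "#   #",
               "#   #"],
}

def render(scan_code):
    codepoint = SCAN_TO_CODEPOINT[scan_code]   # stage 1: key -> code point
    glyph = FONT[codepoint]                    # stage 2: code point -> glyph
    return "\n".join(glyph)                    # stage 3: rasterize to the "screen"

print(render(0x1E))
```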
 
Check out the Turing Tumble toy at turingtumble.com. It's a marble-based computer.

Each marble represents a digital pulse. The marble gates represent electronic ones, which activate when a digital pulse is input.

The toy can do binary arithmetic as well as solve logic puzzles and play logic games.

This illustrates some aspects of how a CPU works.
Oh, this is awesome, I'm going to pick one up, thanks so much!

Is the main reason that electricity is used for the modern computer that electricity moves so much faster?
 
In general, things are very abstracted in computers nowadays. Keyboard scan codes identify the key that is pressed. A device driver maps each one to the locale-specific code point to be stored in an input buffer.

Some examples of locale-specific code sets are ASCII, extended ASCII, and Unicode.

Later, at display time, the locale code point selects a character from the specified font table, and the font data is rendered on screen or printed.
By the way, in the Turing Tumble, the bottom area that collects the marbles would be analogous to a form of "memory", right?
 
You might find this thread helpful. The link is to a post in it in which I included a link to a download for the Circuit Scramble Android game, which is a fun app for building understanding of the workings of logic gates.
 
Okay, so the idea of the computer memory is kind of like the wiring in a house, where you can turn off the power to the house, but when you turn it on again, the electricity will flow to the areas that were on before because the circuits are still being completed there, regardless of whether or not they have power?
 
Is the main reason that electricity is used for the modern computer that electricity moves so much faster?
Yes. There seems to be some research on optical computers, in which the signals would be sent by lasers, but there are difficulties converting between optical and electrical signals, so this type of computer seems to be down the road a few years.
By the way, in the Turing Tumble, the bottom area that collects the marbles would be analogous to a form of "memory", right?
I don't think so. The bottom area just collects the marbles that have gone through the various doodads. They don't "remember" what gates they have gone through. A better analogy I think would be a battery that powers a cellphone.
Okay, so the idea of the computer memory is kind of like the wiring in a house, where you can turn off the power to the house, but when you turn it on again, the electricity will flow to the areas that were on before because the circuits are still being completed there, regardless of whether or not they have power?
This is a better analogy than some you've come up with, though the details need care. The switches in a house correspond pretty closely to the transistor flip-flops in a static RAM chip, which hold their state as long as power is applied. That's in contrast to dynamic RAM, which stores each bit as charge on a capacitor and must be constantly refreshed. Both kinds lose their contents when the power is off; memory that survives a power loss, such as the flash memory in solid-state drives, is called nonvolatile.
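As a toy model of the capacitor side of this: a DRAM cell stores its bit as charge that leaks away, so it must be refreshed periodically. The leak rate and read threshold below are invented for illustration.

```python
# Toy model of why DRAM needs refreshing: the cell is a capacitor
# whose charge leaks away, so a stored 1 fades unless it is
# periodically read and rewritten at full strength.

class DramCell:
    LEAK = 0.7                       # fraction of charge surviving each tick (illustrative)

    def __init__(self):
        self.charge = 0.0

    def write(self, bit):
        self.charge = 1.0 if bit else 0.0

    def tick(self):
        self.charge *= self.LEAK     # charge leaks away over time

    def read(self):
        return 1 if self.charge > 0.5 else 0

    def refresh(self):
        self.write(self.read())      # read the bit, write it back at full strength

cell = DramCell()
cell.write(1)
cell.tick()                # charge 0.7 -> still reads 1
cell.refresh()             # topped back up to 1.0
cell.tick(); cell.tick()   # charge 0.49 -> decayed past the threshold
print(cell.read())         # -> 0 : without a timely refresh the bit is lost
```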

I looked at the Turing Tumble web site. That might be fun for very young kids, but if you really want to learn about computers, my advice would be to get a real computer, such as an Arduino or Raspberry Pi, or a kit with one of the CPUs already mentioned in this thread, and learn how to program it, preferably in C and, later, in assembly.
 
In the Turing model the tape is the only memory, with the machinery reading and writing values to the tape. The tape contents, together with the machine's state table, control what the machine does.

It is the most fundamental model of a computer and is used in exploring many fundamental computer-science theorems, most notably the halting problem.
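A minimal Turing machine helps make "the tape is the only memory" concrete. This sketch runs a small illustrative rule table that increments a binary number written least-significant-bit first:

```python
# A minimal Turing machine: the tape is the only memory; a rule
# table maps (state, symbol) -> (write, move, next_state).

def run(tape, rules, state="inc", blank="0", steps=100):
    """Execute the machine and return the final tape contents."""
    tape = dict(enumerate(tape))         # sparse tape, position -> symbol
    pos = 0
    for _ in range(steps):
        if state == "halt":
            break
        sym = tape.get(pos, blank)       # unwritten cells read as blank
        write, move, state = rules[(state, sym)]
        tape[pos] = write
        pos += move
    return "".join(tape[i] for i in sorted(tape))

# Illustrative program: binary increment, least significant bit at
# the left. Flip 1s to 0s while carrying; write the carry into the
# first 0 and halt.
RULES = {
    ("inc", "1"): ("0", +1, "inc"),   # carry propagates rightward
    ("inc", "0"): ("1",  0, "halt"),  # absorb the carry and stop
}

print(run("1101", RULES))   # "1101" (LSB first) is 11; result "0011" is 12
```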
 
In the Turing model the tape is the only memory, with the machinery reading and writing values to the tape. The tape contents, together with the machine's state table, control what the machine does.

It is the most fundamental model of a computer and is used in exploring many fundamental computer-science theorems, most notably the halting problem.
All well and good, but if I wanted to know about computer architectures and such, I would want to work with some actual hardware.
 
All well and good, but if I wanted to know about computer architectures and such, I would want to work with some actual hardware.
This is true. However, I was responding to the OP's markup of the Turing photo, where two memories were identified, and I wanted to make it clear that only the tape is the memory.

I was thinking that a discussion using the 6800 CPU might help in this thread, since it had memory-mapped I/O, which eliminates some complexity from the discussion.

But I'm still thinking about how to summarize it in a post, even though we know there is so much more to how a computer actually works. It might make a good Insights article, though.
 
