AdamF
Is this thread appropriate for some basic computer science hardware and architecture questions or are the posts in here to be strictly related to specific Programming Languages?
Mark44 said:If the question is computer-sciencey, this section is probably as good as any other.
BINGO!

AdamF said:The way that I understand computer science at this point is that there are really two segments;
1 - How do I make the computer do what I want it to do?
2 - How would the computer be able to do what I want to make it do?
After you learn the syntax of some programming language, you essentially "explain" to the computer what you want it to do.

AdamF said:1 - How do I make the computer do what I want it to do?
Now you're getting into the architecture of the computer -- its CPU (central processing unit), memory (registers, RAM, cache, and disk storage), I/O, and so on. Computer languages vary a lot in how much or how little they insulate you from the hardware itself. At the lowest levels are assembly languages, with different languages for different architectures. A bit higher level is C, and higher yet are C++, Java, C#, Python, and many others.

AdamF said:2 - How would the computer be able to do what I want to make it do?
Mark44 said:After you learn the syntax of some programming language, you essentially "explain" to the computer what you want it to do.
Now you're getting into the architecture of the computer -- its CPU (central processing unit), memory (registers, RAM, cache, and disk storage), I/O, and so on. Computer languages vary a lot in how much or how little they insulate you from the hardware itself. At the lowest levels are assembly languages, with different languages for different architectures. A bit higher level is C, and higher yet are C++, Java, C#, Python, and many others.
This is somewhat confused. The term "register" refers to a limited number of named memory locations inside the CPU. The term "registry" is specific to Windows (I believe), and refers to a block of memory where lots of internal stuff is kept track of.

AdamF said:Okay, so here's some of what I'm trying to understand at the moment;
Let's say you want to command the computer to "Move the contents of Registry address N to Registry address M" and the way that you can do this is by typing the command directly into the keyboard.
Well, no. It's only going to carry out the kinds of commands that it understands.

AdamF said:You turn the computer on, it goes to a blank (black) screen with a cursor ready for you to type in your command. You type in the command above, and the computer obeys.
There are "commands" (normally called instructions) that can read the contents of memory locations, copy the contents between registers, copy the contents between registers and memory, read values from an input device or send values to an output device, and lots of other operations. All data on a computer is stored in binary form, as strings of 0s and 1s. This includes numbers, characters, images, whatever.

AdamF said:Here's how I currently understand what happens in that situation -- I'm hoping somebody can point out whether this is accurate or where it isn't accurate:
- The "contents" of Registry Address N are sitting there existing in a way that they are expressed as some kind of "state" of the computer hardware; this "state" is related to the output of a series of circuits which use various logic gates to use binary encoding. (I'm not sure if "encoding" is the right word to describe what the binary is doing, but the circuits are basically creating the output of 0's and 1's which somehow encapsulate the information.)
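Mark44's list of instruction types (load, store, move between registers, add) can be sketched as a toy interpreter. The opcode names and three-part instruction format below are invented for illustration, not any real CPU's instruction set:

```python
# A toy machine with a few registers and a small memory, illustrating the
# kinds of low-level instructions described above. Opcode names are made up.

registers = {"R0": 0, "R1": 0}
memory = [0] * 8

def execute(instruction):
    op, *args = instruction
    if op == "LOAD":        # copy a value from memory into a register
        reg, addr = args
        registers[reg] = memory[addr]
    elif op == "STORE":     # copy a register's value into memory
        reg, addr = args
        memory[addr] = registers[reg]
    elif op == "MOVE":      # copy between registers
        dst, src = args
        registers[dst] = registers[src]
    elif op == "ADD":       # add one register into another
        dst, src = args
        registers[dst] += registers[src]

memory[3] = 41
execute(("LOAD", "R0", 3))    # R0 = memory[3]
execute(("MOVE", "R1", "R0")) # R1 = R0
execute(("ADD", "R0", "R1"))  # R0 = 41 + 41
execute(("STORE", "R0", 4))   # memory[4] = R0
```

A real CPU does the same kind of dispatch in hardware, decoding a binary opcode instead of comparing strings.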
AdamF said:- Next, I type my command on the screen. This is where I'm a little less clear on what happens -- Did the people who made the computer literally wire the circuitry from the keyboard to the registry such that the combination of symbols in the command "Move the contents of Registry address N to Registry address M" would use the wiring to communicate with the registry electronically?
No, that's not at all how things work. Things are complicated enough that I can't explain all of what you're asking here in an internet forum. Take a look at any of the links that @jedishrfu provided, which should help with some of your misconceptions.

AdamF said:In other words, the computer used the laws of Physics to rig the whole thing up this way from the beginning and then later just tacked on the things like the letters and numbers of the keyboard and the monitor to make it make sense to me as the user?
The computer maker basically had to take into account all of the things they thought it would be reasonable for me to expect to do, and then build the machine with circuits and tech in order for me to be able to do those things in a way that makes sense to me?
AdamF said:The place where I'm kind of getting stuck is where the interface between the user commands and the actual technology really is, and what it means for the information to be "encoded in 1's and 0's", but I'm starting to kind of get it.
Is this all kind of a way more complex version of how a tribe would "encode" information in smoke signals by getting together and deciding "Okay, three smoke rings means X and five smoke rings means Y" and then it fell back on the user to interpret the smoke when they saw it? (In this case, the "smoke signals" would be the output in the form of whatever appears on the GUI.)
It's not "trained" to understand commands -- it's designed to execute them. At the lowest levels, the commands it responds to are those to move data here and there, load a register with a value from memory, store the value in a register in memory, add or subtract two values, and a lot more.

AdamF said:-- I guess I'm having the most trouble understanding first how the computer is trained to understand various commands, (and then from there how it executes various commands from my own head to my fingers/voice and then to the computer and then back to my eyes/ears, but I don't want to get ahead of myself...)
I think your idea is accurate, but I'm not sure that it's helpful. In addition to the hardware in a computer (corresponding to the beads, frame, and wires of an abacus), there is the software, which exists at several levels, from the BIOS (basic input/output system) that lives in ROM chips, to the operating system, which also exists in multiple levels, to driver software, to user programs. This is an overly simplistic explanation, but I hope you get the idea.

AdamF said:The hardware is the beads, the frame, the string, and the other physical components. (Okay, I'm confident this part is correct.)
The software would be (example) the "addition" program which uses the proper arithmetic algorithm, and the algorithm is performed using the hardware for the purpose of adding two numbers together, so that in this case the addition program/"software" for the Abacus actually lives in the user's head, and not inside the computer itself?
Mark44 said:It's not "trained" to understand commands -- it's designed to execute them. At the lowest levels, the commands it responds to are those to move data here and there, load a register with a value from memory, store the value in a register in memory, add or subtract two values, and a lot more.
jedishrfu said:In order to understand a computer at its most basic level, you really need to take a course or even several courses on how it works. The concepts are easy but it’s even easier to get fooled into believing it works in some other fashion.
There’s the CPU and its opcodes. The CPU is an incredibly complex chip composed of hundreds of thousands of gates. Each gate can represent different kinds of logic operations. One such example is a flip-flop, which can be used to store a bit or toggle a bit.
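The flip-flop mentioned above can be imitated in software. Here is a minimal sketch of an SR (set-reset) latch built from two cross-coupled NOR gates, one classic way a single bit gets "stored" in logic gates:

```python
# A software sketch of an SR (set-reset) latch: two cross-coupled NOR gates
# that settle into one of two stable states, holding a single bit.

def nor(a, b):
    return 0 if (a or b) else 1

def sr_latch(s, r, q, q_bar):
    # Let the cross-coupled gates settle into a stable state.
    for _ in range(4):
        q, q_bar = nor(r, q_bar), nor(s, q)
    return q, q_bar

q, q_bar = 0, 1
q, q_bar = sr_latch(s=1, r=0, q=q, q_bar=q_bar)  # set: q becomes 1
q, q_bar = sr_latch(s=0, r=0, q=q, q_bar=q_bar)  # inputs idle: the bit is held
assert q == 1
q, q_bar = sr_latch(s=0, r=1, q=q, q_bar=q_bar)  # reset: q becomes 0
assert q == 0
```

The "hold" step is the point: with both inputs idle, the latch keeps remembering its last value, which is what makes it memory.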
Start with the Khan Academy and then start asking questions; don't try to theorize things until you've almost got it.
https://www.khanacademy.org/computi...my-and-codeorg-introducing-how-computers-work
The simplest computer is a Turing machine and there is an excellent working model in LEGO on YouTube if you search for it. Watch how it reads ops from its tape and then modifies its tape. The tape is the memory and the machinery around it is the CPU.
View attachment 240985
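The tape-reading loop described above fits in a few lines. This is a hedged sketch of a Turing machine whose (made-up) rule table simply inverts a string of bits and halts at the blank symbol:

```python
# A minimal Turing machine: a tape, a head position, and a state-transition
# table mapping (state, symbol) -> (symbol to write, move, next state).

def run(tape, rules, state="scan"):
    tape = list(tape)
    pos = 0
    while state != "halt":
        symbol = tape[pos]
        write, move, state = rules[(state, symbol)]
        tape[pos] = write
        pos += 1 if move == "R" else -1
    return "".join(tape)

# A toy rule table: flip each bit, stop at the blank marker '_'.
rules = {
    ("scan", "0"): ("1", "R", "scan"),
    ("scan", "1"): ("0", "R", "scan"),
    ("scan", "_"): ("_", "L", "halt"),
}

print(run("1011_", rules))  # -> 0100_
```

Everything the machine "knows" lives in the rule table; the loop itself never changes, which is the sense in which the tape controls what the machine does.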
Here's a figure that represents the logic gates that might be involved in ANDing two bits.

AdamF said:Data types and programming languages seem far more intuitive to me -- it's understanding how the commands interact on a physical level that's where most of my questions are at this point, it's probably because I need to learn about circuits and electricity.
A better analogy is a bank of light switches, with some up (on) and some down (off). A switch in the on position could indicate 1 and a switch in the off position could indicate 0. The computer's memory, these days, consists of billions of these switches.

AdamF said:Is it accurate to say that when I hear that information is encoded in transistors and capacitors, this is kind of like how the gears of a mechanical watch encode the time, as in we design the machine using the laws of physics to give us back information in such a way that our human brains can make meaning out of the output from the machine?
Mark44 said:Here's a figure that represents the logic gates that might be involved in ANDing two bits.
View attachment 240986
The two bits come in on the lines marked a and b. A signal comes in on the line marked Operation, which controls which of the two operations to perform -- an AND of the two bits or an OR of the two bits. If the Operation signal is AND, and both bits are 1, the result is 1. If either or both bits are 0, the result is 0. The figure also contains logic gates to perform the OR of two bits.
A CPU consists of a large number of logic gates and controller hardware.
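The figure's behavior can be sketched as a tiny function: bits a and b come in, and the Operation line selects which gate's output becomes the Result. This is a software analogy of the figure, not the actual gate wiring:

```python
# A one-bit AND/OR unit: both gates compute their outputs, and the Operation
# line selects which one is passed through as the Result.

def one_bit_unit(a, b, operation):
    and_out = a & b          # output of the AND gate
    or_out = a | b           # output of the OR gate
    return and_out if operation == "AND" else or_out

assert one_bit_unit(1, 1, "AND") == 1
assert one_bit_unit(1, 0, "AND") == 0
assert one_bit_unit(1, 0, "OR") == 1
```

A real ALU is built the same way conceptually: every operation's circuit runs on every cycle, and a selector chooses which output to keep.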
A better analogy is a bank of light switches, with some up (on) and some down (off). A switch in the on position could indicate 1 and a switch in the off position could indicate 0. The computer's memory, these days, consists of billions of these switches.
AdamF said:how does the computer take electricity and use some of that electricity to store numbers, some to store sound, some to store images, etc...?
It stores the 'voltage' that represents the result of the logic gates on capacitors. The main memory is composed of very tiny capacitors and a whole bunch of transistors. The transistors in main memory implement three different functions:

AdamF said:For example, how does the computer take electricity and use some of that electricity to store numbers, some to store sound, some to store images, etc...?
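The capacitor scheme described above can be mimicked in software: the charge leaks away, so the stored bit must periodically be read and rewritten ("refreshed"). The leak rate and threshold below are invented numbers; real DRAM cells are refreshed thousands of times per second:

```python
# A rough software analogy for a DRAM cell: a leaky capacitor plus a
# sense-amplifier threshold, kept alive by periodic refresh.

class DramCell:
    def __init__(self):
        self.charge = 0.0            # capacitor charge, 0.0 .. 1.0

    def write(self, bit):
        self.charge = 1.0 if bit else 0.0

    def read(self):                  # sense amplifier: threshold the charge
        return 1 if self.charge > 0.5 else 0

    def leak(self):                  # charge slowly drains between refreshes
        self.charge *= 0.9

    def refresh(self):               # read the bit, write it back at full strength
        self.write(self.read())

cell = DramCell()
cell.write(1)
for _ in range(3):
    cell.leak()                      # charge decays but still reads as 1
cell.refresh()                       # restored to full charge
```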
Right, I phrased my question poorly. The computer translates all data sources to bits, and then spits them back out again in other forms (as per the design that the human programmer has in mind), as far as I understand. What I was meaning to ask was "How are various sources of information translated to one source which the machine understands and then back again"?

PeterDonis said:These are all one question, because the computer does not store numbers, sound, or images; it just stores bits. Whether those bits encode numbers, sound, or images, depends on how those bits interact with other bits (the bits in the programs that process numbers or sound or images), which ultimately depends on the interpretation that human programmers wanted to put on particular bits. But they're all just bits to the computer, and it stores them all the same way.
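The point that the interpretation, not the storage, decides what bits "are" can be demonstrated directly: here the same four bytes are read as text, as an integer, and as a floating-point number.

```python
import struct

# The same four bytes, interpreted three different ways. The bytes don't
# change; only the interpretation does.

raw = b"ABCD"
as_text = raw.decode("ascii")         # the characters 'ABCD'
as_int = struct.unpack("<I", raw)[0]  # one 32-bit little-endian integer
as_float = struct.unpack("<f", raw)[0]  # the same bits as a 32-bit float

print(as_text, as_int, as_float)
```

Nothing in the bytes themselves says which reading is "right"; that lives entirely in the program doing the reading.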
AdamF said:I really don't understand what happens between the time that I type the thought in and the time that the text is reflected back to me on the screen.
AdamF said:Is this where things like the way that the keyboard is wired into the memory and the memory is wired into the computer monitor come in?
Well, that depends on how deep a scratch you want to dig.

AdamF said:Okay, so to really learn about this to the point where I can understand why every single decision on a particular piece of architecture was made (or the entire machine for that matter) and exactly how I'd go about reconstructing each piece from scratch, I'll need to become extremely knowledgeable on the relevant aspects of Classical E-M and some Quantum Chemistry?
PeterDonis said:Ah, ok. See below.
Yes, but the "wiring" is a lot more complicated, at least in all modern computers (where "modern" here means "since about 1970 or so").
As far as "translating" things to and from bits is concerned, though, that happens very close to your typing and seeing things on your screen. Your keyboard generates a sequence of bits for each key you type and sends them to the computer; and your computer's monitor translates sequences of bits into what appears on your screen. Everything else in between is bits, and you can think about them abstractly without even having to know the details of how they are physically represented in things like transistors and capacitors and currents in wires. And it's virtually impossible for a single human mind to grasp what's going on in a modern computer without thinking about the bits abstractly; things at the level of transistors and capacitors and currents in wires are way, way too complicated to be able to comprehend at that level while also comprehending how all those things connect to the keystrokes you type and the things you see on your screen. You have to abstract away the actual physical hardware and focus on the bits (and indeed on things at even higher levels of abstraction than that) if you want to understand what's going on at the level of keystrokes and images on screens.
Tom.G said:Well, that depends on how deep a scratch you want to dig.
You don't really need to get Quantum deep if you can accept that:
- a transistor conducts current proportional to its input <current or voltage> (depends on whether it is a bipolar or field-effect transistor, usually field-effect these days)
- a capacitor can hold a charge (as evidenced by the fact that you can measure a voltage on it)
A rough concept of a resistor and a diode would be useful if you get down to the transistor circuit level.
Cheers,
Tom
AdamF said:The specific sequence of bits which are generated by the keyboard are somehow related to the ASCII and Unicode stuff that I've been reading about in my Intro to Computer Science text?
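The ASCII and Unicode relationship asked about here can be inspected directly in Python: characters map to numeric code points, and an encoding such as UTF-8 maps code points to the bytes that actually get stored:

```python
# Characters, code points, and bytes: ASCII/Unicode assign the numbers,
# an encoding (here UTF-8) turns the numbers into stored bytes.

assert ord("A") == 65                        # code point for 'A'
assert bin(65) == "0b1000001"                # the bit pattern behind it
assert "A".encode("utf-8") == b"\x41"        # ASCII characters take 1 byte in UTF-8
assert "é".encode("utf-8") == b"\xc3\xa9"    # non-ASCII characters take more bytes
assert b"\xc3\xa9".decode("utf-8") == "é"    # decoding reverses the mapping
```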
AdamF said:The bits, which are represented by voltage and subsequent current through the wiring, somehow reach the monitor in such a way that utilizes the design of all of this to interact with materials which display light in the appropriate place when the electricity hit it, or something of this nature?
AdamF said:My objective is to reach the point where I can de-construct today's PC and understand exactly why every decision was made the way it was on the level of exactly what the creator was thinking when they made that decision, and then understand how to construct it myself if I needed to, from sourcing the appropriate materials, to being able to look at any line in the code of the Operating System and understand what it's doing, to understanding why the circuits are built the way they are, etc...
As @PeterDonis implied, this isn't a practical goal, as it would take more than a lifetime to obtain this knowledge. A better strategy, IMO, would be to focus on a much simpler processor (someone suggested an Intel 8080) and delve into things at either the hardware level (electrical engineering) or at the software level (computer or software engineering). For myself, I have only a little interest in what goes on at the level of transistors, but am much more interested in what happens in the CPU, from both assembly language and higher-level language perspectives.

AdamF said:My objective is to reach the point where I can de-construct today's PC and understand exactly why every decision was made the way it was on the level of exactly what the creator was thinking when they made that decision, and then understand how to construct it myself if I needed to, from sourcing the appropriate materials, to being able to look at any line in the code of the Operating System and understand what it's doing, to understanding why the circuits are built the way they are, etc...
In which case, you're very far away from modern computers.

AdamF said:Hence, my initial interest in models which do not even involve electricity.
Documentation for the C64 isn't missing. You can still find everything online.

Rive said:6502 is a good CPU, but I would rather recommend an available one instead - even if you get an old Commodore or like, it'll still have a few abstraction layers, but you will have to fight the missing documentation too...
In the short-lived era of CRT-based memories, it could be considered that the computer did store "images", and the CRT's scanning was used to interpret the "pixels" on the monitor screen as bits.

PeterDonis said:These are all one question, because the computer does not store numbers, sound, or images; it just stores bits.
I'm not aware of any standard PC keyboard where the scan code equals the ASCII code. For example pushing down on the letter A on a keyboard produces a scan code of hex 1E or hex 1C (depending on scan code set), and doesn't correspond to hex 41 (ASCII upper case A) or hex 61 (ASCII lower case A).

PeterDonis said:Keyboards generate scan codes--sequences of bits that tell the computer what keys were pressed and in what order. ASCII and Unicode are interpretations of sequences of bits as text. It so happens that, if you press an ordinary alphabetic key on the keyboard, the scan code (sequence of bits) that gets sent to your computer corresponds to the ASCII code (sequence of bits) for the letter that's on the key.
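rcgldr's distinction can be made concrete with a tiny lookup table. The values below are the standard scan code set 1 "make" codes for these few keys; the mapping function itself is only an illustration of what a keyboard driver does:

```python
# A tiny fragment of PC scan code set 1: the key's scan code is a different
# number from the ASCII code of the letter printed on the key.

SCAN_SET_1 = {0x1E: "a", 0x30: "b", 0x2E: "c", 0x39: " "}

def to_character(scan_code):
    # What a (very simplified) keyboard driver does: map scan code -> character.
    return SCAN_SET_1.get(scan_code)

assert to_character(0x1E) == "a"   # scan code 0x1E for the A key...
assert ord("a") == 0x61            # ...while ASCII 'a' is the number 0x61
```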
I don't know about the Commodore 64, but the Atari 8 bit family of computers: 400 / 800 / 65 XE / 130 XE are also 6502 based (they run at 2 MHz), and there are forums at web sites like AtariAge that have a lot of information on the Atari 8 bit systems and their peripherals, and access to toolsets (assemblers, BASIC interpreters, ...). There are now Atari peripheral bus to USB interfaces that allow a PC to act as an Atari peripheral (usually as a floppy disk).

Rive said:6502 is a good CPU, but I would rather recommend an available one instead - even if you get an old Commodore or like, it'll still have a few abstraction layers, but you will have to fight the missing documentation too.
rcgldr said:I'm not aware of any standard PC keyboard where the scan code equals the ASCII code.
jedishrfu said:Check out the Turing Tumble toy at turingtumble.com. It's a marble-based computer.
Each marble represents a digital pulse. The marble gates represent electronic ones which activate when a digital pulse is input.
The toy can do binary arithmetic as well as solve logic puzzles and play logic games.
This illustrates some aspects of how a cpu works.
jedishrfu said:In general things are very abstracted in computers nowadays. Keyboard scan codes identify the key that is pressed. A device driver maps it to the locale specific code point to be stored in an input buffer.
Some examples of locale specific code sets are ASCII, extended ASCII, and Unicode.
Later at display time, the locale code point selects a character from the specified font table and the font data is rendered on screen or printed.
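The code-point-to-glyph step described above can be sketched with a toy bitmap font: the code point selects a glyph from the font table, and the glyph's bits decide which "pixels" get drawn. The 5x5 'T' shape here is made up for illustration:

```python
# A toy bitmap "font table": each character maps to rows of bits, and
# rendering turns those bits into visible marks.

FONT = {
    "T": [0b11111,
          0b00100,
          0b00100,
          0b00100,
          0b00100],
}

def render(ch):
    # Turn each row of bits into '#' (pixel on) and ' ' (pixel off).
    return ["".join("#" if row & (1 << (4 - col)) else " " for col in range(5))
            for row in FONT[ch]]

print("\n".join(render("T")))
```

A real font table works the same way in principle, just with far more glyphs, higher resolution, and (for outline fonts) curves instead of fixed bitmaps.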
Yes. There seems to be some research on optical computers, in which the signals would be sent by lasers, but there are difficulties converting between optical and electrical signals, so this type of computer seems to be down the road a few years.

AdamF said:Is the main reason that electricity is used for the modern computer is because electricity moves so much faster?
I don't think so. The bottom area just collects the marbles that have gone through the various doodads. They don't "remember" what gates they have gone through. A better analogy I think would be a battery that powers a cellphone.

AdamF said:By the way, in the Turing Tumble, the bottom area that collects the marbles would be analogous to a form of "memory", right?
This is a better analogy than some you've come up with. The switches in a house correspond pretty closely with the flip-flops in a static RAM chip, which hold their state without needing to be refreshed, in contrast to dynamic RAM, which stores each bit as charge on a tiny capacitor that must be refreshed constantly. Both kinds lose their contents when the power goes off, though; the memory in solid-state drives that survives a power-off is flash, a different technology again.

AdamF said:Okay, so the idea of the computer memory is kind of like the wiring in a house, where you can turn off the power to the house, but when you turn it on again, the electricity will flow to the areas that were on before because the circuits are still being completed there, regardless of whether or not they have power?
All well and good, but if I wanted to know about computer architectures and such, I would want to work with some actual hardware.

jedishrfu said:In the Turing model the tape is the only memory with the machinery reading and writing values to the tape. The tape controls what the machine does.
It is the most fundamental of computers and is used in exploring many fundamental computer science theorems, most notably the halting problem.
Mark44 said:All well and good, but if I wanted to know about computer architectures and such, I would want to work with some actual hardware.