I wish to design (if not actually build) an ultra-simple "computer" from scratch, just for fun and profit. I'm looking at only a fundamental level of computing power, with only one instruction, such as: "Given two single-digit numbers, output the product." I need to design 3 things:

1] The algorithm for multiplication - likely in binary.
2] The assembly language that controls how the instructions are executed (store this value here, jump to that storage space, read its value, etc.)
3] The parts themselves: the memory, the input buffer, the output buffer, the logic gate(s).

I'm sure I can figure out 1] myself. (Take digit 1, start at the rightmost placeholder, logically AND it against the corresponding bit of the other number, proceed to the next placeholder, repeat, decrement loop, etc.) What I'd like help with is 2] and 3]. More specifically, how can I find or write the assembly language "programming" that will make this happen? And what physical components are required that I've left out? Registers? Cache?
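As a sanity check for 1], the shift-and-add loop sketched above might look like this in Python. This is just my own illustration of the algorithm (the function name and loop structure are invented), not a fixed design:

```python
def multiply(a, b):
    """Shift-and-add multiplication of two single-digit numbers.

    Walks the bits of b starting at the rightmost placeholder; each
    bit decides (via AND) whether a shifted copy of a gets added in,
    which is the loop the post above sketches.
    """
    product = 0
    shifted_a = a
    while b > 0:
        if b & 1:            # rightmost bit of b is 1
            product += shifted_a
        shifted_a <<= 1      # shift a left one place
        b >>= 1              # move to the next placeholder
    return product
```

In hardware, each iteration of that loop becomes a gated adder stage plus a shift, which is why the algorithm maps so cleanly onto discrete logic.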
Hi Dave, Sounds like a fun project. I did that project back in Undergrad -- it was a whole quarter-long class lab in an intro to digital design and computer architecture class. Strictly speaking, in -2- you are referring to machine language, not assembly language. To use assembly language, you would need to write your own assembler to generate the machine code. That would be fun too, so maybe you can grow this project into that after you get the machine language part of it running. Here is a pretty useful website with lots of info about the different levels of computer design: http://dept-info.labri.u-bordeaux.fr/~strandh/Teaching/AMP/Common/Strandh-Tutorial/Dir.html Heck, when I did that lab back in the late '70s, we built the computer out of discrete logic. But now, you could design it in Verilog and implement it on a Xilinx demo board. Doing that would actually teach you even more skills!
It'd actually be pretty educational to develop it not only with K-maps and 74' series discrete logic, but also with a modern Verilog flow and a PLA or FPGA. Each approach shows you the same design from an entirely different perspective. A good undergraduate book on computer architecture and logic design would be indispensable. It's a fairly broad topic, so we couldn't document all the steps you'd need to take here, but we can certainly help you if you get stuck. - Warren
I built a simple ALU in a course I took a while ago. It supported basic operations such as add, subtract, logical AND/OR/NOT, etc., on 4-bit numbers. It was a lot of fun. The machine language kind of comes naturally, since your ALU must be able to identify which operation you want it to do. So you end up encoding each operation. Say Add is 001. Then to add two numbers you would pass the numbers AAAA and BBBB plus the operation onto the ALU, so 001AAAABBBB would be an instruction.
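That 11-bit format (3-bit opcode, two 4-bit operands) can be packed and unpacked with a couple of shifts and masks. A minimal sketch in Python, with the helper names made up for illustration:

```python
# Encode/decode the 11-bit instruction word described above:
# a 3-bit opcode followed by two 4-bit operands (001AAAABBBB).
OPCODES = {"ADD": 0b001}  # ADD = 001, as in the example

def encode(op, a, b):
    """Pack an opcode and two 4-bit operands into one 11-bit word."""
    return (OPCODES[op] << 8) | ((a & 0xF) << 4) | (b & 0xF)

def decode(word):
    """Split an 11-bit word back into (opcode, a, b)."""
    return word >> 8, (word >> 4) & 0xF, word & 0xF
```

The decode step is exactly what the control logic in the hardware version has to do with wires: route the top 3 bits to the operation selector and the two 4-bit fields to the ALU inputs.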
Um. I studied computing and used to compute on punch cards, and I have no idea what you guys are talking about, what with all these acronyms. :yuck: Nonetheless, I don't want to mislead you. I'm not looking to build an electronic computer, I'm looking to build one with ... alternate materials. But regardless of the materials, I'll still need the equivalent of all these components. "Say Add is 001. Then to add two numbers you would pass the numbers AAAA and BBBB plus the operation onto the ALU, so 001AAAABBBB would be an instruction." Right. This is what I'm thinkin'. So I need to figure out the nuts and bolts of what it means to have an instruction and the data it operates on. Well, that's going to be stored in more memory registers. And something somewhere - i.e. the central processor - "reads" that instruction and executes it in yet more memory registers.
Here's a question that will "bracket" the scope of the project for me: All interactions and spaces in a computer are some form of binary unit, either on or off, even the logic gates and processor. As a ballpark figure, how many on/off units - in total, i.e. including storing instructions, processor logic, tables, etc. - are we lookin' at to make an operable device?
Well, a working 4-bit computer really only needs two 4-bit registers, plus enough memory to store whatever program you wish it to run. A single instruction will minimally be a 4-bit opcode and one 4-bit operand. If you wanted to store a maximum of 10 instructions, you might be able to get away with a total of 88 "flip-flops" (10 instructions × 8 bits each, plus 2 registers × 4 bits each). Adding two numbers would involve three total instructions:

MOV R1, 50 ; move the number 50 into register R1
MOV R2, 05 ; move the number 05 into register R2
ADD R1, R2 ; add R1 and R2, leaving the result (55) in R1

You might also be able to get away with not using any registers at all, and only having one type of memory. Your logic gets much more complicated, though, since operands to instructions like ADD then involve indirection. Have you seen the article about the Altair in the latest Make magazine? - Warren
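That three-instruction program is easy to simulate in software before committing anything to hardware. Here's a sketch of Warren's two-register machine in Python; the tuple-based instruction format and the run loop are my own invention for illustration:

```python
def run(program):
    """Execute a list of (op, dst, src) tuples on a two-register machine.

    MOV loads an immediate value into a register; ADD adds a source
    register into a destination register, keeping the result there.
    """
    regs = {"R1": 0, "R2": 0}
    for op, dst, src in program:
        if op == "MOV":                 # MOV Rn, immediate
            regs[dst] = src
        elif op == "ADD":               # ADD Rd, Rs -> result in Rd
            regs[dst] = regs[dst] + regs[src]
    return regs

# The three-instruction example from the post above:
program = [
    ("MOV", "R1", 50),
    ("MOV", "R2", 5),
    ("ADD", "R1", "R2"),
]
```

Running it leaves 55 in R1, matching the comment on the ADD line. A simulator like this is also a handy way to debug programs before "burning" them into whatever memory the physical build uses.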
I'd say that's completely dependent on how many and what type of instructions you're thinking of implementing. I think the first step is to identify what your operations will be, for example:

Logical AND
Logical OR
Logical NOT
Shift Left
Shift Right
Compare a Number With 0
Compare Two Numbers
Addition
Subtraction
Multiplication
Division

These are the ones I can think of right now. You should also think about the number representation you'll use. For example, are you going to allow for negative numbers? Floating-point numbers? Floating point is probably a little much, but you can have negative numbers by using a two's complement representation. Once you have settled on these, you should go over their respective circuits (I would look up the most efficient algorithms for addition and multiplication, for example). This will give you an idea of how many gates you'll need, the kind of board you'll need, etc. By the way, have you thought about what clock you're going to use? What do you mean by building a computer with "alternate materials"?
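To make the two's complement point concrete, here's a sketch of a few of the operations above on 4-bit values, with everything masked back to 4 bits so negative results wrap around the way real hardware does. The function names and the dictionary-based dispatch are purely illustrative:

```python
MASK = 0xF  # keep every result to 4 bits

def alu(op, a, b=0):
    """A toy 4-bit ALU covering a handful of the listed operations."""
    ops = {
        "AND": a & b,
        "OR":  a | b,
        "NOT": ~a,
        "SHL": a << 1,
        "SHR": a >> 1,
        "ADD": a + b,
        "SUB": a + ((~b + 1) & MASK),  # subtract = add two's complement
    }
    return ops[op] & MASK

def to_signed(x):
    """Interpret a 4-bit value as a two's-complement signed number."""
    return x - 16 if x & 0b1000 else x
```

Notice that SUB reuses the adder: negate b (invert and add one) and add. That's the big practical win of two's complement in hardware - one adder circuit handles both addition and subtraction.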
I specified this in my OP: two single-digit numbers, multiplied together. That's it. I'll add one more stipulation: non-negative. Berkeman's link: http://dept-info.labri.u-bordeaux.fr...orial/Dir.html covers more of this than I could possibly have hoped to find in one place. The principles of computing are more universal than electronics. As long as you have ways of holding (storing) two states (i.e. 1 and 0), and a way of changing those states, you can build a device that computes. You could build it with water from a stream: if you have ways of holding and releasing discrete units, you can make a water-powered computer (albeit a very, very slow and very, very large one). In a SciAm mag years back, they claimed to have unearthed just such a device at an archeological dig. It was, of course, an April Fools' joke; nonetheless, the principle is sound.