What language do computers understand?

by Euclid
Tags: computers, language
Euclid
#1
Aug30-07, 11:28 AM
P: 214
I was reading about how to build a computer by yourself. The final step is installing the operating system.
I was wondering, how does the operating system give instructions to the computer? When I write a program in java, for example, I rely on the operating system as a crutch to pass my instructions along. How do I talk directly to a computer?
What if I wanted to build my own operating system? Let's say I wanted to make my computer one glorified calculator. So after I install my OS, every time I turn on my computer, it asks me for two numbers, adds them, displays the answer and asks for two more numbers. Is this hard to do?
Thanks!
DaveC426913
#2
Aug30-07, 11:39 AM
DaveC426913's Avatar
P: 15,319
Computers only understand one language: machine language, i.e. raw bit patterns like 01011101.

Humans write programs in a 2nd-generation language: assembly language (JMP 0061 00, MOV 2A1B 21A1, etc.). This forms a bridge that (a very few) humans can understand and that the computer can take instructions from. Assembly has very few instructions, and each does very little beyond the housekeeping required to pass individual bytes from memory unit to memory unit. The most complex code is built up from countless of these assembly-language instructions passing bits and bytes from location to location, from storage to processor, or combining them.
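For instance (x86 here, NASM syntax - a sketch only, and the exact byte values differ from processor to processor), each assembly mnemonic assembles into specific machine-code bytes:

    mov al, 5    ; assembles to the bytes B0 05
    add al, 3    ; assembles to the bytes 04 03 - AL now holds 8

The assembler is little more than a lookup table from mnemonics to bytes, which is why assembly counts as only a "2nd generation" step up from machine language.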

A third-generation language is one where we write at least moderately legible code. It goes on up to fifth gen, where the programmers are more and more abstracted from the actual computer, and the lower-gen languages do much of the manual labour.


So, look up 'machine language' and 'assembly language'.


And yes, it's very hard to do. Partly because it is very rapidly becoming a lost art.
Euclid
#3
Aug30-07, 01:55 PM
P: 214
Hi DaveC426913,
Thank you for the response.
This is very interesting. How hard is very hard? Roughly speaking, would it take me days, weeks, months or years to get up to speed?
What if I assume the existence of some operating system? Can I use the OS to directly manipulate my memory? By this I mean, can I look directly at a section of the memory and see what is stored there and then change some 0 to a 1? Or does the typical OS restrict this type of direct manipulation (maybe for safety purposes)?

DaveC426913
#4
Aug30-07, 02:17 PM
DaveC426913's Avatar
P: 15,319
Quote by Euclid:
This is very interesting. How hard is very hard? Roughly speaking, would it take me days, weeks, months or years to get up to speed?
Like anything, it's entirely proportional to how complex you want to get.

To get the gist, I'll bet you can download some sort of hex/memory editor that will allow you to view and edit the contents of memory.

To do anything useful might require a book on the subject.

Quote by Euclid:
What if I assume the existence of some operating system? Can I use the OS to directly manipulate my memory? By this I mean, can I look directly at a section of the memory and see what is stored there and then change some 0 to a 1? Or does the typical OS restrict this type of direct manipulation (maybe for safety purposes)?
Yes, you can do that.

This was very popular years ago with older computers like the Commodores, where BASIC's PEEK and POKE commands would read and write memory locations directly.
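On a DOS-era PC the same trick works; a sketch (x86 real mode, NASM syntax, assuming the standard colour text mode whose screen memory starts at segment B800):

    mov ax, 0xB800         ; text-mode screen memory lives at segment B800
    mov es, ax
    mov byte [es:0], 'A'   ; poke the top-left character on the screen
    mov byte [es:1], 0x0F  ; poke its attribute: bright white on black

Change a byte there and the character on screen changes instantly, with no OS call involved.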

Um, I'm not going to be able to point you to the right people to ask about this, but Wiki articles on the subject - particularly the external references at the bottom of the articles - might set you in the right direction.
mgb_phys
#5
Aug30-07, 02:17 PM
Sci Advisor
HW Helper
P: 8,953
Generally, operating systems limit the memory that a user program can alter - for safety, as you say.
Earlier PC systems like DOS/Win3.1/Win95 didn't, and a faulty program could crash the whole machine.

An operating system also needs a way of controlling the hardware.
In simple systems like DOS this is done through the BIOS, the minimal system that loads from a ROM chip when your machine starts - it's what prints the text about the number of disks and the memory size before Windows starts.
You can quite easily write DOS programs that talk directly to this BIOS to read the keyboard and send characters to the screen. See Peter Norton's books on assembly-language programming.
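As a sketch of how direct that is (x86 real mode, NASM syntax; assemble with 'nasm -f bin' and it runs as a DOS .COM file), reading one key and echoing it to the screen takes just a few BIOS/DOS calls:

    org 0x100          ; DOS loads .COM programs at offset 100h
    mov ah, 0x00       ; BIOS keyboard service: wait for a keystroke
    int 0x16           ; the character comes back in AL
    mov ah, 0x0E       ; BIOS video service: teletype output
    int 0x10           ; print the character in AL
    mov ax, 0x4C00     ; DOS service: terminate the program
    int 0x21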

Writing a modern operating system is another matter - Linux took about a year just to get a basic system started. It was originally based on a simple teaching version of Unix called Minix; if you are interested in understanding operating systems you should look at it - it was written by Andrew Tanenbaum and is free.
chroot
#6
Aug30-07, 02:21 PM
Emeritus
Sci Advisor
PF Gold
chroot's Avatar
P: 10,427
You can write your own "operating system" capable of accepting keyboard input and displaying something on a VGA screen in about three or four pages of assembly code. It might take you a week if you're new to it; only hours if you're more experienced. Many university computer engineering programs have classes that do exactly these kinds of things.

Get yourself a book on x86 assembly and PC architecture, and it will probably give you everything you need to know. All that you're going to do is write calls to the machine's BIOS to do your keyboard input and text output. Calling the BIOS basically just means stuffing numbers into specific registers and raising software interrupts.
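Something along these lines (x86 real mode, NASM syntax - a minimal sketch, so treat it as illustrative) is already a complete, self-booting "operating system" that echoes whatever you type:

    org 0x7C00             ; the BIOS loads the boot sector here and jumps to it
    start:
        mov ah, 0x00       ; BIOS: wait for a keystroke (returned in AL)
        int 0x16
        mov ah, 0x0E       ; BIOS: teletype output of the character in AL
        int 0x10
        jmp start          ; do it forever
    times 510-($-$$) db 0  ; pad the sector out to 510 bytes
    dw 0xAA55              ; the mandatory boot signature

Assemble it flat (nasm -f bin), write it to the first sector of a floppy image, and boot it in an emulator like Bochs - no operating system underneath it at all.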

- Warren
eieio
#7
Aug30-07, 02:29 PM
P: 28
Check out http://www.osdev.org/wiki/Main_Page which has a decent collection of beginner material, and even some advanced material.
rcgldr
#8
Aug31-07, 01:23 AM
HW Helper
P: 7,110
You don't need an operating system for the simplest of programs. Intel used to include a "traffic signal" program at the back of the 8051 CPU manual; there was no external memory, just the on-chip registers. The program would loop a fixed number of times to delay for so many seconds, and it could "read" sensors via the 8051's input pins, switching the traffic signal based on the sensors and the elapsed time.
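The delay part is nothing more than a countdown; in x86 terms (NASM syntax - a fragment for illustration, with the loop count made up) it looks like:

        mov ecx, 1000000   ; loop count picked purely for illustration
    delay:
        dec ecx            ; count down...
        jnz delay          ; ...and jump back until ECX reaches zero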

What makes a computer (versus a calculator) is the ability to do conditional jumps: test for something, then conditionally jump to a new set of instructions, which gives a computer the ability to "make decisions" and "act" upon them.
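In code, that "decision" is just a compare followed by a conditional jump. A sketch in x86 (NASM syntax, with a made-up sensor location):

        mov al, [sensor]   ; load a (hypothetical) sensor reading
        cmp al, 0          ; test it: is a car waiting?
        jne car_waiting    ; conditional jump - taken only if the test says yes
        ; ... no car: leave the light red ...
        jmp done
    car_waiting:
        ; ... car waiting: switch the light to green ...
    done: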

Here's a link that describes the kinds of basic circuits used in computers, the "latch" being one of the important ones, since it functions as a type of "memory". If you go to the home page, you'll find more basic computer stuff.

http://www.play-hookey.com/digital

Regarding the generation of languages, here's a Wiki link:

http://en.wikipedia.org/wiki/First-g...mming_language

What Wiki shows as an alternate definition of "5th generation language" - GUI input, source code output - used to be called a "4th generation language". If I remember correctly, Think C was one of the first of these that was widely available, made for the Macintosh, which otherwise (using MPW - Macintosh Programmer's Workshop) was a very programmer-unfriendly environment (until OS X).

Update - depending on the computer, there's a layer below "machine language". It used to be called microcode, before "microprocessor" became a popular term. On a CPU with a limited instruction set, such as the AMD 2901 bit-slice CPU that older minicomputers were based on, a typical implementation of an instruction's "opcode" was to index into an 80-bit-wide array in which each bit triggered an operation. There were only 80 possible CPU operations (copy A to B, copy B to A, add A to B, xor B to A, ...), so this scheme was reasonably efficient. On other machines, the microcode more closely resembles yet another instruction set.

On mainframes there may be a lot of machine-language instructions in the instruction set. The IBM 360, which the 390 is a derivative of, includes instructions that can do math on variable-length BCD (binary coded decimal, 4 bits per digit) fields, and even copy/format the BCD fields into byte-oriented fields. Not all of the older IBM mainframes truly implemented all of these machine-language instructions in hardware; instead a "trap" would occur and the instruction would be emulated.
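As an aside, x86 has a tiny cousin of those BCD instructions (NASM syntax, 16/32-bit code only - the DAA instruction was dropped in 64-bit mode):

    mov al, 0x19   ; packed BCD for decimal 19
    add al, 0x05   ; a plain binary add leaves 0x1E in AL
    daa            ; decimal adjust: AL becomes 0x24, i.e. decimal 24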

Intel CPUs have a lot of instructions as well, such as the ability to add an immediate value to a memory location without using any registers.
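For instance (NASM syntax, a fragment for illustration):

    counter dd 0             ; a 32-bit value sitting in memory
    ...
    add dword [counter], 5   ; read-modify-write straight to memory - no register touched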

