Question about inventing a language program

AI Thread Summary
The discussion centers on the origins of FORTRAN and the concept of compilers, highlighting that the creation of a programming language like FORTRAN involves developing a compiler that translates high-level syntax into machine code. The initial compiler must be written in the computer's native machine instruction set, which consists of binary instructions that the CPU recognizes. This raises questions about how the first compiler was created and whether it required a specific computer architecture to function. The conversation also touches on the historical context of programming languages, noting that Grace Murray Hopper's earlier compiler work paved the way for languages like FORTRAN and COBOL. An example from the early 1970s illustrates how programming was done using physical switches to input machine instructions directly into memory. The discussion emphasizes the foundational role of native instruction sets in the evolution of programming languages and compilers.
fluidistic
I am wondering how the man who invented FORTRAN did it.
More precisely:
It is not really the language that he invented but rather a compiler that understands some syntax that we call a "language". So how did he write that compiler? To work at all, wouldn't it need a compiler itself? In other words, the compiler is itself a program. Am I right?
And to continue... how was the first program invented? I mean, how was it written, and then compiled or executed?
 
The first compiler (or similar "language" tool) for any computer has to be written in that computer's native machine instruction set. And yes, the compiler is itself just a program that translates one kind of command into raw machine instructions. Once you have the first compiler, other languages can be implemented for the same computer with less effort, by using the first language to program the second language's compiler.
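Just to illustrate that a compiler is only a program: below is a minimal, made-up sketch in C (not anyone's historical code, and really a toy assembler rather than a full compiler, but the principle is the same). It reads lines of a one-instruction-per-line "language" and emits invented 32-bit machine words, with the opcode values and encoding entirely hypothetical.

Code:
#include <stdio.h>
#include <stdint.h>
#include <string.h>

/* Invented opcodes for a hypothetical machine -- purely illustrative. */
enum { OP_ADD = 0x01, OP_LOAD = 0x02, OP_HALT = 0xFF };

/* Translate one source line ("LOAD 5", "ADD 100", "HALT") into a
   32-bit word: 8-bit opcode in the high byte, operand in the low bits. */
static int assemble_line(const char *line, uint32_t *word)
{
    char mnemonic[16];
    unsigned operand = 0;
    if (sscanf(line, "%15s %u", mnemonic, &operand) < 1)
        return 0;
    if (strcmp(mnemonic, "ADD") == 0)
        *word = ((uint32_t)OP_ADD << 24) | (operand & 0xFFFFFF);
    else if (strcmp(mnemonic, "LOAD") == 0)
        *word = ((uint32_t)OP_LOAD << 24) | (operand & 0xFFFFFF);
    else if (strcmp(mnemonic, "HALT") == 0)
        *word = (uint32_t)OP_HALT << 24;
    else
        return 0;
    return 1;
}

int main(void)
{
    const char *program[] = { "LOAD 5", "ADD 100", "HALT" };
    for (size_t i = 0; i < sizeof program / sizeof program[0]; i++) {
        uint32_t word;
        if (assemble_line(program[i], &word))
            printf("%s  ->  0x%08X\n", program[i], word);
    }
    return 0;
}

The very first such translator could not be written this way, of course; it had to be hand-coded directly in the machine's own binary instructions, which is exactly the bootstrapping problem being discussed.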
 
And incidentally, FORTRAN (1957) was not the first attempt to program above the level of raw machine code. Grace Murray Hopper had already built the A-0 compiler and the English-like FLOW-MATIC language earlier in the 1950s, and her work later fed into COBOL. So the concept of compiled languages like FORTRAN owes a great deal to her.
 
Thank you very much for the information.
But still I wonder about
the first compiler (or similar "language" tool) for any computer has to be written in that computer's native machine instruction set.
What exactly is the native machine instruction set? Does that mean we have to build a computer in such a way that it recognizes a specific syntax? If yes, how can we build that? If no, does that mean we have to "change" what the computer does in order to make it work as a compiler? If yes, how can that be done?
And yes, I understand that when we already have a compiler and a language, it's easier to "invent" another compiler and language.
 
fluidistic said:
Thank you very much for the information.
But still I wonder: what exactly is the native machine instruction set?

All computers have a native instruction set that their CPUs are "hard-wired" to recognize. In its most fundamental form, an instruction is a string of 0's and 1's (binary bits).

For example (just to make up a hypothetical instruction format on the spot), an instruction might consist of 32 bits, of which the first eight specify the particular operation (add, multiply, jump, etc.), the next four are "modifiers" that specify variations on the basic operation, and the remaining twenty bits are the address (in RAM) of the data that the operation is to act on. Different models of CPU chips have different instruction sets.
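To make that hypothetical format concrete, here is a short sketch in C (the field widths are the made-up ones above, not any real CPU's) showing how such a 32-bit instruction could be split into its opcode, modifier, and address fields with shifts and bit masks:

Code:
#include <stdio.h>
#include <stdint.h>
#include <inttypes.h>

int main(void)
{
    /* Hypothetical 32-bit instruction: 8-bit opcode, 4-bit modifier,
       20-bit RAM address -- the invented layout described above. */
    uint32_t instruction = 0x12A3456F;

    uint32_t opcode   = (instruction >> 24) & 0xFF;   /* top 8 bits  */
    uint32_t modifier = (instruction >> 20) & 0xF;    /* next 4 bits */
    uint32_t address  = instruction & 0xFFFFF;        /* low 20 bits */

    printf("opcode=0x%02" PRIX32 " modifier=0x%" PRIX32
           " address=0x%05" PRIX32 "\n", opcode, modifier, address);
    return 0;
}

In hardware, the CPU's decoding circuitry does this splitting directly on the bits of the fetched instruction; nothing about it involves text or syntax at all.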

When I was an undergraduate in the early 1970s, I played with a computer that could be programmed by setting a row of switches (on the front panel), which represented either a machine instruction or a RAM address, in binary notation. To program it, I first set the switches to represent the RAM address where I wanted to put the first instruction, then pressed a button to tell the machine to use that address. Then I set the switches to represent the actual instruction, and pressed another button to load the instruction at that address. I think it automatically incremented the address by one, so I could enter the instructions one after the other without having to set the address explicitly each time. When I was done, I set the address switches to the address of the first instruction and pressed the "run" button.

Normally, I ran programs that had already been punched onto paper tape in a binary code, by reading them in with a paper tape reader. But before I could do that, I had to load a short program to drive the paper tape reader, by using the switches.

(For any fellow grey-beards who happen to read this, the computer was a Digital Equipment PDP-5, if I remember correctly.)
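For what it's worth, here is a tiny sketch of the front-panel procedure described above, modelled in modern C. The memory size, addresses, and "switch" values are all invented for illustration; only the deposit-then-auto-increment-then-run pattern reflects what the panel actually did.

Code:
#include <stdio.h>
#include <stdint.h>

#define MEM_SIZE 4096          /* invented memory size for illustration */

static uint16_t memory[MEM_SIZE];

int main(void)
{
    /* "Set the address switches, press LOAD ADDRESS" */
    uint16_t address = 0200;   /* arbitrary starting address (octal) */

    /* "Set the data switches, press DEPOSIT" -- each deposit stores the
       switch value and advances the address by one, as on the panel. */
    uint16_t switch_settings[] = { 07300, 01234, 03055, 07402 }; /* made-up words */
    size_t n = sizeof switch_settings / sizeof switch_settings[0];

    for (size_t i = 0; i < n; i++)
        memory[address++] = switch_settings[i];

    /* "Set the address switches back to the start, press RUN" -- here we
       just dump the deposited words instead of executing them. */
    for (size_t a = 0200; a < 0200 + n; a++)
        printf("%04o: %04o\n", (unsigned)a, (unsigned)memory[a]);

    return 0;
}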
 
Thanks jtbell, I found what you wrote very interesting.
 