Understanding Computer Languages: A Layman's Guide

AI Thread Summary
Boolean algebra serves as a foundational concept in computer science, underpinning the logical operations performed by computers. While it is not the direct basis for all programming languages, it is integral to the operation of electronic devices, allowing for the execution of arithmetic and logical tasks. Computer languages function by providing a set of symbols and instructions that a computer can interpret, translating high-level code into machine-readable binary instructions through processes like assembly and compilation. This translation allows programmers to write more intuitive code without needing to understand the intricate details of the computer's architecture. The discussion also highlights the evolution of programming from manual input methods to modern high-level languages, emphasizing the role of logical gates and instruction sets in enabling computers to process and execute commands efficiently.
jackson6612
I'm not a computer science or science student - quite a layman with some basic education in these areas. Please keep your replies simple so that I can fully appreciate your help. Thanks.

I have quite a few questions about computer languages. I'll ask them in steps.

Is it true that Boolean algebra is the basis for all computer languages?

How does a computer language work in general? A human learns language from society and has organs to articulate it. A computer language, I suppose, is a set of symbols, just as words of natural language are for humans. But a computer is built from electronic devices - how does it understand those arbitrary symbols? Is there some analogy?

Thank you for your help and time.
 
All computer languages I can think of need Boolean algebra to function, but I wouldn't say it forms their basis. A language is a way to write instructions to a computer: perform a calculation; draw something; tell a connected device to do something (output), possibly depending on another connected device (input).

A simple analogy would be a food recipe - it provides instructions for making a dish. In this case the cook is the computer "executing" the provided steps. Pushing the analogy a little, you could say that you transform the ingredients (data) using tools (software libraries and functions), and the result (output) should be predictably the same every time if you follow those instructions carefully.

Hope that helps :-)

-S
 
jackson6612 said:
Is it true that Boolean algebra is the basis for all computer languages? [...] How does a computer language work in general?


The Wikipedia intro article is pretty good:

http://en.wikipedia.org/wiki/Computer_languages

 
A computer doesn't understand the symbols. It just reacts to them.
Take a look at this very simple machine

It can add binary numbers, but that doesn't mean it understands what numbers are.
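The same point can be sketched in a few lines of Python (an illustration only, not the machine from the link above): XOR produces the sum bit and AND produces the carry bit, yet neither operation "knows" what a number is - it just reacts to the bits it is given.

```python
# Half adder built from two Boolean operations:
# XOR gives the sum bit, AND gives the carry bit.
def half_adder(a: int, b: int) -> tuple:
    """Add two single bits; return (sum, carry)."""
    return a ^ b, a & b

# The "machine" never understands numbers: it only applies
# the same gate rules to whatever bits it is given.
for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", half_adder(a, b))  # 1 1 -> (0, 1) etc.
```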
 
jackson6612 said:
Is it true that Boolean algebra is the basis for all computer languages? [...] How does a computer language work in general?


Hey there Jackson.

Boolean algebra is the basis behind computation. When you arrange transistors in certain configurations, you can build logical devices that perform Boolean operations. You can then combine these to carry out all your arithmetic operations and other computational operations.
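As a sketch of that claim, here is arithmetic built from nothing but Boolean operations in Python (a toy example, not how any real CPU is wired): a full adder made of AND, OR, and XOR, chained into a ripple-carry adder.

```python
# A full adder composed purely of Boolean operations (AND, OR, XOR),
# then chained into a ripple-carry adder: arithmetic from logic alone.
def full_adder(a, b, carry_in):
    s = a ^ b ^ carry_in
    carry_out = (a & b) | (carry_in & (a ^ b))
    return s, carry_out

def add_bits(x_bits, y_bits):
    """Add two equal-length bit lists (least significant bit first)."""
    carry = 0
    out = []
    for a, b in zip(x_bits, y_bits):
        s, carry = full_adder(a, b, carry)
        out.append(s)
    out.append(carry)  # final carry becomes the top bit
    return out

# 3 is [1, 1, 0] LSB-first, 5 is [1, 0, 1]; their sum 8 is [0, 0, 0, 1].
print(add_bits([1, 1, 0], [1, 0, 1]))  # [0, 0, 0, 1]
```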

Typically what happens is that you have what is called a platform. An example is x86, the platform used in the majority of PCs.

The platform has a specific architecture and a specific instruction set. The hardware recognizes binary machine code and executes it directly.

Now the first step above binary code is assembler code. Instead of having to write the code using ones and zeros, you write code that has a symbol (mnemonic) for each instruction, labels for positions in code, and constant numbers or strings (text), and the assembler turns your code into the binary form the computer executes.
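A toy assembler can show that mapping directly (the mnemonics and opcode numbers below are invented for illustration, not from any real instruction set): each symbolic line becomes the raw numbers the machine would run.

```python
# A toy assembler: each mnemonic maps to a numeric opcode, and the
# assembler turns symbolic lines into raw numbers for the machine.
# This instruction set is invented purely for illustration.
OPCODES = {"LOAD": 0x01, "ADD": 0x02, "STORE": 0x03, "HALT": 0xFF}

def assemble(lines):
    code = []
    for line in lines:
        parts = line.split()
        code.append(OPCODES[parts[0]])          # opcode byte
        code.extend(int(p) for p in parts[1:])  # operand bytes
    return code

program = ["LOAD 10", "ADD 20", "STORE 30", "HALT"]
print(assemble(program))  # [1, 10, 2, 20, 3, 30, 255]
```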

A higher-level language like C/C++ or BASIC lets you write code that's easier to read and understand. In assembler you write one assembler instruction for every machine instruction; with languages like C/C++ and BASIC you can write more intuitive statements, and the compiler turns them into the machine code that the CPU can recognise.
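You can watch this kind of translation happen in Python, whose built-in `dis` module prints the lower-level bytecode (an analogue of machine code) that one intuitive line compiles to. The exact instruction names vary between Python versions, so this is an illustration rather than a fixed listing.

```python
# One readable line of source becomes several lower-level
# bytecode instructions once the compiler is done with it.
import dis

def area(w, h):
    return w * h + 2

dis.dis(area)  # prints the bytecode instructions for `area`
```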

I don't know if this answers your question, but at the lowest level everything on the CPU is done using logic gates, because logic gates are used for everything from arithmetic to decision making.
 
jackson6612 said:
Is this true Boolean algebra is basis for all the languages?
Boolean logic is the base logic used to create the lowest level components of a computer, such as the arithmetic component, which performs math operations like add, subtract, multiply, divide. Languages for computers don't directly deal with this boolean logic, but instead with the instruction set and operands (data) that a particular computer uses.
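A toy sketch of "instruction set plus operands" in Python (the opcodes below are invented for illustration): a loop fetches each opcode and reacts to it, which is all that "executing" means at this level.

```python
# A minimal fetch-decode-execute loop: the "CPU" just reacts to opcodes.
# Invented instruction set: 1 n -> push n, 2 -> add top two, 0 -> halt.
def run(code):
    pc, stack = 0, []          # program counter and a value stack
    while True:
        op = code[pc]          # fetch
        if op == 1:            # decode + execute: push operand
            stack.append(code[pc + 1]); pc += 2
        elif op == 2:          # add the two topmost values
            b, a = stack.pop(), stack.pop()
            stack.append(a + b); pc += 1
        elif op == 0:          # halt
            return stack
        else:
            raise ValueError(f"unknown opcode {op}")

print(run([1, 2, 1, 3, 2, 0]))  # push 2, push 3, add, halt -> [5]
```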

How does a computer language function in general?
A computer language is translated into the native instructions and data (operands) used by a particular computer. It allows a person to specify a sequence of logical steps and define the data, inputs, and outputs to be used, without having to get into the lowest level of computer specific details.

The oldest computers had to be manually wired to perform a specific set of operations. On some old computers, a person could enter a program by toggling switches representing 0s and 1s to enter commands and data into the computer's memory using a "front panel". Some types of computers used "plug boards" for programming, where the plug board was programmed by plugging in a large number of wires:

http://en.wikipedia.org/wiki/Plugboard

For machine level programming, a very low level assembler could read some form of human input, via paper tape or punched cards, allowing an operation code to be specified as a few letters and data operands to be written in hex or octal. The person was burdened with having to determine the locations in memory for instructions and data, and had to rewrite the program if instructions or data were moved around.

The next step up in assembler languages allowed a group of letters to be used as names for data or instruction locations (labels); the assembler assigns the actual addresses or offsets when it converts the assembly language program into machine language. This allowed instructions and data to be moved without having to rewrite programs.
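That label-resolution step is the classic two-pass scheme, sketched below in Python (mnemonics, opcodes, and the one-byte-per-token sizing are all invented for illustration): the first pass records each label's address, the second emits code with labels replaced by those addresses, so moving code around needs no rewrite.

```python
# A toy two-pass assembler. Pass 1 assigns an address to every label;
# pass 2 emits opcodes, resolving label operands to those addresses.
# The instruction set and sizing rules are invented for illustration.
OPCODES = {"JMP": 0x10, "NOP": 0x00, "HALT": 0xFF}

def two_pass_assemble(lines):
    labels, addr = {}, 0
    # Pass 1: record where each label will land.
    for line in lines:
        if line.endswith(":"):
            labels[line[:-1]] = addr
        else:
            addr += len(line.split())  # one byte per token here
    # Pass 2: emit code, swapping labels for their addresses.
    code = []
    for line in lines:
        if line.endswith(":"):
            continue
        parts = line.split()
        code.append(OPCODES[parts[0]])
        code.extend(labels.get(p, p) for p in parts[1:])
    return code

prog = ["start:", "NOP", "JMP end", "end:", "HALT"]
print(two_pass_assemble(prog))  # [0, 16, 3, 255]
```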

More advanced assemblers include macros, where a single line of macro code could be expanded into multiple machine language instructions. For IBM 360 based mainframes, the simple assembler was called BAL (Basic Assembly Language), and the macro version called ALC (assembly language compiler). Generally there was a library of common macros provided by the computer maker so that programmers wouldn't have to recreate their own set for common operations.

Compiled languages were the next step. Here a single line of code may consist of a mathematical expression that results in many machine level instructions.

The next step up from this is code generators, which let a person draw a user interface using the tools the generator provides, typically by "drag and drop". For example, a person selects a "dialog box" and drags it into the desired location on what will become the user interface for the program. Once this is done, the code generator creates the code a compiler uses to produce the user-interface part of the program. The person then adds the non-interface code between the generated sections to complete the program. The code generator can tell the difference between code it generated and code added by the programmer, so updating the user interface and/or non-interface code is not too difficult.
 
Computers are really fast, deterministic (0 and 1, false and true) calculators.

Computer science is largely about designing ways to do things with that speed.
 