I want to make a point about computer languages. Sometimes it is more effective to show some examples first, then present the concept.
This is a list of dialects of BASIC (the computer language) -- take a peek, it goes on forever...
See:
https://en.wikipedia.org/wiki/List_of_BASIC_dialects
Way too many. Most of these implementations are moribund because they were or are highly specific to particular devices or operating systems, or had limited use. Unless you are an old guy like me, most of this list is essentially meaningless.
So, why is the list so very long? One reason:
This is the sad tale of a pretty good BASIC compiler for DOS and Windows:
https://en.wikipedia.org/wiki/PowerBASIC
So how many languages have at least some modern currency? Maybe ~800, assuming Rosetta Code's coverage is a reasonable guide.
See:
http://www.rosettacode.org/wiki/Rosetta_Code
BTW:
This site is great for finding algorithms in a language - try searching for 'bubble sort'.
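For a taste of what you will find there, here is my own minimal C sketch of that classic algorithm (not a verbatim Rosetta Code entry): repeatedly sweep the array, swapping adjacent out-of-order pairs, until a sweep makes no swaps.

#include <stdio.h>

/* Classic bubble sort: sweep, swap adjacent out-of-order pairs,
   stop when a full sweep makes no swaps. */
void bubble_sort(int a[], int n)
{
    int swapped = 1;
    while (swapped) {
        swapped = 0;
        for (int i = 1; i < n; i++) {
            if (a[i - 1] > a[i]) {
                int t = a[i - 1];
                a[i - 1] = a[i];
                a[i] = t;
                swapped = 1;
            }
        }
        n--;   /* the largest element has bubbled to the end */
    }
}

int main(void)
{
    int a[] = {5, 1, 4, 2, 8};
    bubble_sort(a, 5);
    for (int i = 0; i < 5; i++)
        printf("%d ", a[i]);            /* prints: 1 2 4 5 8 */
    printf("\n");
    return 0;
}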
Other than for specialty machines and devices, more general usage is controlled by whoever runs the language for business or research and is therefore invested in it long term. Such an entity cannot easily keep operating a business or continuing research if the language hits a failure point because of a language bug.
Do you think that a 'customer profiling language' written by Joe Btfsplk##^{13}## is going to be a mainstay at the Bank of England? No.
Why not? Institutions like that generally prefer languages such as COBOL that have ANSI specifications and certified implementations. Joe cannot afford to do that.
COBOL is VERY old. The first COBOL program ran on 17 August 1960 on an RCA 501. It is not going away.
Some languages, like Ruby, have 'crowd-sourced' specifications, test suites, and certifications. Joe could afford that. All he has to do is get several hundred really bright people to love his product -- and drop whatever they are currently working on.
Computer Science curricula all seem to require one or more of these projects:
1. Write a shell - like bash or the DOS command interpreter (a bare-bones sketch follows this list)
2. Write a compiled or an interpreted language. Google 'Small-C' for a classic example.
3. Write a device driver
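To make item 1 concrete, here is a bare-bones sketch in C of the read/split/fork/exec/wait loop at the core of every student shell. This is illustrative only -- no pipes, no quoting, no job control -- and the prompt name is made up:

/* A minimal shell: read a line, split it into words,
   fork, exec, wait.  Just the skeleton every student
   version starts from. */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <sys/wait.h>
#include <unistd.h>

int main(void)
{
    char line[1024];
    char *argv[64];

    for (;;) {
        printf("mysh> ");               /* hypothetical prompt */
        fflush(stdout);
        if (fgets(line, sizeof line, stdin) == NULL)
            break;                      /* EOF (Ctrl-D) exits */

        /* split on whitespace into an argv[] array */
        int argc = 0;
        for (char *tok = strtok(line, " \t\n");
             tok != NULL && argc < 63;
             tok = strtok(NULL, " \t\n"))
            argv[argc++] = tok;
        argv[argc] = NULL;
        if (argc == 0)
            continue;                   /* blank line */

        if (strcmp(argv[0], "exit") == 0)
            break;                      /* the one built-in */

        pid_t pid = fork();
        if (pid == 0) {                 /* child: become the command */
            execvp(argv[0], argv);
            perror(argv[0]);            /* reached only if exec fails */
            _exit(127);
        } else if (pid > 0) {
            waitpid(pid, NULL, 0);      /* parent: wait for it */
        } else {
            perror("fork");
        }
    }
    return 0;
}

Everything past this skeleton -- pipes, redirection, history, job control -- is where the real work, and the thousands of variants, come from.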
I was a UNIX advisor for 15 years. An awful lot of 'how do I start on 1, 2, or 3?' questions appeared every year.
What this does is create thousands of new programs every year, like snowflakes in a Minnesota storm.
So, my point: new languages may fly for a while, but a large number get discarded, especially as hardware limitations get pushed further out. The other factor is choice: there is research indicating that excessive choice impairs good decision making.
G. A. Haynes, 'Testing the boundaries of the choice overload phenomenon: The effect of number of options and time pressure on decision difficulty and satisfaction,' Psychology & Marketing, 2009 (psycnet.apa.org).
Abstract:
The number of alternatives available to people in many day-to-day decisions has greatly increased in Western societies. The present research sought to build upon recent research suggesting that having large numbers of alternatives can have negative results…
##13## "Li'l Abner" by Al Capp ... and now you know why I used 13...