Computer languages tend to be transient

  • #1
jim mcnamara
Mentor
4,015
2,452
I want to make a point about computer languages. Sometimes it is more effective to show some examples, then present the concept.

This is a list of dialects of BASIC (the computer language) -- take a peek, it goes on forever....
See:
https://en.wikipedia.org/wiki/List_of_BASIC_dialects

Way too many. Most of these implementations are moribund because they were or are highly specific to particular devices or operating systems, or had limited use. Unless you are an old guy like me, most of this list is essentially meaningless.

So, why is the list so very long? One reason:
This is the sad tale of a pretty good BASIC compiler for DOS and Windows:
https://en.wikipedia.org/wiki/PowerBASIC

So how many languages have at least some modern currency? ~800 maybe, assuming Rosetta Code has any meaning.
See:
http://www.rosettacode.org/wiki/Rosetta_Code
BTW:
This site is great for finding algorithms in a language - try searching for 'bubble sort'.
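
For a taste, here is roughly what such an entry looks like; this is my own minimal Java sketch, not copied from Rosetta Code:

Java:
// Minimal bubble sort sketch (illustrative only, not taken from Rosetta Code).
public class BubbleSort {
    // Repeatedly sweep the array, swapping adjacent out-of-order pairs,
    // until a full pass makes no swaps.
    static void bubbleSort(int[] a) {
        boolean swapped = true;
        for (int n = a.length; swapped; n--) {
            swapped = false;
            for (int i = 1; i < n; i++) {
                if (a[i - 1] > a[i]) {
                    int tmp = a[i - 1];
                    a[i - 1] = a[i];
                    a[i] = tmp;
                    swapped = true;
                }
            }
        }
    }

    public static void main(String[] args) {
        int[] data = {5, 1, 4, 2, 8};
        bubbleSort(data);
        System.out.println(java.util.Arrays.toString(data)); // prints [1, 2, 4, 5, 8]
    }
}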

Other than for specialty machines and devices, more general usage is controlled by whoever runs the language for business or research and is therefore invested in it long term. That entity will not be able to operate a business or continue research easily if the language hits a failure point because of a language bug.

Do you think that a 'customer profiling language' written by Joe Btfsplk##^{13}## is going to be a mainstay at the Bank of England? No.

Why? Generally they prefer languages like COBOL that have ANSI specifications and have passed certifications. Joe cannot afford to do that.

COBOL is VERY old. The first COBOL program ran on 17 August 1960 on an RCA 501. It is not going away.

Some languages like Ruby have 'crowd sourced' specifications and test suites and certifications. Joe could afford that. All he has to do is get several hundred really bright people to love his product. And drop what they are currently working on.

Computer Science curricula all seem to have a requirement of one or more of these:
1. Write a shell - like bash or DOS
2. Write a compiler or an interpreter for a small language. Google for 'small C'
3. Write a device driver
I was an advisor on UNIX for 15 years. An awful lot of questions about 'how to start' 1, 2, or 3 appeared every year.

What this does is create thousands of new programs every year. Like snowflakes in Minnesota storms.

So, my point: new languages may fly for a while but a large number get discarded, especially as hardware limitations get moved further out. The other factor is choice. There is research indicating that excessive choice impairs good decision making.

'Testing the boundaries of the choice overload phenomenon: The effect of number of options and time pressure on decision difficulty and satisfaction.'
GA Haynes - Psychology & Marketing, 2009 - psycnet.apa.org
Abstract
The number of alternatives available to people in many day-to-day decisions has greatly
increased in Western societies. The present research sought to build upon recent research
suggesting that having large numbers of alternatives can have negative results…
##13## "Li'l Abner" by Al Capp .. and now you know why I used 13....
 
  • Like
Likes sysprog

Answers and Replies

  • #2
577
292
You seem to have answered all your own questions, @jim mcnamara, so we can close this thread now 🤣

Seriously, I'd guess that the barrier to creating a new language - or forking an existing one - is low, so people like your Joe Btfsplk can readily 'give it a go' (as I did for a money market trading platform while working at a bank early in my career).

If they solve some common problem, or perhaps just surf the wave of a fad, or are promoted by a sufficiently large industry player, they'll catch on and spread. Otherwise, they'll end up on your Wikipedia list of obsolete dialects. But most are not like COBOL; they perish much faster than that!

In fact, there is probably an epidemiological model for computer languages that takes into account the hardware ('host') and attributes such as propagation rate - and others, I can't think of appropriate analogies right now - that would allow you to estimate the lifecycle and possibly how long a language will hang around.
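
Just to make the analogy concrete (purely my own illustration with invented numbers, not an established model): treat programmers who haven't tried the language as 'susceptible', current users as 'infected', and those who have abandoned it as 'recovered', then run the standard SIR equations forward in time.

Java:
// Purely illustrative SIR-style sketch of language adoption and abandonment.
// All numbers (beta, gamma, initial fractions) are invented for demonstration.
public class LanguageSIR {
    public static void main(String[] args) {
        double s = 0.99;      // fraction of programmers who haven't tried the language
        double i = 0.01;      // fraction currently using it
        double r = 0.00;      // fraction who have abandoned it
        double beta = 0.6;    // hypothetical adoption ("propagation") rate per year
        double gamma = 0.2;   // hypothetical abandonment rate per year
        double dt = 0.1;      // time step in years

        for (int step = 0; step <= 300; step++) {        // simulate 30 years
            if (step % 50 == 0) {                        // report every 5 years
                System.out.printf("year %4.1f: users %.3f, ex-users %.3f%n",
                        step * dt, i, r);
            }
            double ds = -beta * s * i;                   // new adopters leave S
            double di = beta * s * i - gamma * i;        // net change in current users
            double dr = gamma * i;                       // abandoners accumulate in R
            s += ds * dt;
            i += di * dt;
            r += dr * dt;
        }
    }
}

With these made-up rates the 'user' fraction rises, peaks, and then decays as abandonment overtakes adoption - roughly the lifecycle curve I'm describing.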
 
  • #4
12,025
5,677
In my search for the "perfect" computer language, I've visited:
- Fortran Y, a precursor to Fortran 77 that featured the character datatype
- COBOL
- GE-6000 Macro Assembler (GMAP)
- TEX (one of my favorites, so much flexibility)
- GE TSS Basic, then TRS-80 Basic, then Commodore Basic
- PASCAL (Comp Sci course)
- LISP via Franz Lisp (Comp Sci course on AI)
- C, then CFRONT (C++ front end to C), then C/C++ via Turbo C/C++
- AWK (one of my favorites)
- tried Perl (quirky syntax, more powerful REGEX), early Python 2 (didn't like the tabbing) and Ruby
- IBM REXX, fun to program in
- Prolog (a paradigm different from procedural, more descriptive) via Turbo Prolog
- Forth (cryptic like Lisp, but without the parentheses)
- SQL (one statement could be a whole program; one guy wrote SQL that wrote more SQL to run)
- C/C++ on OS/2 and AIX on Taligent (painful, lacking an IDE to discover callable methods)
- Java
- Jython, Python on Java with access to all the Java libraries
- JavaScript for web pages and servers via Node.js
- Groovy, a scripting language that is a better Java than Java
- Elm, interesting but seemingly limited to the web only
- Scala, extended Java but always seemed to change with each version
- Clojure scripting, basically Lisp on Java with access to all the Java libraries (didn't really catch my interest; the paradigm is more restrictive and yet flexible too)
- Processing, a Java IDE for prototyping graphical ideas
- Python 2/3 via the Anaconda distribution and PyCharm IDE (easy to program, flexible, but slow and bulky); the language of choice for ML projects
- Go, basically C/C++ reimagined by the original users of C from Bell Labs
- Matlab, a very interesting paradigm
- Julia, the language of the future, i.e. it could well replace Matlab and is making inroads into Python ML work
- Kotlin, as a better Java than Java

Of these languages so far:
- TEX, the first language that was really different to me. It was a language built on a line editor command set and could call or goto across TEX programs. All variables were global. But it only worked on a Honeywell 6000 via a TSS session.
- I turn to AWK for one-off programs or prototypes, then may convert the AWK to Python as it gets more complex
- Prolog for AI, over Lisp
- REXX: scripting like Python, but it worked on IBM VM systems and not as well on PC DOS systems
- Java for everyday work via the NetBeans IDE: a solid environment, good IDE, rather boring now
- Processing (the Java IDE): just a magical experience of drawing cool interactive pictures.
- Python for my utility programs (sometimes migrated from AWK)
- Julia for ML and math programming
- Go for utility programming now (compiles to binaries available on many platforms)
- Kotlin to make Java work more interesting, if only the team would agree

It seems my interest was in finding the best paradigm. TEX had the editor command set, AWK had the REGEX, Prolog the descriptive paradigm, REXX the simplicity of coding, Java the power of true cross-platform programming, the Processing IDE the fun factor, Python 3 even simpler coding, Julia powerful general-purpose math programming, and now Go gets back to the basics of C programming once again.

And the winner is ... (stay tuned, as I haven't retired yet).
 
  • #5
1,649
993
So, my point: new languages may fly for a while but a large number get discarded, especially as hardware limitations get moved further out.
As I see it, it's a bit like biology/evolution. Species 'creation' through big jumps is not something that successfully happens very often: usually slow evolution around the old basis is the way.

And then when a new land is opened up, species start to boom - like the internet, neural networks and so on, bringing forth new, specialized languages :wink:
 
  • Like
Likes jedishrfu
  • #6
12,025
5,677
Yeah, as I remember it, the driving force is usually a young programmer who is dissatisfied with the current crop of languages and forges a "new", better language.

As an example:

SNOBOL for text processing
AWK for better text processing
PERL because the author didn't like AWK limitations
PYTHON because the author didn't like PERL
RUBY because the author didn't like PYTHON and tabbing
...

or

Smalltalk for OO programming
Objective-C giving C OO features
C++ because C was good for systems programming, OO gave it more power, and the author didn't like Objective-C
JAVA because the authors didn't like C/C++ multiple inheritance
GROOVY and SCALA because JAVA was too verbose and not flexible enough
and now KOTLIN as a truly better JAVA than JAVA

or

FORTRAN for engineering problems
APL for array processing and brevity via mathlike symbols
MATLAB because FORTRAN didn't do vector processing and APL was too arcane and needed its own keyboard
JULIA because MATLAB is proprietary and costs way too much for companies but it's so darn useful

Hence the quote from George Bernard Shaw:

“The reasonable man adapts himself to the world: the unreasonable one persists in trying to adapt the world to himself. Therefore all progress depends on the unreasonable man.”

― George Bernard Shaw, Man and Superman
 
  • Like
Likes Klystron and anorlunda
  • #7
anorlunda
Staff Emeritus
Insights Author
8,711
5,601
When the Defense Department announced the Ada language, they said (my paraphrase) "In the history of DOD every project invented a new programming language. The exception was Jovial which was used on exactly two projects. Ada is intended to be the last word in computer languages."

Well, Ada did and does exist, but that public statement sounds very foolish in retrospect.

Even more persistent are new threads like this one offering the opportunity for programmers to talk about which languages they like or don't like. For example, my two cents worth.
  • If I were forced to read someone else's source code, Pascal was a marvel for clarity of intent, even without the comments.
  • I wrote more FORTRAN than anything else, and I even contributed to the Fortran 77 ANSI standard, but I hated it every step along the way. IMO, FORTRAN sucks. Most of the FORTRAN programs I wrote would have been much better in C, but FORTRAN was mandated.
  • The most fun programming I ever did was Robot Wars, using the Robot Wars "machine language". I competed in Robot Wars tournaments where contestants mailed in their programs, and the winner was the last surviving robot.
  • Another fun experience was a training simulator on the GEPAC4020 process computer. The models were written and initially tested in FORTRAN floating point, hand compiled into fixed point assembler, then debugged, tuned and patched in binary.
  • My most inspiring programming experience was reading Bjarne Stroustrup's book The C++ Programming Language, 1.0. Not writing C++, but reading that book.
 
  • #8
Vanadium 50
Staff Emeritus
Science Advisor
Education Advisor
2019 Award
25,141
8,254
I'm going to disagree with the title's premise. The first high-level language was FORTRAN, from 1957. Sixty-three years later, it's still in use. More amazingly, it was developed for a machine capable of roughly 12,000 flops, and will run on the first exaflop machines when they appear next year: a factor of more than 80 trillion in performance.

COBOL is still around, and as mentioned, isn't going anywhere.

BASIC is a special case. The B stands for "Beginners". The intent is for people to move on. Many of the 800 are attempts to "fix" BASIC so people don't have to move on. Removing some of the idiosyncrasies - like the confusing combination of text editor and interpreter - and adopting a more structured approach turns it into Pascal with BASIC keywords.

One might make the argument that there is not much desire for that. Clearly ~800 people feel there should be a desire for that.

C is a mere 48 years old. A whippersnapper. And SQL? Just a kid at 46.
 
  • Like
Likes Klystron, phinds and jim mcnamara
  • #9
anorlunda
Staff Emeritus
Insights Author
8,711
5,601
like the confusing combination of text editor and interpreter
Isn't that what we call an IDE, an integrated development environment? For a short while in the 90s, Visual Basic had an IDE so superior that it made the choice of underlying language relatively unimportant. Then other IDEs for other languages caught up.
 
  • #10
Vanadium 50
Staff Emeritus
Science Advisor
Education Advisor
2019 Award
25,141
8,254
One could argue BASIC was an early attempt at an IDE, but the idea of mingling editing line numbers with execution line numbers was perhaps not the best idea the field ever had. As a side effect, it encouraged GOTO as a control structure when it is not really appropriate.
 
  • #11
Lots of good information on this thread. As someone who was a computer science professor for a few years, I found that students mostly want the language that is currently most popular. That's the difference between a coder and a computer scientist. A coder wants a quick class in the most popular language and then will need to be retrained every few years. A computer science major wants to learn as many languages as possible, whether popular or not. And job interviews emphasize data structures and usually let you code for the interview in whatever language you want.

I would always start my classes with this information and tell my students on the first day that I make my language and data structure classes as hard as I can. The rest of my classes were moderate. On the second day of class I would always have 10% less enrollment. The MFT (Major Field Test) in computer science is a good measure of a graduating student's national standing.
 
  • #12
Filip Larsen
Gold Member
1,274
209
An oldie but goldie that seems to apply well here too (using s/standard/programming language/gi):

[Attached image: standards.png]
 
  • Haha
  • Like
Likes Keith_McClary, DaveE, vela and 3 others
  • #13
242
99
To reinforce some earlier posters: while studying for a Computer Science major, it is not uncommon to be asked to define at least one or a few computer languages (also file systems, process scheduling algorithms, etc.) as a teaching tool. The success rate in the wider world is abysmal.
 
  • #14
503
197
That's the difference between a coder and a computer scientist.
Don't forget that "coding" used to mean punching up the cards/tape, not "programming". I suspect the modern definition is a jibe at the cut'n'paste section of the industry.
 
  • #15
33,981
5,639
Don't forget that "coding" used to mean punching up the cards/tape, not "programming". I suspect the modern definition is a jibe at the cut'n'paste section of the industry.
I don't think so. I've done programming using a keypunch machine, and I've done coding on a computer screen. IMO, the two terms are more-or-less synonymous, with nothing to do with cut-and-paste.
 
  • Like
Likes Klystron and jedishrfu
  • #17
George Jones
Staff Emeritus
Science Advisor
Gold Member
7,385
1,005
Lots of good information on this thread. As someone who was a computer science professor for a few years, I found that students mostly want the language that is currently most popular.
Interesting video on the evolving popularity of computer languages:

 
  • Like
Likes Tom.G, Wrichik Basu, Filip Larsen and 1 other person
  • #18
phinds
Science Advisor
Insights Author
Gold Member
2019 Award
16,325
6,506
WOW! That is one cool video. Thanks for posting.
 
  • Like
Likes jedishrfu
  • #19
503
197
I don't think so. I've done programming using a keypunch machine, and I've done coding on a computer screen. IMO, the two terms are more-or-less synonymous, with nothing to do with cut-and-paste.
My bad; my brain tends to drop bits these days.

My point was that you actually didn't have to be able to type in order to be a programmer.
 
  • Haha
Likes jedishrfu
  • #20
33,981
5,639
My point was that you actually didn't have to be able to type in order to be a programmer.
Punching holes in cards was done with a keypunch machine, which had a standard typewriter keyboard, so of course you had to type in order to write programs. I took two programming classes, the first back in '72 and another in '80. Both used computer systems that took IBM (or Hollerith) cards as input. The way it worked was that several program decks ("jobs") were collected and fed into a card reader, which transcribed the contents onto a big tape reel. The tape reel was then mounted on the mainframe, and the jobs were compiled and run.

Here's a picture of a keypunch machine. The ones I used looked pretty much like this. BTW, the first class used PL/C, which seems to have gone by the wayside, but the second class used Fortran 77, which in newer versions is still very much alive.

[Attached image: keypunch-machine.jpg]
 
  • Like
Likes Wrichik Basu
  • #21
DaveE
Gold Member
791
596
Oh, I just had a flashback to an argument I (EE) had with the CS guy over a real-world HW product UI/controller. He wanted to use Smalltalk on OS/2, because it was cool at the time and he hated Microsoft. I wanted ANSI C/C++ on Windows, because I wanted a good, supportable product. He won, then the product died an early death.
 
  • Haha
Likes jedishrfu
  • #22
phinds
Science Advisor
Insights Author
Gold Member
2019 Award
16,325
6,506
My point was that you actually didn't have to be able to type in order to be a programmer.
HUH ??? Perhaps you mean that you don't need to be a touch typist but surely you can't really believe that you can, or ever could, write computer code without typing?
 
  • Haha
Likes jedishrfu
  • #23
237
481
HUH ??? Perhaps you mean that you don't need to be a touch typist but surely you can't really believe that you can write computer code without typing?
Hunt'n'peck with lots of copy'n'paste.

Perhaps filling out these was "coding"?
[Attached image: FortranCodingForm.png]
 
  • #24
1,488
1,339
Java for everyday work via the NetBeans IDE: a solid environment, good IDE, rather boring now
Back in school, we learnt Java for four years. In the last year, we had to submit a huge project consisting of thirty programs (I submitted a total of 310 pages). In school, we used BlueJ, which was great because other schools were using Notepad. At home, I wrote all my programs in NetBeans. It was a wonderful experience back then - I could easily look at the documentation, and sometimes even peek at the full source code itself.

When I joined college, I wrote my first GUI application in Java Swing using NetBeans. The GUI was not complicated, but the programming was, because I had to work with dates and times, and native Java, as we know, does not provide a date picker or time picker like Android. So I had to design all that using drop-down menus. It was a tedious task, but during that project I learnt to use the Java 8 time libraries.
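
For anyone who hasn't met them, the java.time classes make that kind of glue code fairly painless. A minimal sketch; the values below are just hard-coded stand-ins for whatever the Swing drop-downs would return:

Java:
// Minimal java.time sketch: combining separate "drop-down" selections into a
// LocalDateTime and formatting it. The ints stand in for JComboBox values.
import java.time.LocalDate;
import java.time.LocalDateTime;
import java.time.LocalTime;
import java.time.format.DateTimeFormatter;

public class DateTimeFromDropdowns {
    public static void main(String[] args) {
        int year = 2020, month = 6, day = 15;   // from the date drop-downs
        int hour = 14, minute = 30;             // from the time drop-downs

        // LocalDate.of and LocalTime.of validate their arguments and throw
        // DateTimeException for impossible selections such as 31 February.
        LocalDate date = LocalDate.of(year, month, day);
        LocalTime time = LocalTime.of(hour, minute);
        LocalDateTime when = LocalDateTime.of(date, time);

        DateTimeFormatter fmt = DateTimeFormatter.ofPattern("EEE, dd MMM yyyy HH:mm");
        System.out.println(fmt.format(when));   // e.g. Mon, 15 Jun 2020 14:30
    }
}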

Then I got interested in Android. It took some time to actually start programming for Android, because our 32-bit desktop was unable to run Android Studio, and I had to wait for my laptop. Soon it was time to learn how to manage a project with Gradle. Especially when one is using third-party libraries, Gradle is the way to go, because it is too time-consuming (albeit not impossible) to manage libraries manually. And then I found how easy coding was with the IDEs released by JetBrains - type one or two letters and you get a whole list of suggestions. It really reduced the coding time. Then I discovered IntelliJ IDEA. Although I don't do too much coding in native Java nowadays, if I have to make an application, I always turn to IntelliJ.

I have plans to learn JavaFX soon, and Gradle will come in handy because the JavaFX libraries are no longer packaged with the JDK nowadays.
 
  • #25
12,025
5,677
The memories you all bring back. I remember transcoding some handwritten notes to punch card sheets so I could print them out via the line printer and on microfiche.

The keypunchers really liked the change of pace, as typing words was easier than typing cryptic programming code (depending on the language, of course) because they didn't need to worry as much about accuracy. It was govt work, aka a personal project, and they helped with the bulk typing work.

The lead keypuncher was amazingly fast and would routinely overload her machine. One day a salesman set up a new machine that he said was a state of the art keypunch machine guaranteed to handle any speed. She sat down to test it and within seconds she had to wait while the machine struggled to finish punching the cards she entered. The salesman was somewhat embarrassed.

I also remember not being a touch typist, yet typing with the classic two-finger pecking and later naturally using a few more fingers in my own typing style. One programmer was very fast on the teletype machine due to his prior job as a court stenographer. It was a time when anyone with the interest and aptitude could be hired as a programmer and trained by the company.
 
Last edited:
  • Like
Likes Keith_McClary and Wrichik Basu
