Computer languages tend to be transient

In summary: Java (tried at work but didn't stick around), Swift (heard good things), Scala (heard good things). Languages come and go, but COBOL is still around and will likely continue to be for some time.
  • #1
jim mcnamara
Mentor
I want to make a point about computer languages. Sometimes it is more effective to show some examples, then present the concept.

This is the list of BASIC (the computer language) dialects -- take a peek, it goes on forever...
See:
https://en.wikipedia.org/wiki/List_of_BASIC_dialects

Way too many. Most of these implementations are moribund because they were or are highly specific to particular devices or operating systems, or had limited use. Unless you are an old guy like me, most of this list is essentially meaningless.

So, why is the list so very long? One reason:
This is the sad tale of a pretty good BASIC compiler for DOS and Windows:
https://en.wikipedia.org/wiki/PowerBASIC

So how many languages have at least some modern currency? ~800 maybe, assuming Rosetta Code has any meaning.
See:
http://www.rosettacode.org/wiki/Rosetta_Code
BTW:
This site is great for finding algorithms in a language - try searching for 'bubble sort'.
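
For instance, here's the general shape of a bubble sort, as a quick Python sketch (my own illustration, not copied from the site):

Code:
def bubble_sort(items):
    """Sort a list in place by repeatedly swapping adjacent out-of-order pairs."""
    n = len(items)
    for i in range(n - 1):
        swapped = False
        for j in range(n - 1 - i):  # the last i items have already bubbled into place
            if items[j] > items[j + 1]:
                items[j], items[j + 1] = items[j + 1], items[j]
                swapped = True
        if not swapped:  # no swaps on this pass means the list is already sorted
            break
    return items

print(bubble_sort([5, 1, 4, 2, 8]))  # [1, 2, 4, 5, 8]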

Other than specialty machines and devices, more general usage is controlled by whoever runs the language for business or research and is therefore invested in it long term. That entity will not be able to operate its business or continue its research easily if the language hits a failure point because of a language bug.

Do you think that a 'customer profiling language' written by Joe Btfsplk##^{13}## is going to be a mainstay at the Bank of England? No.

Why? Generally they prefer languages like COBOL that have ANSI specifications and have passed certifications. Joe cannot afford to do that.

COBOL is VERY old. The first COBOL program ran on 17 August 1960 on an RCA 501. It is not going away.

Some languages like Ruby have 'crowd sourced' specifications and test suites and certifications. Joe could afford that. All he has to do is get several hundred really bright people to love his product. And drop what they are currently working on.

Computer Science curricula all seem to have a requirement of one or more of these:
1. Write a shell - like bash or DOS (see the toy sketch below)
2. Write a compiled language or an interpreted one. Google for 'small C'
3. Write a device driver
I was an advisor on UNIX for 15 years. An awful lot of questions about 'how to start' 1, 2, or 3 appeared every year.
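
To give a flavor of assignment 1, here is a deliberately tiny command loop in Python - purely illustrative, nowhere near a real shell (no pipes, redirection, quoting rules, or job control):

Code:
import shlex
import subprocess

# A toy read-eval loop: prompt, split the line the way a shell would, run the command.
while True:
    try:
        line = input("toysh> ")
    except EOFError:
        break                                   # Ctrl-D exits, like a real shell
    args = shlex.split(line)
    if not args:
        continue                                # ignore empty lines
    if args[0] == "exit":
        break                                   # the only built-in command
    try:
        subprocess.run(args)                    # run the external program and wait
    except FileNotFoundError:
        print(f"toysh: {args[0]}: command not found")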

What this does is create thousands of new programs every year. Like snowflakes in Minnesota storms.

So, my point: new languages may fly for a while but a large number get discarded, especially as hardware limitations get moved further out. The other factor is choice. There is research that indicates excessive choices impair good decision making.

'Testing the boundaries of the choice overload phenomenon: The effect of number of options and time pressure on decision difficulty and satisfaction.'
GA Haynes - Psychology & Marketing, 2009 - psycnet.apa.org
Abstract
The number of alternatives available to people in many day-to-day decisions has greatly
increased in Western societies. The present research sought to build upon recent research
suggesting that having large numbers of alternatives can have negative results…

##13## "Li'l Abner" by Al Capp .. and now you know why I used 13...
 
  • Like
Likes BvU, aaroman and sysprog
  • #2
You seem to have answered all your own questions, @jim mcnamara, so we can close this thread now 🤣

Seriously, I'd guess that the barrier to creating a new language - or forking an existing one - is low, so people like your Joe Btfsplk can readily 'give it a go' (as I did for a money market trading platform while working at a bank early in my career).

If they solve some common problem, or perhaps just surf the wave of a fad, or are promoted by a sufficiently large industry player, they'll catch on and spread. Otherwise, they'll end up on your Wikipedia list of obsolete dialects. But most are not like COBOL; they perish much faster than that!

In fact, there is probably an epidemiological model for computer languages that takes into account the hardware ('host') and attributes such as propagation rate - and others, I can't think of appropriate analogies right now - that would let you estimate a language's lifecycle and how long it will hang around.
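
Just to make that concrete, here is a toy SIR-style sketch in Python - the numbers and the framing are entirely made up, but it captures the "adopt at some propagation rate, abandon at some other rate" idea:

Code:
# Toy epidemiological model of language adoption.
# S = programmers who might adopt, I = active users, R = former users who moved on.
# beta (propagation rate) and gamma (abandonment rate) are invented for illustration.
beta, gamma = 0.4, 0.1
S, I, R = 0.99, 0.01, 0.0
for year in range(30):
    adopters = beta * S * I        # word-of-mouth adoption
    dropouts = gamma * I           # users drifting to the next language
    S, I, R = S - adopters, I + adopters - dropouts, R + dropouts
    print(f"year {year:2d}: active share = {I:.3f}")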
 
  • #4
In my search for the "perfect" computer language, I've visited:
- Fortran Y, a precursor to Fortran 77 that featured the character datatype
- COBOL
- GE-6000 Macro Assembler (GMAP)
- TEX (one of my favorites, so much flexibility)
- GE TSS Basic then TRS-80 Basic then Commodore Basic
- PASCAL (Comp Sci course)
- LISP via Franz Lisp (for a Comp Sci course on AI)
- C then CFRONT(C++ front end to C) then C/C++ via Turbo C/C++
- AWK (one of my favorites)
- tried Perl (quirky syntax, more powerful REGEX), early Python 2 (didn't like tabbing) and Ruby
- IBM REXX fun to program in
- Prolog (paradigm different from procedural more descriptive) via Turbo Prolog
- Forth (like Lisp cryptic but without parentheses)
- SQL (one statement could be a whole program, one guy wrote SQL that wrote more SQL to run)
- C/C++ on OS/2 and AIX on Taligent (painful lacking an IDE to discover callable methods)
- Java
- Jython Python on Java with access to all Java libraries
- Javascript for web pages and servers via node-js
- Groovy scripting language better Java than Java
- Elm interesting but seemed limited to web only
- Scala extended Java but always seemed to change with each version
- Clojure scripting was basically Lisp on Java with access to all of Java libraries ( didn't really catch my interest the paradigm is more restrictive and yet flexible too)
- Processing Java IDE for prototyping graphical ideas
- Python 2/3 via Anaconda distribution and PyCharm IDE (easy to program, flexible but slow and bulky) the language of choice for ML projects
- Go basically C/C++ reimagined by the original users of C from Bell Labs
- Matlab very interesting paradigm
- Julia the language of the future ie could well replace Matlab making inroads into Python ML work
- Kotlin as a better Java than Java

Of these languages so far:
- TEX, the first language that was really different to me. It was built on a line editor command set and could call or goto across TEX programs. All variables were global. But it only worked on a Honeywell 6000 via a TSS session.
- I turn to AWK for one-off programs or prototypes, then may convert the AWK to Python as it gets more complex
- Prolog for AI over Lisp
- REXX scripting like Python but worked on IBM VM systems not as well on PC DOS systems
- Java for everyday work via Netbeans IDE solid environment, good IDE, rather boring now
- Processing Java IDE just a magical experience of drawing cool interactive pictures.
- Python for my utility programs (sometimes migrated from AWK)
- Julia for ML and math programming
- Go for utility programming now (compiles to binaries available on many platforms)
- Kotlin to make Java work more interesting if only the team would agree

It seems my interest was in finding the best paradigm. Tex had the editor command set, Awk had the REGEX, Prolog the descriptive feature, REXX had the simplicity of coding, Java the power of true cross-platform programming, Processing Java IDE for fun, Python 3 better simplicity of coding, Julia for powerful general purpose math programming and now Go getting back to the basics of C programming once again.
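
For the curious, here is the kind of thing I mean by that AWK-to-Python migration - a Python sketch of a typical one-off AWK filter (the pattern and field layout are invented for the example):

Code:
import re
import sys

# Roughly the Python shape of an AWK one-liner like:
#   awk '/ERROR/ { s += $3 } END { print s }'
# i.e. sum the third whitespace-separated field of every line matching a pattern
# (assumes field 3 is numeric, as the AWK version would).
total = 0.0
for line in sys.stdin:
    if re.search(r"ERROR", line):
        fields = line.split()
        if len(fields) >= 3:
            total += float(fields[2])
print(total)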

And the winner is (stay tuned as I haven't retired yet)
 
  • Like
Likes BvU and aaroman
  • #5
jim mcnamara said:
So, my point: new languages may fly for a while but a large number get discarded, especially as hardware limitations get moved further out.
As I see it, it's a bit like biology/evolution. Species 'creation' through big jumps is not something that successfully happens very often: usually slow evolution around the old basis is the way.

And then when a new land is opened up, species start to boom - like the internet, neural networks and so on, bringing forth new, specialized languages :wink:
 
  • Like
Likes jedishrfu
  • #6
Yeah, I remember the driving force is usually a young programmer who is dissatisfied with the current crop of languages and forges a "new" better language.

As an example:

SNOBOL for text processing
AWK for better text processing
PERL because the author didn't like AWK limitations
PYTHON because the author didn't like PERL
RUBY because the author didn't like PYTHON and tabbing
...

or

Smalltalk for OO programming
Objective-C giving C OO features
C/C++ because C was good for systems programming and OO gave it more power and the author didn't like Objective C
JAVA because the authors didn't like C/C++ multiple inheritance
GROOVY and SCALA because JAVA was too verbose and not flexible enough
and now KOTLIN as a truly better JAVA than JAVA

or

FORTRAN for engineering problems
APL for array processing and brevity via mathlike symbols
MATLAB because FORTRAN didn't do vector processing and APL was too arcane and needed its own keyboard
JULIA because MATLAB is proprietary and costs way too much for companies but its so darn useful

Hence the quote from George Bernard Shaw:

“The reasonable man adapts himself to the world: the unreasonable one persists in trying to adapt the world to himself. Therefore all progress depends on the unreasonable man.”

― George Bernard Shaw, Man and Superman
 
  • Like
Likes aaroman, Klystron and anorlunda
  • #7
When the Defense Department announced the Ada language, they said (my paraphrase) "In the history of DOD every project invented a new programming language. The exception was Jovial which was used on exactly two projects. Ada is intended to be the last word in computer languages."

Well, Ada did and does exist, but that public statement sounds very foolish in retrospect.

Even more persistent are new threads like this one offering the opportunity for programmers to talk about which languages they like or don't like. For example, my two cents worth.
  • If I was forced to read someone else's source code, Pascal was a marvel for clarity of intent, even without the comments.
  • I wrote more FORTRAN than anything else, and I even contributed to the Fortran 77 ANSI standard, but I hated it every step along the way. IMO, FORTRAN sucks. Most of the FORTRAN programs I wrote would have been much better in C, but FORTRAN was mandated.
  • The most fun programming I ever did was Robot Wars, using the Robot Wars "machine language". I competed in Robot Wars tournaments where contestants mailed in their programs, and the winner was the last surviving robot.
  • Another fun experience was a training simulator on the GEPAC4020 process computer. The models were written and initially tested in FORTRAN floating point, hand compiled into fixed point assembler, then debugged, tuned and patched in binary.
  • My most inspiring programming experience was reading Bjarne Stroustrup's book The C++ Programming Language, 1.0. Not writing C++, but reading that book.
 
  • #8
I'm going to disagree with the title's premise. The first high-level language was FORTRAN, from 1957. Sixty-three years later, it's still in use. More amazingly, it was developed for a 12,000 FLOPS machine, and will run on the first exaflop machines when they appear next year: a factor of more than 80 trillion in performance.
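(A quick sanity check on that factor: an exaflop machine does ##10^{18}## floating-point operations per second, so ##10^{18} / (1.2 \times 10^{4}) \approx 8 \times 10^{13}##, i.e. a bit over 80 trillion.)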

COBOL is still around, and as mentioned, isn't going anywhere.

BASIC is a special case. The B stands for "Beginners". The intent is for people to move on. Many of the 800 are attempts to "fix" BASIC so people don't have to move on. Removing some of the idiosyncrasies, like the confusing combination of text editor and interpreter, and adopting a more structured approach turns it into Pascal with BASIC keywords.

One might make the argument that there is not much desire for that. Clearly ~800 people feel there should be a desire for that.

C is a mere 48 years old. A whippersnapper. And SQL? Just a kid at 48.
 
  • Like
Likes BvU, Klystron, phinds and 1 other person
  • #9
Vanadium 50 said:
like the confusing combination of text editor and interpreter
Isn't that what we call IDE, integrated development environment? For a short while in the 90s, Visual Basic had an IDE so superior, that it made the choice of underlying language relatively unimportant. Then, other IDEs for other languages caught up.
 
  • #10
One could argue BASIC was an early attempt at an IDE, but the idea of mingling editing line numbers with execution line numbers was perhaps not the best idea the field ever had. As a side effect, it encouraged GOTO as a control structure when it is not really appropriate.
 
  • #11
Lots of good information on this thread. As someone who was a computer science professor for a few years, I can say that students mostly want whatever language is currently most popular. That's the difference between a coder and a computer scientist. A coder wants a quick class in the most popular language and will then need to be retrained every few years. A computer science major wants to learn as many languages as possible, whether popular or not. And job interviews emphasize data structures and usually let you code in whatever language you want.

I would always start my classes with this information and tell my students on the first day that I make my language and data structure classes as hard as I can. The rest of my classes were moderate. The second day of class I would always have 10% less enrollment. The MFT (Major Field Test) in computer science is a good measure of a graduating student's national standing.
 
  • Like
Likes aaroman
  • #12
An oldie but goldie that seems to apply well here too (using s/standard/programming language/gi):

standards.png
 
  • Haha
  • Like
Likes aaroman, Keith_McClary, DaveE and 4 others
  • #13
To reinforce some earlier posters: while studying for a Computer Science major, it should not be uncommon for students to be asked to define at least one or a few computer languages (also file systems, process scheduling algorithms, etc.) as a teaching tool. The success rate in the wider world is abysmal.
 
  • #14
Physics4Funn said:
That's the difference between a coder and a computer scientist.
Don't forget that "coding" used to mean punching up the cards/tape, not "programming". I suspect the modern definition is a jibe at the cut'n'paste section of the industry.
 
  • #15
hmmm27 said:
Don't forget that "coding" used to mean punching up the cards/tape, not "programming". I suspect the modern definition is a jibe at the cut'n'paste section of the industry.
I don't think so. I've done programming using a keypunch machine, and I've done coding on a computer screen. IMO, the two terms are more-or-less synonymous, with nothing to do with cut-and-paste.
 
  • Like
Likes .Scott, Klystron and jedishrfu
  • #17
Physics4Funn said:
Lots of good information on this thread. As someone who was a computer science professor for a few years, students mostly want the language that is currently most popular.

Interesting video on evolving popularity of computer languages:

 
  • Like
Likes BvU, Tom.G, Wrichik Basu and 2 others
  • #18
WOW! That is one cool video. Thanks for posting.
 
  • Like
Likes jedishrfu
  • #19
Mark44 said:
I don't think so. I've done programming using a keypunch machine, and I've done coding on a computer screen. IMO, the two terms are more-or-less synonymous, with nothing to do with cut-and-paste.
My bad ; my brain tends to drop bits these days.

My point was that you actually didn't have to be able to type in order to be a programmer.
 
  • Haha
Likes jedishrfu
  • #20
hmmm27 said:
My point was that you actually didn't have to be able to type in order to be a programmer.
Punching holes in cards was done with a keypunch machine, which had a standard typewriter keyboard, so of course you had to type in order to write programs. I took two programming classes, the first back in '72 and another in '80. Both used computer systems that took IBM (or Hollerith) cards as input. The way it worked was that several program decks ("jobs") were collected and fed into a card reader, which transcribed the contents onto a big tape reel. The tape reel was then mounted on the mainframe, and the jobs were compiled and run.

Here's a picture of a keypunch machine. The ones I used looked pretty much like this. BTW, the first class used PL/C, which seems to have gone by the wayside, but the second class used Fortran 77, which in newer versions is still very much alive.

keypunch-machine.jpg
 
  • Like
Likes BvU, aaroman and Wrichik Basu
  • #21
Oh, I just had a flashback to an argument I (EE) had with the CS guy for a real world HW product UI/Controller. He wanted to use Smalltalk on OS2, because it was cool at the time and he hated Microsoft. I wanted ANSI C/C++ on Windows, because I wanted a good supportable product. He won, then the product died an early death.
 
  • Haha
Likes jedishrfu
  • #22
hmmm27 said:
My point was that you actually didn't have to be able to type in order to be a programmer.
HUH ? Perhaps you mean that you don't need to be a touch typist but surely you can't really believe that you can, or ever could, write computer code without typing?
 
  • Haha
Likes jedishrfu
  • #23
phinds said:
HUH ? Perhaps you mean that you don't need to be a touch typist but surely you can't really believe that you can write computer code without typing?
Hunt'n'peck with lots of copy'n'paste.

Perhaps filling out these was "coding"?
FortranCodingForm.png
 
  • Like
Likes cobalt124
  • #24
jedishrfu said:
Java for everyday work via Netbeans IDE solid environment, good IDE, rather boring now
Back in school, we learned Java for four years. In the last year, we had to submit a huge project consisting of thirty programs (I submitted a total of 310 pages). In school, we used BlueJ, which was great because other schools were using Notepad. At home, I wrote all my programs in NetBeans. It was a wonderful experience back then - I could easily look at the documentation, and sometimes even peek at the full source code itself.

When I joined college, I wrote my first GUI application in Java Swing using NetBeans. The GUI was not complicated, but the programming was, because I had to work with date and time, and native Java, as we know, does not provide a date picker or time picker like Android. So I had to design all that using drop down menus. It was a tedious task, but during that project, I learned to use Java 8 Time libraries.

Then I got interested in Android. It took some time to actually start programming in Android, because our 32-bit desktop was unable to run Android Studio, and I had to wait for my laptop. Soon it was time to learn how to manage a project with Gradle. Especially when one is using third-party libraries, Gradle is the way to go because it is too time-consuming (albeit not impossible) to manage libraries manually. And then I found how easy coding was with the IDEs released by JetBrains - type one or two letters and you get a whole list of suggestions. It really reduced the coding time. Then I discovered IntelliJ IDEA. Although I don't do too much coding in native Java nowadays, if I have to make an application, I always turn to IntelliJ.

I have plans to learn JavaFX soon, and Gradle will come in handy because JavaFX libraries are no longer packaged with the JDK nowadays.
 
  • #25
The memories you all bring back. I remember transcoding some handwritten notes to punch card sheets so I could print them out via the line printer and on microfiche.

The keypunchers really liked the change of pace as typing words was easier than typing cryptic programming code (depending on the language of course) because they didn't need to worry as much about accuracy. It was govt work aka personal project and they helped with the bulk typing work.

The lead keypuncher was amazingly fast and would routinely overload her machine. One day a salesman set up a new machine that he said was a state of the art keypunch machine guaranteed to handle any speed. She sat down to test it and within seconds she had to wait while the machine struggled to finish punching the cards she entered. The salesman was somewhat embarrassed.

I also remember not being a touch typist, yet typing with the classic two-finger pecking and later naturally using a few more fingers in my own typing style. One programmer was very fast on the teletype machine due to his prior job of being a court stenographer. It was a time when anyone with the interest and aptitude could be hired as a programmer and trained by the company.
 
Last edited:
  • Like
Likes Keith_McClary and Wrichik Basu
  • #26
phinds said:
HUH ? Perhaps you mean that you don't need to be a touch typist but surely you can't really believe that you can, or ever could, write computer code without typing?

I wrote my first programs (which used Fortran) without typing. I posted the following in a restricted forum, so I'll share it more widely here.

My "back in the day" story.

Fortran was my first programming language, which I learned in two high school computer science courses from '76 to '78. My high school teacher was a CS grad from the University of Waterloo, so we did some good stuff, e.g., introductory numerical methods.

Running programs was quite an experience. "Back in the day", my high school didn't have computers. We penciled in bubbles on computer cards, and then sent our cards out by Greyhound bus to the nearest university, which would run our programs and send the cards and hard-copy results back by bus. Each program had an effective turnaround time of two to four days! After three days, you would find out that your program hadn't even run, because you penciled in a wrong bubble, causing a fatal syntax error. Result: "Execution suppressed."!

Our teacher, Mr. Fennell, managed things well, making sure that we were working on multiple projects and getting results back every day. A very positive experience for (the then young) me.
 
  • Like
  • Haha
Likes Wrichik Basu, jedishrfu and Keith_McClary
  • #27
My early days circa 1970 were very similar but we met only once a week for about 90 minutes at the GE Computer Center Explorer Scout meeting. I was luckier than most as I got two to three runs in on a night with the help of one of the off-shift computer supervisors. Others would only get one run. We all fought for the keypunch machine or used the sheets for the keypunchers to type it for us. I would get there early, do my edits, and get a run done.

In the end, my project was to plot some math functions using the line printer. The frustration was that, while it worked, my plots were always printed on a field of zeros. Years later, as a trained programmer for GE, I learned that I needed to initialize the print array with spaces.
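
For anyone who never hit that one: the gist, in a quick Python stand-in for the original Fortran (the curve and scaling are made up), is that the character grid has to be filled with blanks before you poke the plot characters into it - otherwise the curve prints on a field of whatever the array held, e.g. zeros.

Code:
# Line-printer-style plot of a simple curve on a character grid.
WIDTH, HEIGHT = 20, 10
grid = [[" "] * WIDTH for _ in range(HEIGHT)]   # initialize with blanks first -
                                                # the step my younger self skipped
for x in range(WIDTH):
    y = (x * x) // 45                           # crude scaling so the curve fits
    if y < HEIGHT:
        grid[HEIGHT - 1 - y][x] = "*"           # row 0 is the top of the "page"
for row in grid:
    print("".join(row))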

The one positive of the experience was that I was immediately hired right out of college by GE as a poster child for the success of the Explorer Post Program as I was the very first graduate of the program that came back to the fold.
 
  • #28
Also in the 70s, I once caught a malfunctioning punch card reader. The cards did not always come out in the same order as they went in.

Imagine the angst of those poor programmers who were trying to debug their programs on that machine. Every time they made a new run, the symptoms would worsen. 😩
 
  • Like
Likes jedishrfu
  • #29
That’s a very sad story. I think though for source code they would have gotten a listing that would show card order and possibly a compile error depending on the physical malfunction. But for object decks maybe a checksum would catch it.
 
  • Like
Likes Klystron
  • #30
anorlunda said:
The cards did not always come out in the same order as they went in.
In the 70's it was already standard (CYA) practice to use a marking pen to put an angled stripe on the edge of a card deck. Priceless when you dropped an inch thick deck!

Patches triggered a different color stripe; it was not unusual to see a deck with a 4- or 5-stripe rainbow.

Those punch card coding sheets that @Keith_McClary showed in post #23 (https://www.physicsforums.com/posts/6392751) came in handy in the late 70s as layout sheets for the 80x24 CRT displays of the time. I think I still have a pad of them around here ... somewhere.

Going back to @Mark44's post #20 (https://www.physicsforums.com/posts/6392734), those keypunch machines look like the IBM-029 machines I used when learning Fortran. Of course there were more students than machines, so there were a few of the predecessor IBM-026 keypunches also available. That was a good incentive to get to class early, otherwise you were stuck on the ornery, clumsy -026's!

Back in 1975 my wife and I built an ALTAIR 8800 computer from a kit. Eight-bit 8080 CPU with a 2 MHz clock, program entry by flipping switches on the front panel. Last year, just for the heck of it and after replacing all the filter caps, we fired it up and it worked!

Tom

(have yet to figure out how to copy the pictures from a post)
 
  • Wow
Likes Keith_McClary
  • #31
Keith_McClary said:
Hunt'n'peck with lots of copy'n'paste.

Perhaps filling out these was "coding"?
View attachment 269464
Well, filling those out was just passing the typing on to someone else. SOMEONE has to do the typing, unless you are talking about one of the really early hobby machines that could be programmed through switches on the front panel.
 
  • #32
Tom.G said:
In the 70's it was already standard (CYA) practice to use a marking pen to put an angled stripe on the edge of a card deck. Priceless when you dropped an inch thick deck!
You bet'cher bippy. Saved my butt more than once with that (I'm talking about decks a foot long or more in a carrier tray).
 
  • Like
Likes Keith_McClary
  • #33
I have nothing special to add to a lot of the really good perspectives here on a very interesting question. Language is "organic" and, I think, best understood in the context of biology and the theory of evolution, rather than from the perspective of "programmers" and "language aficionados". Not that the pros don't have great insights and aren't worth reading on this issue. But like life: we learned a lot more from the scientists than from mythology or religion. That does not mean one is wise to "kill the old gods", as Nietzsche indicated had been done, leaving fertile ground for the "new man" like Hitler, or Mao, or Stalin. Jettisoning the Pope does not mean you are now in Paradise!
 
  • #34
That video used a strange measurement for "popularity": 'popularity is defined by percentage of programmers with either proficiency in specific language or currently learning/mastering one.'

That is not what I would use. Much better to look at one of: 1) installed code base (needing maintenance), or 2) job ads (what people are hiring for)

Certain languages such as Ruby are wildly hyped and their reputation guarded but, being unstandardized and not having high performance, they are useless for a lot of real-world high-performance applications. I think by using a different criterion for "popularity", those numbers would look quite different.

If you are a student and have access to the online library of a good university (I am retired and no longer have such access), take a look at the Gartner Research reports on language popularity. I'm certain you'll see very different results than the video above gave, and Gartner has been the go-to source for such information for decades now.
 
  • #35
Ruby and Java have one enormous strength. There is usually one best way to do something, and while it's possible to deviate from this, it's definitely swimming upstream. Makes your code much more understandable to others. Ruby and Java also have one enormous weakness. There is usually one best way to do something, and while it's possible to deviate from this, it's definitely swimming upstream.
 
  • Like
  • Haha
Likes pbuk and Tom.G
