
Insights Computer Language Primer - Part 1 - Comments

  1. Feb 9, 2017 #1

    phinds

    Gold Member

  2. Feb 9, 2017 #2

    jim mcnamara


    Staff: Mentor

    Very well done. Every language discussion thread ('what language should I use?') should have a reference back to this article. Too many statements in those threads are off target because posters have no clue about origins.

    Typo in the Markup language section: "Markup language are"

    Thanks for a good article.
     
  3. Feb 9, 2017 #3

    anorlunda

    Staff: Mentor

    @phinds , thanks for the trip down memory lane. That was fun to read.

    I too started in the machine language era. My first big project was a training simulator (think of a flight simulator), done in binary on a computer that had no keyboard and no printer. All coding and debugging was done with the front-panel lights and switches. I had to learn to read floating point numbers in binary.

    Then there was the joy of the most powerful debugging technique of all: the hex dump (or octal dump) of the entire memory, printed on paper. That captured the full state of the code & data, and there was no bug that could not be found if you just endured enough tedium.

    But even that paled compared to the generation just before my time. They had to work on "stripped program" machines. Instructions were read from the drum one at a time and executed. To do a branch might mean (worst case) waiting for one full revolution of the drum before the next instruction could be executed. To program them, you not only had to decide what the next instruction would be, but also where on the drum it was stored. Those choices made an enormous difference in execution speed. My boss did a complete boiler control system on such a computer that had only 3x24 bit registers and zero RAM. And he had stories about the generation before him, who programmed the IBM 650 using plugboards.

    In boating we say, "No matter how big your boat, someone else has a bigger one." In this field we can say, "No matter how crusty your curmudgeon credentials, there's always an older crustier guy somewhere."
     
  4. Feb 9, 2017 #4

    phinds

    Gold Member

    Thanks Jim. Several people gave me some feedback and @Mark44 went through it line by line and found lots of my typos and poor grammar. The one you found is one I snuck in after he looked at it :smile:
     
  5. Feb 9, 2017 #5

    phinds

    Gold Member

    Oh, I didn't even get started on the early days. No mention of punched card decks, teletypes, paper tape machines and huge clean rooms with white-coated machine operators to say nothing of ROM burners for writing your own BIOS in the early PC days, and on and on. I could have done a LONG trip down memory lane without really telling anyone much of any practical use for today's world, but I resisted the urge :smile:

    My favorite "log cabin story" is this: one of my early mini-computer jobs, well after I had started working on mainframes, was on a paper-tape I/O machine. To do a full edit-assemble-run cycle, you had to (and I'm probably leaving out a step or two, and ALL of this is on a very slow paper tape reader/punch, easily taking a full afternoon): load the editor from paper tape, use it to load your source tape, use the teletype to do the edit, punch a new source tape, load the assembler, use it to load the modified source tape and punch an object tape, load the loader, use it to load the object tape, run the program, realize you had made another code mistake, go out and get drunk.
     
    Last edited: Feb 9, 2017
  6. Feb 9, 2017 #6

    rcgldr

    Homework Helper

    Mainframe assemblers have included fairly advanced macro capability going back to the 1960s. In the case of IBM mainframes, there were macro functions that operated on IBM database types, such as ISAM (indexed sequential access method), which was part of the reason for the strange mix of assembly and Cobol in the same programs, which, due to legacy issues, still exists somewhat today. Microsoft assemblers and other assemblers for mini / micro computers had/have macros, and MASM 6.11 includes some higher level language concepts with dot directives like .if .else .endif .while ... .

    Self-modifying code - this was utilized on some older computers, like the CDC 3000 series, which included a store instruction that modified only the address field of another instruction, essentially turning an instruction into an instruction + modifiable pointer. IBM 360 type mainframes use a similar concept: the "EX" (execute) instruction overrides an otherwise fixed operand field of a target instruction, such as changing the number of bytes to move on a move character instruction (MVC).
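
    To make the idea concrete, here is a toy C simulation of that CDC-style store-into-the-address-field trick. The "instruction set" below is invented purely for illustration (this is not actual CDC or IBM code, just the shape of the technique):

    Code:
    #include <stdio.h>

    enum opcode { LOAD, STORE_ADDR, PRINT, HALT };

    struct instr {
        enum opcode op;
        int addr;                 /* the "address field" */
    };

    int memory[8] = { 10, 20, 30, 40, 50, 3, 70, 80 };

    int main(void) {
        /* program[1] patches the address field of program[2] at run
           time, turning PRINT into an instruction + modifiable pointer */
        struct instr program[] = {
            { LOAD,       5 },    /* acc = memory[5] (= 3)             */
            { STORE_ADDR, 2 },    /* program[2].addr = acc             */
            { PRINT,      0 },    /* prints memory[3], not memory[0]!  */
            { HALT,       0 },
        };
        int acc = 0;
        for (int pc = 0; program[pc].op != HALT; pc++) {
            switch (program[pc].op) {
            case LOAD:       acc = memory[program[pc].addr];           break;
            case STORE_ADDR: program[program[pc].addr].addr = acc;     break;
            case PRINT:      printf("%d\n", memory[program[pc].addr]); break;
            default:         break;
            }
        }
        return 0;
    }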

    Event driven programming is part of most pre-emptive operating systems and applications, and time sharing systems / applications, again dating back to the 1960s.
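
    The core of it is just a loop that pulls events off a queue and dispatches them to registered handlers. A minimal C sketch of that shape (the event types and handlers are made up for illustration; a real system would block waiting for interrupts or messages rather than use a canned queue):

    Code:
    #include <stdio.h>

    enum event_type { EV_KEY, EV_TIMER, EV_QUIT, EV_COUNT };

    typedef void (*handler_fn)(int data);

    static void on_key(int data)   { printf("key: %c\n", (char)data); }
    static void on_timer(int data) { printf("tick: %d\n", data); }

    int main(void) {
        handler_fn handlers[EV_COUNT] = { 0 };
        handlers[EV_KEY]   = on_key;      /* register handlers */
        handlers[EV_TIMER] = on_timer;

        /* canned event queue, for illustration only */
        struct { enum event_type type; int data; } queue[] = {
            { EV_KEY, 'a' }, { EV_TIMER, 1 }, { EV_KEY, 'b' }, { EV_QUIT, 0 },
        };

        for (int i = 0; queue[i].type != EV_QUIT; i++)
            if (handlers[queue[i].type])
                handlers[queue[i].type](queue[i].data);   /* dispatch */
        return 0;
    }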

    Another advancement in programming was tool sets that generated code. Prototyper for the Macintosh was an early example: the developer would design a user interface using a drag-and-drop tool set, and Prototyper would generate the code, to which the developer would then add code in order to create an application. Visual Studio includes this feature and, in the case of Visual Basic, includes the ability to generate code for charts and graphs.
     
    Last edited: Feb 9, 2017
  7. Feb 9, 2017 #7

    phinds

    Gold Member

    Yes, there are a TON of such fairly obscure points that I could have brought in, but the article was too long as is and that level of detail was just out of scope.
     
  8. Feb 9, 2017 #8
    Great part 1 phinds!
     
  9. Feb 9, 2017 #9

    phinds

    Gold Member

    AAACCCKKKKK !!! That reminds me that now I have to write part 2. Damn !
     
  10. Feb 9, 2017 #10

    Mark44

    Staff: Mentor

    I thought it was spelled ACK...

    (As opposed to NAK)
     
  11. Feb 9, 2017 #11

    QuantumQuest

    Science Advisor
    Gold Member

    Very well done, phinds, thanks for this Insight! Judging from this part 1, it gives a very good general picture, touching on many important things.
     
  12. Feb 9, 2017 #12

    phinds

    Gold Member

    No, that's a minor ACK. Mine was a heartfelt, major ACK, generally written as AAACCCKKKKK !!!
     
  13. Feb 9, 2017 #13

    256bits

    Gold Member

    No choice.
    You have sent an ASCII 06.
    Full Duplex mode?
    ASCII 07 will get the attention for part 2.
     
  14. Feb 9, 2017 #14

    phinds

    Gold Member

    Yeah but part 2 is likely to be the alternate interpretation of the acronym for ASCII 08
     
  15. Feb 9, 2017 #15

    rcgldr

    Homework Helper

    Some comments:

    No mention of plugboard programming.

    http://en.wikipedia.org/wiki/Plugboard

    "8080 instruction guide (the CPU on which the first IBM PCs were based)." The first IBM PC's were based on 8088, same instruction set as 8086, but only an 8 bit data bus. The Intel 8080, 8085, and Zilog Z80 were popular in the pre-PC based systems, such as S-100 bus systems, Altair 8000, TRS (Tandy Radio Shack) 80, ..., mostly running CP/M (Altair also had it's own DOS). There was some overlap as the PC's didn't sell well until the XT and later AT.

    APL and Basic are interpretive languages. The section title is "high level languages", so perhaps it could mention that these include both compiled and interpretive languages.

    RPG and RPG II were high level languages, popular for a while for conversion from plugboard based systems to standard computer systems. As with plugboard programming, input-to-output field operations were described, but there was no determined ordering of those operations; in theory, they could be performed in parallel.
     
  16. Feb 9, 2017 #16

    phinds

    Gold Member

    Nuts. You are right, of course. I spent so much time programming the 8080 on CP/M systems that I forgot that IBM went with the 8088. I'll make a change. Thanks.
     
  17. Feb 9, 2017 #17

    phinds

    Gold Member

    The number of things that I COULD have brought in, that have little or no relevance to modern computing, would have swamped the whole article.
     
  18. Feb 9, 2017 #18
    This looks like an interesting topic. I hope there will be discussion about which languages are most widely used in the math and science community, since this is after all a physics forum.

    One thing I noticed is that although you mention LISP, you did not mention the topic of languages for artificial intelligence. I did not see any mention of Prolog, which was the main language for the famous 5th Generation Project in Japan.

    It could also be useful to discuss functional programming languages or functional programming techniques in general.

    I think it would be interesting to see the latest figures on which are the most popular languages, and why they are so popular. The last time I looked the top three were Java, C, and C++. But that's just one survey, and no doubt some surveys come up with a different result.

    Since you mention object-oriented programming and C++, how about also mentioning Simula, the language that started it all, and Smalltalk, which took OOP to what some consider an absurd level.

    Finally, I do not see any mention of Pascal, Modula, and Oberon. The work on this family of languages by Prof. Wirth is one of the greatest accomplishments in the history of computer languages.

    In any case, I look forward to the discussion.
     
  19. Feb 10, 2017 #19
    I think the level you kept it at was excellent. Not easy to do.

    Meanwhile I have a couple of possibly dumb questions about following Insights articles; I will ask them here since Insights seems to be separate from the main forum & I didn't find any help articles in the sitemap for Insights.

    1) How in heck does one "vote" on an article? I see at the top of the page "You must sign in to vote"; but (a) I am already signed into the forum, and (b) clicking Register just brings me back to the forum, so (c) eh??

    2) I wish there were a way built into Insights to bookmark or "favorite" articles, just as one can watch forum threads. I can follow the comments thread for an article, which is almost as good; but "favoriting" articles would be a nice feature.
     
  20. Feb 10, 2017 #20

    stevendaryl

    Staff Emeritus
    Science Advisor

    Something that I was never clear on is what makes a "scripting language" different from an "interpreted language". I don't see much difference in principle between Javascript and Python, on the scripting side, and Java, on the interpreted side, other than the fact that the scripting languages tend to be a lot more loosey-goosey about typing.
     
  21. Feb 10, 2017 #21

    DrClaude


    Staff: Mentor

    Nice article, @phinds!

    I would like to point out that some interpreted languages, such as MATLAB, have moved to the JIT (just-in-time) model, where some parts are compiled instead of simply being interpreted.

    Also, Fortran is still in use, and not only because of legacy code. First, there are older physicists like me who never got the hang of C++ or Python. Second, many physical problems translate more simply to Fortran than to other compiled languages, making development faster.
     
  22. Feb 10, 2017 #22
    Others more knowledgeable than me will no doubt reply, but I get the sense that scripting languages are a subset of interpreted ones. So something like Python gets called both, while Java is only interpreted (compiled into bytecode, as Python also is) and not a scripting language.

    https://en.wikipedia.org/wiki/Scripting_language
     
  23. Feb 10, 2017 #23

    jedishrfu

    Staff: Mentor

    Well done Phinds!

    One word of clarification on the history of markup: while HTML is considered to be the first markup language, it was in fact adapted by Sir Tim Berners-Lee from the SGML (1981-1986) standard of Charles Goldfarb:

    https://en.wikipedia.org/wiki/Standard_Generalized_Markup_Language

    And the SGML (1981-1986) standard was in fact an outgrowth of GML (1969), found in an IBM product called BookMaster, again developed by Charles Goldfarb, who was trying to make it easier to use SCRIPT (1968), a lower level document formatting language:

    https://en.wikipedia.org/wiki/IBM_Generalized_Markup_Language

    https://en.wikipedia.org/wiki/SCRIPT_(markup)

    In between GML (1969) and SGML (1981-1986), Brian Reid developed SCRIBE (1980) for his doctoral dissertation, and both SCRIBE and SGML were presented at the same conference (1981). Scribe is considered to be the first to separate presentation from content, which is the basis of markup:

    https://en.wikipedia.org/wiki/Scribe_(markup_language)

    And then in 1981, Richard Stallman developed TEXINFO because SCRIBE (1980) had become a proprietary language:

    https://en.wikipedia.org/wiki/Texinfo

    These early markup languages, GML (1969), SCRIBE (1980), SGML (1981), and TEXINFO (1981), were the first to separate presentation from content.

    Before that, there were the page formatting languages SCRIPT (1968) and SCRIPT's predecessor TYPSET/RUNOFF (1964):

    https://en.wikipedia.org/wiki/TYPSET_and_RUNOFF

    RUNOFF was so named from "I'll run off a copy for you."

    All of these languages derived from printer control codes (1958?):

    https://en.wikipedia.org/wiki/ASA_carriage_control_characters
    https://en.wikipedia.org/wiki/IBM_Machine_Code_Printer_Control_Characters

    So basically the evolution was:
    - program controlled printer control (1958)
    - report formatting via Runoff (1964)
    - higher level page formatting macros Script (1968)
    - intent based document formatting GML (1969)
    - separation of presentation from content via SCRIBE (1980)
    - standardized document formatting SGML (1981 finalized 1986)
    - web document formatting HTML (1993)
    - structured data formatting XML (1996)
    - markdown style, John Gruber and Aaron Swartz (2004)

    https://en.wikipedia.org/wiki/Comparison_of_document_markup_languages

    and back to pencil and paper...

    A Printer Code Story
    ------------------------

    Lastly, the printer codes were always an embarrassing nightmare for the newbie Fortran programmer who would write a thoroughly elegant program that generated a table of numbers and columnized them to save paper, only to find he'd printed a 1 in column 1 (the page-eject code, so every such line started a new page) and to receive a box or two of fanfold paper with a note from the printer operator not to do it again.
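
    For anyone who never met them: the first character of each print record was never printed, it selected paper motion. A rough C emulation of the ASA convention from the link above (the codes are from that page; rendering them with \f and \n is my own approximation):

    Code:
    #include <stdio.h>

    /* ASA carriage control: column 1 selects paper motion BEFORE printing */
    static void print_record(const char *rec) {
        switch (rec[0]) {
        case '1': printf("\f");     break;   /* eject to top of next page */
        case '0': printf("\n\n");   break;   /* double space              */
        case '-': printf("\n\n\n"); break;   /* triple space              */
        case '+': printf("\r");     break;   /* no advance: overprint     */
        default:  printf("\n");     break;   /* ' ': normal single space  */
        }
        fputs(rec + 1, stdout);              /* column 1 itself never prints */
    }

    int main(void) {
        print_record(" A NICELY COLUMNIZED TABLE");
        print_record("1234   5678");   /* oops: the leading '1' ejects a page */
        return 0;
    }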

    I’m sure I’ve left some history out here.

    - Jedi
     
    Last edited: Feb 10, 2017
  24. Feb 10, 2017 #24

    phinds

    Gold Member

    This was NOT intended as a thoroughly exhaustive discourse. If you look at the Wikipedia list of languages you'll see that I left out more than I put in, but that was deliberate.

    And yes I could have written thousands of pages on all aspects of computing. I chose not to.

    See above

    Pascal is listed but not discussed. See above



    Basically, I think most people see "scripting" in two ways. First there is, for example, BASIC, which is an interpreted computer language, and second there is, for example, Perl, which is a command language. The two are quite different, but I'm not going to get into that here. It's easy to find on the internet.

    NUTS, again. Yes, you are correct. I actually found all that out AFTER I had done the "final" edit and just could not stand the thought of looking at the article for the 800th time, so I left it in. I'll make a correction. Thanks.
     
  25. Feb 10, 2017 #25
    You can read the whole Wikipedia articles on "scripting language" and "interpreted language", but they do not really provide a clear answer to your question. In fact, I think there is no definition that would clearly separate scripting from non-scripting languages, or interpreted from non-interpreted languages. Here are a couple of quotes from Wikipedia:

    "A scripting or script language is a programming language that supports scripts; programs written for a special run-time environment that automate the execution of tasks that could alternatively be executed one-by-one by a human operator."

    "The terms interpreted language and compiled language are not well defined because, in theory, any programming language can be either interpreted or compiled."

    Speaking of scripting languages, consider Lua, which is the most widely used scripting language for game development. Within a development team, some programmers may only need to work at the Lua script level, without ever needing to modify and recompile the core engine. For example, how a certain game character behaves might be controlled by a Lua script. This sort of scripting could also be made accessible to the end users. But Lua is not an interpreted language.
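
    For concreteness, here is roughly what that arrangement looks like from the engine side, using the standard Lua C API (lua.h / lauxlib.h / lualib.h are real; the script text and the function name next_move are invented for illustration, and a real engine would load the script from a file with luaL_dofile):

    Code:
    /* build with something like: cc host.c -llua -lm */
    #include <lua.h>
    #include <lualib.h>
    #include <lauxlib.h>
    #include <stdio.h>

    int main(void) {
        lua_State *L = luaL_newstate();   /* the embedded interpreter */
        luaL_openlibs(L);

        /* behavior script, editable without recompiling the engine */
        const char *script =
            "function next_move(distance)\n"
            "  if distance < 5 then return 'attack' end\n"
            "  return 'patrol'\n"
            "end\n";
        if (luaL_dostring(L, script) != 0) {
            fprintf(stderr, "script error: %s\n", lua_tostring(L, -1));
            return 1;
        }

        /* engine calls into the script to decide the character's action */
        lua_getglobal(L, "next_move");
        lua_pushnumber(L, 3.0);
        if (lua_pcall(L, 1, 1, 0) == 0)
            printf("action: %s\n", lua_tostring(L, -1));

        lua_close(L);
        return 0;
    }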

    LISP is not a scripting language. On the other hand, LISP is an interpreted language, but it can also be compiled. You might spend most of your development time working in the interpreter, but once some code is nailed down you might compile it for greater speed, or because you are releasing a compiled version for use by others.

    Now perhaps someone will jump in and say "LISP can in fact be a scripting language", etc. I would not respond. ;)
     