Insights Computer Language Primer - Part 1 - Comments

  1. Feb 9, 2017 #1

    phinds

    Gold Member
    2016 Award

  3. Feb 9, 2017 #2

    jim mcnamara


    Staff: Mentor

    Very well done. Every language discussion thread ('what language should I use?') should have a reference back to this article. Too many statements in those threads are off target because posters have no clue about origins.

    Typo in the Markup language section: "Markup language are"

    Thanks for a good article.
     
  4. Feb 9, 2017 #3

    anorlunda

    Science Advisor
    Gold Member

    @phinds , thanks for the trip down memory lane. That was fun to read.

    I too started in the machine language era. My first big project was a training simulator (think of a flight simulator) done in binary on a computer that had no keyboard, no printer. All coding and debugging was done in binary with those lights and switches. I had to learn to read floating point numbers in binary.

    Then there was the joy of the most powerful debugging technique of all: the hex dump (or octal dump) of the entire memory, printed on paper. That captured the full state of the code and data, and there was no bug that could not be found if you were willing to wade through enough tedium to find it.

    But even that paled compared to the generation just before my time. They had to work on "stripped program" machines. Instructions were read from the drum one at a time and executed. To do a branch might mean (worst case) waiting for one full revolution of the drum before the next instruction could be executed. To program them, you not only had to decide what the next instruction would be, but also where on the drum it was stored. Those choices made an enormous difference in speed of execution. My boss did a complete boiler control system on such a computer, which had only three 24-bit registers and no RAM at all. And he had stories about the generation before him that programmed the IBM 650 using plugboards.

    In boating we say, "No matter how big your boat, someone else has a bigger one." In this field we can say, "No matter how crusty your curmudgeon credentials, there's always an older crustier guy somewhere."
     
  5. Feb 9, 2017 #4

    phinds

    Gold Member
    2016 Award

    Thanks Jim. Several people gave me some feedback and @Mark44 went through it line by line and found lots of my typos and poor grammar. The one you found is one I snuck in after he looked at it :smile:
     
  6. Feb 9, 2017 #5

    phinds

    Gold Member
    2016 Award

    Oh, I didn't even get started on the early days. No mention of punched card decks, teletypes, paper tape machines and huge clean rooms with white-coated machine operators to say nothing of ROM burners for writing your own BIOS in the early PC days, and on and on. I could have done a LONG trip down memory lane without really telling anyone much of any practical use for today's world, but I resisted the urge :smile:

    My favorite "log cabin story" is this: one of my early mini-computer jobs, well after I had started working on mainframes, was on a paper tape I/O machine. To do a full cycle, you had to (and I'm probably leaving out a step or two, and ALL of this is using a very slow paper tape reader/punch, easily taking a full afternoon for these steps) load the editor from paper tape, use it to load your source paper tape, use the teletype to do the edit, output a new source tape, load the assembler, use it to load the modified source tape and output an object tape, load the loader, use it to load the object tape, run the program, realize you had made another code mistake, and go out and get drunk.
     
    Last edited: Feb 9, 2017
  7. Feb 9, 2017 #6

    rcgldr

    Homework Helper

    Mainframe assemblers included fairly advanced macro capabilities going back to the 1960s. In the case of IBM mainframes, there were macro functions that operated on IBM data access methods, such as ISAM (indexed sequential access method), which was part of the reason for the strange mix of assembly and Cobol in the same programs, a mix which, due to legacy issues, still exists somewhat today. Microsoft assemblers and other assemblers for mini / micro computers had (and have) macros, and MASM 6.11 includes some higher level language concepts with dot directives like .if, .else, .endif, .while, and so on.
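
    As a rough analogy (in C rather than in mainframe assembler, and purely illustrative; the names here are invented), a preprocessor macro shows the same idea of a macro expanding into repetitive code at build time:

        #include <stdio.h>

        /* The macro expands into a counter variable and a helper function,
           so the programmer writes one line instead of several. */
        #define DEFINE_COUNTER(name)                              \
            static int name##_count = 0;                          \
            static void name##_bump(void) { name##_count++; }

        DEFINE_COUNTER(read)   /* expands into read_count and read_bump()   */
        DEFINE_COUNTER(write)  /* expands into write_count and write_bump() */

        int main(void)
        {
            read_bump();
            read_bump();
            write_bump();
            printf("reads=%d writes=%d\n", read_count, write_count);
            return 0;
        }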

    Self-modifying code: this was utilized on some older computers, like the CDC 3000 series, which included a store instruction that modified only the address field of another instruction, essentially turning that instruction into an instruction plus a modifiable pointer. IBM 360 type mainframes use a similar-in-concept instruction, "EX" (execute), to override an otherwise fixed operand field of the target instruction, such as changing the number of bytes to move on a move character instruction (MVC).
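
    To make that idea concrete without real machine code, here is a purely illustrative C sketch (the opcodes and layout are invented, not the CDC or IBM instruction sets): a toy "machine" whose patch instruction rewrites the address field of a later instruction, much like the store-into-address-field trick described above.

        #include <stdio.h>

        enum { OP_PRINT, OP_PATCH_ADDR, OP_HALT };

        struct instr { int op; int addr; };

        int main(void)
        {
            int mem[4] = { 10, 20, 30, 40 };

            struct instr prog[] = {
                { OP_PRINT,      0 },  /* prints mem[0]                         */
                { OP_PATCH_ADDR, 3 },  /* rewrites the address field of prog[2] */
                { OP_PRINT,      0 },  /* address field becomes 3 at run time   */
                { OP_HALT,       0 },
            };

            for (int pc = 0; prog[pc].op != OP_HALT; pc++) {
                switch (prog[pc].op) {
                case OP_PRINT:
                    printf("mem[%d] = %d\n", prog[pc].addr, mem[prog[pc].addr]);
                    break;
                case OP_PATCH_ADDR:
                    prog[pc + 1].addr = prog[pc].addr;  /* modify a later instruction */
                    break;
                }
            }
            return 0;
        }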

    Event driven programming is part of most pre-emptive operating systems and applications, and time sharing systems / applications, again dating back to the 1960s.
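
    A minimal sketch of the event-driven idea (the event names and the pre-filled queue are invented; a real system would receive events from interrupts or devices):

        #include <stdio.h>

        typedef void (*handler_fn)(int data);

        static void on_key(int data)   { printf("key pressed: %c\n", data); }
        static void on_timer(int data) { printf("timer fired after %d ms\n", data); }

        struct event { handler_fn handler; int data; };

        int main(void)
        {
            /* a pre-filled queue stands in for events arriving from hardware */
            struct event queue[] = {
                { on_key,   'a' },
                { on_timer,  50 },
                { on_key,   'b' },
            };
            int count = sizeof queue / sizeof queue[0];

            /* the event loop: dispatch each event to its handler */
            for (int i = 0; i < count; i++)
                queue[i].handler(queue[i].data);

            return 0;
        }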

    Another advancement in programming was tool sets that generated code. Prototyper for the Macintosh was an early example. The developer would design a user interface with a drag-and-drop tool set, and Prototyper would generate the code, to which the developer would then add code in order to create an application. Visual Studio includes this feature, and in the case of Visual Basic, includes the ability to generate code for charts and graphs.
     
    Last edited: Feb 9, 2017
  8. Feb 9, 2017 #7

    phinds

    Gold Member
    2016 Award

    Yes, there are a TON of such fairly obscure points that I could have brought in, but the article was too long as is and that level of detail was just out of scope.
     
  9. Feb 9, 2017 #8
    Great part 1 phinds!
     
  10. Feb 9, 2017 #9

    phinds

    Gold Member
    2016 Award

    AAACCCKKKKK !!! That reminds me that now I have to write part 2. Damn !
     
  11. Feb 9, 2017 #10

    Mark44

    Staff: Mentor

    I thought it was spelled ACK...

    (As opposed to NAK)
     
  12. Feb 9, 2017 #11

    QuantumQuest

    Gold Member

    Very well done, thanks phinds for this insight! Judging from this part 1, it gives a very good general picture touching upon many important things.
     
  13. Feb 9, 2017 #12

    phinds

    Gold Member
    2016 Award

    No, that's a minor ACK. Mine was a heartfelt, major ACK, generally written as AAACCCKKKKK !!!
     
  14. Feb 9, 2017 #13
    No choice.
    You have sent an ASCII 06.
    Full Duplex mode?
    ASCII 07 will get the attention for part 2.
     
  15. Feb 9, 2017 #14

    phinds

    Gold Member
    2016 Award

    Yeah but part 2 is likely to be the alternate interpretation of the acronym for ASCII 08
     
  16. Feb 9, 2017 #15

    rcgldr

    Homework Helper

    Some comments:

    No mention of plugboard programming.

    http://en.wikipedia.org/wiki/Plugboard

    "8080 instruction guide (the CPU on which the first IBM PCs were based)." The first IBM PC's were based on 8088, same instruction set as 8086, but only an 8 bit data bus. The Intel 8080, 8085, and Zilog Z80 were popular in the pre-PC based systems, such as S-100 bus systems, Altair 8000, TRS (Tandy Radio Shack) 80, ..., mostly running CP/M (Altair also had it's own DOS). There was some overlap as the PC's didn't sell well until the XT and later AT.

    APL and Basic are interpreted languages. The section title is "high level language", so perhaps it could mention that high level languages include both compiled and interpreted languages.

    RPG and RPG II were high level languages, popular for a while for converting plugboard based systems to standard computer systems. Similar to plugboard programming, input-to-output field operations were described, but there was no determined ordering of those operations; in theory, these operations could be performed in parallel.
     
  17. Feb 9, 2017 #16

    phinds

    Gold Member
    2016 Award

    Nuts. You are right, of course. I spent so much time programming the 8080 on CP/M systems that I forgot that IBM went with the 8088. I'll make a change. Thanks.
     
  18. Feb 9, 2017 #17

    phinds

    Gold Member
    2016 Award

    The number of things that I COULD have brought in, that have little or no relevance to modern computing, would have swamped the whole article.
     
  19. Feb 9, 2017 #18
    This looks like an interesting topic. I hope there will be discussion about which languages are most widely used in the math and science community, since this is after all a physics forum.

    One thing I noticed is that although you mention LISP, you did not mention the topic of languages for artificial intelligence. I did not see any mention of Prolog, which was the main language for the famous 5th Generation Project in Japan.

    It could also be useful to discuss functional programming languages or functional programming techniques in general.

    I think it would be interesting to see the latest figures on which are the most popular languages, and why they are so popular. The last time I looked the top three were Java, C, and C++. But that's just one survey, and no doubt some surveys come up with a different result.

    Since you mention object-oriented programming and C++, how about also mentioning Simula, the language that started it all, and Smalltalk, which took OOP to what some consider an absurd level.

    Finally, I do not see any mention of Pascal, Modula, and Oberon. The work on this family of languages by Prof. Wirth is one of the greatest accomplishments in the history of computer languages.

    In any case, I look forward to the discussion.
     
  20. Feb 10, 2017 #19
    I think the level you kept it at was excellent. Not easy to do.

    Meanwhile I have a couple of possibly dumb questions about following Insights articles; I will ask them here since Insights seems to be separate from the main forum & I didn't find any help articles in the sitemap for Insights.

    1) How in heck does one "vote" on an article? I see at the top of the page "You must sign in to vote"; but (a) I am already signed into the forum, and (b) clicking Register just brings me back to the forum, so (c) eh??

    2) I wish there were a way built into Insights to bookmark or "favorite" articles, just as one can watch forum threads. I can follow the comments thread for an article, which is almost as good; but "favoriting" articles would be a nice feature.
     
  21. Feb 10, 2017 #20

    stevendaryl

    Staff Emeritus
    Science Advisor

    Something that I was never clear on is what makes a "scripting language" different from an "interpreted language". I don't see that much difference in principle between JavaScript and Python, on the scripting side, and Java, on the interpreted side, other than that the scripting languages tend to be a lot more loosey-goosey about typing.
     