
C/++/# Modern programming languages causing cognitive dissonance

  1. Nov 17, 2015 #1
Anyone else experience extreme cognitive dissonance when learning modern programming languages? If so, have you visited a psychiatrist about it?

    1) Automatic type deduction. The idea that I can be too lazy to figure out the exact type returned from a function, and put var or auto in its place, feels wrong, like committing a crime.

2) Lambdas. I'm used to hours spent going, "Ok, great, I could use a function here, so I'll go to my namespace Helpers and implement and name the function inside there." There is something unnerving about the idea that I can write unnamed functions on-the-fly. There has to be a catch ...
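For what it's worth, `auto` loses nothing: the type is deduced at compile time and is exactly the one you would have written by hand. A minimal sketch (the function and variable names here are mine, purely illustrative):

```cpp
#include <map>
#include <string>

// `auto` deduces std::map<std::string, int>::const_iterator here --
// the same static type you would otherwise have to spell out by hand.
int lookup(const std::map<std::string, int>& m, const std::string& key) {
    auto it = m.find(key);
    return it == m.end() ? -1 : it->second;
}
```

The compiler still type-checks everything; `auto` only changes how much of the type you have to type.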
     
  3. Nov 17, 2015 #2

    jedishrfu

    Staff: Mentor

I've not experienced such an issue, and I've been programming for literally decades. Computers are very literal, especially if you deal with assembler language. Higher-level languages provide ease-of-use features and organizational features that allow programmers to make larger, more complex applications. When I started out, the primary languages were Fortran/Cobol, macro assembler, and job control language, and a bit later timesharing Basic. We chose the language based on the task to be done and sometimes based on the efficiency needed. As an example, complex math applications would naturally use Fortran over Cobol or assembler, and business applications such as payroll management would prefer Cobol. Assembler was chosen when extreme efficiency was needed or when low-level operations, such as reading a poorly formatted billing tape, were needed.

    From there, folks developed structured programming and then later Object Oriented programming all in an attempt to make programming more organized. Batch runs were replaced by timesharing, Job Control language (JCL) was replaced by script languages and other languages evolved into what you see today.

    The only dissonance I see is that the bar has been raised for programming skills where some folks pick it up real fast and others languish. Basic was very easy to pick up whereas Java is decidedly more difficult. However, tools like Processing make it a lot easier to learn Java.
     
  4. Nov 17, 2015 #3
    That the use of auto in C++ indicates laziness is simply a misconception. Please read:

    http://herbsutter.com/2013/06/07/gotw-92-solution-auto-variables-part-1/
    http://herbsutter.com/2013/06/13/gotw-93-solution-auto-variables-part-2/
    http://herbsutter.com/2013/08/12/gotw-94-solution-aaa-style-almost-always-auto/

    It's hard to tell what your problem with anonymous functions is. Are you saying you would rather write a normal function or functor than a lambda even in the simplest scenarios with std::find_if? Why?
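To make the comparison concrete, here is the same `std::find_if` call written both ways (the helper names `LongerThan` and `first_longer_*` are illustrative, not from any real codebase):

```cpp
#include <algorithm>
#include <cstddef>
#include <string>
#include <vector>

// The traditional route: a named functor, defined away from its call site.
struct LongerThan {
    std::size_t n;
    bool operator()(const std::string& s) const { return s.size() > n; }
};

// Both functions find the first string longer than n characters and
// return it (or "" if none); only the predicate's spelling differs.
std::string first_longer_functor(const std::vector<std::string>& v, std::size_t n) {
    auto it = std::find_if(v.begin(), v.end(), LongerThan{n});
    return it == v.end() ? "" : *it;
}

std::string first_longer_lambda(const std::vector<std::string>& v, std::size_t n) {
    auto it = std::find_if(v.begin(), v.end(),
                           [n](const std::string& s) { return s.size() > n; });
    return it == v.end() ? "" : *it;
}
```

The lambda keeps the predicate at its single point of use; there is no separate helper to name, find, or keep in sync.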
     
  5. Nov 17, 2015 #4

    Mark44

    Staff: Mentor

    That's lambda, named after the Greek letter ##\lambda## (lambda).
    Lambda functions came out of the LISP programming language (I believe) and are now featured in C#, Python, and a number of other languages. One of the primary uses of anonymous functions, which lambda functions are, is in callbacks that are to be executed following some action. All the programmer cares about is that the action be performed in the proper order, such as an action that is to be performed following a button being pressed in a GUI. This blog post (https://pythonconquerstheuniverse.wordpress.com/2011/08/29/lambda_tutorial/) is specific to Python, but the point he makes is relevant to many other programming languages.
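In C++ terms, the callback pattern might look like this (the `Button` type is a toy stand-in, not any real GUI toolkit's API):

```cpp
#include <functional>
#include <string>
#include <vector>

// A toy "button": stores whatever callable it is given and invokes it
// when clicked. Real GUI toolkits work on the same principle.
struct Button {
    std::function<void()> on_click;
    void click() { if (on_click) on_click(); }
};

// Wire up an anonymous function as the click handler and count events.
std::size_t clicks_after(int n) {
    std::vector<std::string> log;
    Button b;
    b.on_click = [&log] { log.push_back("pressed"); };  // lambda as callback
    for (int i = 0; i < n; ++i) b.click();
    return log.size();
}
```

The handler never needs a name of its own; it only needs to run when the event fires.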
     
    Last edited: Nov 17, 2015
  6. Nov 17, 2015 #5

    rcgldr

    User Avatar
    Homework Helper

Try learning APL (A Programming Language), with its bazillion operators, almost all of them single Greek letters. It was created back in the 1960s and is still considered to be very high level.

    http://en.wikipedia.org/wiki/APL_(programming_language)

     
  7. Nov 17, 2015 #6

    DrClaude

    User Avatar

    Staff: Mentor

    Mind ← blown
     
  8. Nov 17, 2015 #7

    jedishrfu

    Staff: Mentor

    APL is quite cool. I don't recall it having that many operators. I do remember IBM using the operators in its mainframe assembler manuals to describe the function of an opcode.

I also remember a humorous open session at a conference where Prof Iverson displayed his APL language, and one of the GE engineers present asked an innocent question about his keyboard not having all those arcane symbols used in APL, which prevented him from programming in the language.

Iverson replied: "You expect ME to design a language for YOUR keyboard?" The engineer slunk back into his seat, embarrassed by the laughter of the audience.

    In truth, many programmers felt the same way (but were too chicken to ask) as APL was only available on IBM systems configured to use the 3270 terminal with APL keycaps. GE had a variant but you had to type the names of the operators as in rho for ##\rho##.

One of the programs he demonstrated was a prime number sieve in 23 keystrokes that could print out all primes under 1,000,000. The catch was that APL internally allocated a 1,000,000 by 1,000,000 array, row-reduced it to a 1,000,000-element vector, and then further reduced that to a vector of the prime numbers. The 1,000,000-element vector held the count of factors each number had, with primes having only two: one and themselves.
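The factor-counting idea can be sketched in C++ (this captures the spirit of the demo, not Iverson's actual one-liner, and it sidesteps the million-by-million intermediate by marking multiples directly):

```cpp
#include <vector>

// Count the divisors of every number up to n -- the quantity the APL
// demo computed via an outer product. Primes are exactly the numbers
// with two divisors: one and themselves.
std::vector<int> primes_up_to(int n) {
    std::vector<int> divisors(n + 1, 0);
    for (int d = 1; d <= n; ++d)
        for (int m = d; m <= n; m += d)
            ++divisors[m];            // d divides m
    std::vector<int> primes;
    for (int k = 2; k <= n; ++k)
        if (divisors[k] == 2) primes.push_back(k);
    return primes;
}
```

APL expresses the same computation in a handful of symbols, at the cost of the huge intermediate arrays described above.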

From that program you could see that the terseness of the language meant you could use up a lot of memory resources for intermediate results. It also illustrated the "write-only" reputation of the language.

    I think the website rosettacode.org has some further examples of APL programs for various tasks.
     
  9. Nov 17, 2015 #8

    D H

    User Avatar
    Staff Emeritus
    Science Advisor

You need to get used to this. Programming languages change over time. They go in and out of favor. Some languages just die while new languages pop up. I've tried to follow the advice set forth in "The Pragmatic Programmer", which is to learn a new programming language every year. (This was word-of-mouth advice long before that book came out.) This forces you to be mentally agile and flexible, and to keep current.

    With regard to older languages such as C, C++, C#, Fortran, you can always keep programming in a style born in (and best relegated to) the previous millennium. The problem with this is that you'll soon find yourself outpaced by others who embrace a more modern style. The updates to those languages since the end of the previous millennium play an important role in keeping those languages competitive with newer languages. To pick an extreme example, suppose Fortran was frozen at FORTRAN IV. "There's a big difference between mostly dead and all dead." Fortran would be all dead (as opposed to mostly dead) if the language hadn't been changed multiple times since the mid-1960s.

You specifically wrote about automatic type deduction and lambda expressions. Suppose you have a map that maps strings to a shared pointer of some class template. Which would you rather type (or read):
    Code (C):
    const std::unordered_map<std::string,std::shared_ptr<some_template<typeA, typeB>>>::iterator&
        found_item = map.find (key);
    versus
    Code (C):
    const auto& found_item = map.find (key);
    Automated type deduction frees your mind to get on with the work that needs to be done.



    In C++, lambda expressions can be implemented as an anonymous type that encapsulates the capture arguments of the lambda expression, has a constructor that captures those capture arguments, and a function call operator that takes the arguments to the lambda. You could write that anonymous class by hand (now no longer anonymous) that does all that. In doing so, your one line lambda has now ballooned to a dozen or more lines of code. The lambda represents an order of magnitude reduction in lines of code.
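Written out by hand, the class the compiler generates for a capturing lambda looks roughly like this (names simplified and illustrative):

```cpp
#include <algorithm>
#include <vector>

// What the compiler generates for [limit](int x) { return x < limit; },
// spelled out: a class capturing `limit`, with a call operator.
class LessThanLimit {
    int limit_;                       // the captured variable
public:
    explicit LessThanLimit(int limit) : limit_(limit) {}
    bool operator()(int x) const { return x < limit_; }
};

long count_below_functor(const std::vector<int>& v, int limit) {
    return std::count_if(v.begin(), v.end(), LessThanLimit(limit));
}

// The one-line equivalent.
long count_below_lambda(const std::vector<int>& v, int limit) {
    return std::count_if(v.begin(), v.end(),
                         [limit](int x) { return x < limit; });
}
```

Both do exactly the same work; the lambda just lets the compiler write the boilerplate.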

This is a huge win. Averaged out over time, the CMMI level 5 Space Shuttle flight software was written, tested, and maintained at the glacial pace of one line of code per person per day. A more modern metric for well-written and well-tested code is one line of code per person per hour. That still sounds slow, but it isn't: the code has to be tested (the test code SLOCs don't count as delivered lines of code), documented (the documentation and briefings don't count), peer-reviewed (the reviews don't count either), and delivered (the builds also don't count). All of those things do, however, represent hours of people time, and that translates to dollars. The lambda expression represents an hour's worth of work, or roughly $100 using a standard metric of $100 per person per hour. The corresponding functor class represents over a day's worth of work, or $1000. Multiply that over and over and you are getting into serious money.
     
  10. Nov 17, 2015 #9

    rcgldr

    User Avatar
    Homework Helper

There was a keyboard layout sheet that most used:

[image: aplkeyb.jpg -- APL keyboard layout sheet]

    For some keyboards, you can/could order a set of APL keycap replacements. Some current APL providers also sell APL keyboards for PC.

    For PC console window usage, there are/were programs to change the screen font to APL characters. This isn't an issue for GUI based APL interpreters.

APL has a feature similar to auto: variables are automatically type-converted by math operations. For example, an integer vector will get converted into floating point if a floating-point value (usually the result of some calculation) is stored in any member of the vector.
     
    Last edited: Nov 17, 2015
  11. Nov 17, 2015 #10

    jedishrfu

    Staff: Mentor

When I started to play with APL, there were no PCs, only terminals. Honeywell, in particular, had a poor version of APL that used an ASCII terminal, where you were limited to typing in the three-letter Greek name for the operator. We did have the keyboard diagram, but it didn't work well with a non-IBM terminal.

    Once the PC came out, APL became available there but it was still no match for the mainframe version which was faster and had more virtual memory to play with.

The modern-day successors to APL are probably MATLAB and Julia, which favor matrix operations as APL did.
     
    Last edited: Nov 18, 2015
  12. Dec 3, 2015 #11
OK, so what is the premier engineering programming language of today? I have written many engineering simulations in Fortran, and I have also used C and C++ to control embedded controllers, and even for some Windows graphics programs. From that experience, I don't think I would want to use C or C++ to code up an engineering simulation: graphics output, OK, but not the bulk of the number crunching needed in finite difference or finite element methods.
    Aule Mar
    An old engineer
     
  13. Dec 3, 2015 #12

    jedishrfu

    Staff: Mentor

    My guess is that most engineers would use MATLAB for any engineering simulations.

    Why?

    Because they are encouraged to buy the student edition when they are in college and then later have their company buy the full version once they start working.

After a while, though, the company may tire of MATLAB's cost and will look for alternatives like Octave, a numerical Python distribution, or more recently Julia. However, none of these has the ease of use of MATLAB.
     
  14. Dec 4, 2015 #13

    D H

    User Avatar
    Staff Emeritus
    Science Advisor

    I have to disagree. While that might have been the case a decade ago, now that we're well into the 21st century and the era of open source software, tools such as MATLAB that are massively expensive, ridiculously slow, and extremely out of date are best relegated to the 20th century from whence they came.

    Mathworks (the company that produces Matlab) has long marketed to people who are not in the know: managers, professors, and students. The problem with that is that nowadays, managers were once engineers or scientists who were once in the know, professors were once graduate students who were in the know, and students now know how to program before they enter college.


    There is no one language that does it all anymore, nor is there any one person who can do it all. Those days are also something best relegated to the 20th century from whence they came. Nowadays, you might have stuff running on a home-brewed supercomputer that uses a mix of CUDA, C++, C, and Fortran. That set of programs communicates results to a python-based script that runs on some other computer, which in turn communicates its results to one or more visualization or analysis programs.
     
  15. Dec 4, 2015 #14

    f95toli

    User Avatar
    Science Advisor
    Gold Member

I disagree. First of all, Matlab is reasonably fast if you know how to use it. The JIT compiler is actually quite good, most of the built-in functions are state of the art (the FFT is implemented using FFTW, etc.), and it is fast as long as you vectorise your code.
It also makes using e.g. CUDA very easy (we run a lot of our code on high-end gaming GPUs; we might buy a proper Tesla card), which can speed up your code quite a bit.

Yes, it is expensive, but it is no more expensive than other professional software (try buying a license for microwave simulation software; that is VERY expensive). It is also relative: we use Matlab to control and analyse data from a measurement system that is worth about £2M or so. Compared to that, the cost of a Matlab licence is insignificant, and using it makes us more efficient than using e.g. Labview and/or Python.
Don't get me wrong, I like and use Python; but it is still not nearly as powerful and user-friendly as Matlab for handling data without a LOT of tweaking.
     
  16. Dec 4, 2015 #15
Interesting comments. I guess I forgot to mention, I am a retired engineer, so MATLAB is not the way to go. I was a manager for the last 20 years of my career, so I have been out of the loop on modern languages. It looks like Fortran is still an acceptable language, using something like .NET to plot/draw out the results.

I'm just looking to do some interesting simulations on some crazy ideas I had over the years but never had the time to pursue. I'm also looking to teach my grandkids real programming.
     
  17. Dec 5, 2015 #16

    FactChecker

    User Avatar
    Science Advisor
    Gold Member

    MATLAB is well integrated with Simulink and can auto-generate C code. Unless other languages can keep up, they will not be adequate replacements.
     
  18. Dec 5, 2015 #17

    jedishrfu

    Staff: Mentor

I've been investigating Julia as an alternative. It has syntax similar to MATLAB but is orders of magnitude faster and integrates well with Python, C, R, and Fortran. The idea would be to replace production MATLAB code with it instead of doing a costly rewrite to C, etc., and win over the engineers later.
     
  19. Dec 5, 2015 #18

    FactChecker

    User Avatar
    Science Advisor
    Gold Member

    If Julia works as advertised, it is certainly the way to go. I don't have any practical experience with it and would be interested in your results.
     
  20. Dec 9, 2015 #19

    rcgldr

    User Avatar
    Homework Helper

At companies I've worked at in the past, some of the engineers used Visual Basic because of its drag-and-drop interface for designing a screen layout, including graphs and charts, where Visual Basic generates all the user interface code for the initial screen and can update its part of the code base when the interface is updated. Most of the hardware involved in these projects could communicate via a serial interface, which apparently was fairly easy to deal with in Visual Basic. I've only used VB a couple of times to help out with projects.

    As for me, it's mostly C and C++, with some assembly.
     
  21. Dec 10, 2015 #20
Is Julia a freeware package? Like I said, I'm retired and don't have an engineering budget.
     