Modern programming languages causing cognitive dissonance

  • Thread starter SlurrerOfSpeech
  • Tags
    Programming
  • #1
SlurrerOfSpeech
141
11
Anyone else experience extreme cognitive dissonance when learning modern programming languages? If so, have you visited a psychiatrist about it?

1) Automatic type deduction. The idea that I can be too lazy to figure out the exact type returned from a function, and put var or auto in its place, feels wrong, like committing a crime.

2) Lambas. I'm used to hours spent going, "Ok, great, I could use a function here, so I'll go to my namespace Helpers and implement and name the function inside there." There is something unnerving about the idea that I can write unnamed functions on-the-fly. There has to be a catch ...
 
  • #2
I've not experienced such an issue, and I've been programming literally for decades. Computers are very literal, especially if you deal with assembler language. Higher-level languages provide ease-of-use features and organizational features that allow programmers to build larger, more complex applications.

When I started out, the primary languages were Fortran/COBOL, macro assembler, and job control language, and a bit later timesharing BASIC. We chose the language based on the task to be done and sometimes based on the efficiency needed. As an example, complex math applications would naturally use Fortran over COBOL or assembler, and business applications such as payroll management would prefer COBOL. Assembler was chosen when extreme efficiency was needed or when low-level operations, such as reading a poorly formatted billing tape, were required.

From there, folks developed structured programming and then later object-oriented programming, all in an attempt to make programming more organized. Batch runs were replaced by timesharing, Job Control Language (JCL) was replaced by scripting languages, and other languages evolved into what you see today.

The only dissonance I see is that the bar has been raised for programming skills, where some folks pick it up real fast and others languish. BASIC was very easy to pick up, whereas Java is decidedly more difficult. However, tools like Processing make it a lot easier to learn Java.
 
  • Like
Likes artyb and BvU
  • #3
That the use of auto in C++ indicates laziness is simply a misconception. Please read:

http://herbsutter.com/2013/06/07/gotw-92-solution-auto-variables-part-1/
http://herbsutter.com/2013/06/13/gotw-93-solution-auto-variables-part-2/
http://herbsutter.com/2013/08/12/gotw-94-solution-aaa-style-almost-always-auto/

It's hard to tell what your problem with anonymous functions is. Are you saying you would rather write a normal function or functor than a lambda even in the simplest scenarios with std::find_if? Why?
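For a concrete (made-up) example of the simplest kind of scenario I mean, compare writing the predicate inline with defining a named functor somewhere away from the call site for a one-off use:
C++:
#include <algorithm>
#include <string>
#include <vector>

// Named-functor alternative: defined away from the call site,
// even though the predicate is used exactly once.
struct LongerThanThree {
    bool operator()(const std::string& s) const { return s.size() > 3; }
};

int main() {
    const std::vector<std::string> names{"alice", "bob", "carol"};

    // With a lambda, the predicate lives right where it is used:
    const auto it = std::find_if(names.begin(), names.end(),
                                 [](const std::string& s) { return s.size() > 3; });

    // Equivalent call using the named functor:
    const auto it2 = std::find_if(names.begin(), names.end(), LongerThanThree{});

    return (it == it2) ? 0 : 1;  // both find "alice"
}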
 
  • #4
SlurrerOfSpeech said:
Anyone else experience extreme cognitive dissonance when learning modern programming languages? If so, have you visited a psychiatrist about it?

1) Automatic type deduction. The idea that I can be too lazy to figure out the exact type returned from a function, and put var or auto in its place, feels wrong, like committing a crime.

2) Lambas. I'm used to hours spent going, "Ok, great, I could use a function here, so I'll go to my namespace Helpers and implement and name the function inside there." There is something unnerving about the idea that I can write unnamed functions on-the-fly. There has to be a catch ...
That's lambda, named after the Greek letter ##\lambda## (lambda).
Lambda functions came out of the LISP programming language (I believe) and are now featured in C#, Python, and a number of other languages. One of the primary uses of anonymous functions, which lambda functions are, is in callbacks that are executed after some action. All the programmer cares about is that the callback is performed at the proper point, such as an action to be performed after a button is pressed in a GUI. This blog post (https://pythonconquerstheuniverse.wordpress.com/2011/08/29/lambda_tutorial/) is specific to Python, but the point he makes is relevant to many other programming languages.
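To make the callback idea concrete, here is a minimal C++ sketch (the Button type and its on_click slot are invented purely for illustration):
C++:
#include <functional>
#include <iostream>

// A toy "button" that stores a callback and invokes it when pressed.
struct Button {
    std::function<void()> on_click;           // the registered callback, if any
    void press() { if (on_click) on_click(); }
};

int main() {
    Button button;
    // The lambda is registered anonymously; the programmer only cares
    // that it runs after the button is pressed.
    button.on_click = [] { std::cout << "Button pressed!\n"; };
    button.press();                           // prints: Button pressed!
}
The same pattern appears in real GUI toolkits and in Python, C#, and JavaScript event handlers; only the registration syntax differs.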
 
Last edited:
  • #5
  • Like
Likes DrClaude
  • #6
rcgldr said:

Mind ← blown
 
  • #7
APL is quite cool. I don't recall it having that many operators. I do remember IBM using the operators in its mainframe assembler manuals to describe the function of an opcode.

I also remember a humorous open session at a conference where Prof. Iverson displayed his APL language, and one of the GE engineers present asked an innocent question about his keyboard not having all those arcane symbols used in APL, which prevented him from programming in the language.

Iverson replied, "You expect ME to design a language for YOUR keyboard?" while the engineer slunk back into his seat, embarrassed by the laughter of the audience.

In truth, many programmers felt the same way (but were too chicken to ask), as APL was only available on IBM systems configured to use the 3270 terminal with APL keycaps. GE had a variant, but you had to type the names of the operators, as in rho for ##\rho##.

One of the programs he demonstrated was a prime number sieve in 23 keystrokes that could print out all primes under 1,000,000. The catch was that APL internally allocated a 1,000,000 by 1,000,000 array, reduced it along one axis to a 1,000,000-element vector, and then further reduced that to a vector of the prime numbers. The 1,000,000-element vector held the count of factors for each number, with primes having only two: one and themselves.
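Spelled out longhand in C++, the underlying idea is roughly the following (a sketch with a much smaller N, since a true 1,000,000 by 1,000,000 intermediate table would never fit in memory):
C++:
#include <iostream>
#include <vector>

int main() {
    const int N = 100;  // far smaller than the 1,000,000 in the demo

    // Count the divisors of each number from 1 to N; conceptually this
    // is the reduction of APL's N-by-N divisibility table to a vector.
    std::vector<int> divisor_count(N + 1, 0);
    for (int d = 1; d <= N; ++d)
        for (int n = d; n <= N; n += d)
            ++divisor_count[n];

    // Primes are exactly the numbers with two divisors: one and themselves.
    for (int n = 2; n <= N; ++n)
        if (divisor_count[n] == 2) std::cout << n << ' ';
    std::cout << '\n';
}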

From that program you could see that the terseness of the language meant you could use up a lot of memory resources for intermediate results. It also illustrated the "write-only" reputation of the language.

I think the website rosettacode.org has some further examples of APL programs for various tasks.
 
  • #8
SlurrerOfSpeech said:
Anyone else experience extreme cognitive dissonance when learning modern programming languages? If so, have you visited a psychiatrist about it?
You need to get used to this. Programming languages change over time. They go in and out of favor. Some languages just die while new languages pop up. I've tried to follow the advice set forth in "The Pragmatic Programmer", which is to learn a new programming language every year. (This was word-of-mouth advice long before that book came out.) This forces you to be mentally agile and flexible, and to keep current.

With regard to older languages such as C, C++, C#, and Fortran, you can always keep programming in a style born in (and best relegated to) the previous millennium. The problem with this is that you'll soon find yourself outpaced by others who embrace a more modern style. The updates to those languages since the end of the previous millennium play an important role in keeping them competitive with newer languages. To pick an extreme example, suppose Fortran had been frozen at FORTRAN IV. "There's a big difference between mostly dead and all dead." Fortran would be all dead (as opposed to mostly dead) if the language hadn't been revised multiple times since the mid-1960s.

You specifically wrote about automatic type deduction and lambda expressions. Suppose you have a map that maps strings to shared pointers of some class template. Which would you rather type (or read):
C++:
const std::unordered_map<std::string, std::shared_ptr<some_template<typeA, typeB>>>::iterator&
    found_item = map.find(key);
versus
C++:
const auto& found_item = map.find(key);
Automated type deduction frees your mind to get on with the work that needs to be done.
In C++, a lambda expression is implemented as an anonymous class that encapsulates the captured variables, has a constructor that stores those captures, and has a function call operator that takes the lambda's arguments. You could write that class by hand (now no longer anonymous) to do all of that, but in doing so your one-line lambda balloons to a dozen or more lines of code. The lambda represents an order-of-magnitude reduction in lines of code.
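As a rough sketch of that equivalence (the names here are invented; the actual compiler-generated class is anonymous and implementation-specific):
C++:
#include <iostream>

// Hand-written equivalent of:  auto below = [limit](int x) { return x < limit; };
class BelowLimit {
    int limit;                                           // storage for the captured variable
public:
    explicit BelowLimit(int limit) : limit(limit) {}     // the constructor "captures"
    bool operator()(int x) const { return x < limit; }   // the function call operator
};

int main() {
    const int limit = 10;
    auto below = [limit](int x) { return x < limit; };   // one line...
    BelowLimit below2(limit);                            // ...versus the class above
    std::cout << below(3) << ' ' << below2(3) << '\n';   // prints: 1 1
}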

This is a huge win. Averaged out over time, the CMMI level 5 Space Shuttle flight software was written, tested, and maintained at the glacial pace of one line of code per person per day. A more modern metric for well-written and well-tested code is one line of code per person per hour. (That still sounds slow, but it isn't. That code has to be tested (the test code SLOCs don't count as delivered lines of code), documented (the documentation and briefings don't count), peer-reviewed (the reviews don't count either), and delivered (the builds don't count either).) All of those things do, however, represent hours of people time, and that translates to dollars. The lambda expression represents an hour's worth of work, or roughly $100 using a standard metric of $100 per person per hour. The corresponding functor class represents over a day's worth of work, or $1000. Multiply that over and over, and you are getting into serious money.
 
  • Like
Likes SlurrerOfSpeech
  • #9
jedishrfu said:
APL keyboard.
There was a keyboard layout sheet that most used:

[Image: APL keyboard layout sheet (aplkeyb.jpg)]

For some keyboards, you can/could order a set of APL keycap replacements. Some current APL providers also sell APL keyboards for PC.

For PC console window usage, there are/were programs to change the screen font to APL characters. This isn't an issue for GUI-based APL interpreters.

APL has a feature similar to auto: variables are automatically type-converted by math operations. For example, an integer vector will get converted to floating point if a floating-point value (usually the result of some calculation) is stored in any member of the vector.
 
Last edited:
  • #10
When I started to play with APL, there were no PCs, only terminals. Honeywell, in particular, had a poor version of APL that used an ASCII terminal, where you were limited to typing the three-letter Greek name for each operator. We did have the keyboard diagram, but it didn't work well with a non-IBM terminal.

Once the PC came out, APL became available there, but it was still no match for the mainframe version, which was faster and had more virtual memory to play with.

The modern-day successors to APL are probably MATLAB and Julia, which favor matrix operations as APL did.
 
Last edited:
  • #11
OK, so what is the premier engineering programming language of today? I have written many engineering simulations in Fortran, and I have also used C and C++ to control embedded controllers and even write some Windows graphics programs. From that experience, I don't think I would want to use C or C++ to code up an engineering simulation: graphics output, OK, but not the bulk of the number crunching needed in finite difference or finite element methods.
Aule Mar
An old engineer
 
  • #12
My guess is that most engineers would use MATLAB for any engineering simulations.

Why?

Because they are encouraged to buy the student edition when they are in college and then later have their company buy the full version once they start working.

After a while, though, the company may tire of MATLAB's cost and will look for alternatives like Octave, a numerical Python distribution, or, more recently, Julia. However, none of these has the ease of use of MATLAB.
 
  • #13
jedishrfu said:
My guess is that most engineers would use MATLAB for any engineering simulations.

I have to disagree. While that might have been the case a decade ago, now that we're well into the 21st century and the era of open source software, tools such as MATLAB that are massively expensive, ridiculously slow, and extremely out of date are best relegated to the 20th century from whence they came.

MathWorks (the company that produces MATLAB) has long marketed to people who are not in the know: managers, professors, and students. The problem with that is that nowadays managers were once engineers or scientists who were in the know, professors were once graduate students who were in the know, and students now know how to program before they enter college.
AuleMar said:
OK, so what is the premier engineering programming language of today?
There is no one language that does it all anymore, nor is there any one person who can do it all. Those days are also best relegated to the 20th century from whence they came. Nowadays, you might have stuff running on a home-brewed supercomputer that uses a mix of CUDA, C++, C, and Fortran. That set of programs communicates results to a Python-based script that runs on some other computer, which in turn communicates its results to one or more visualization or analysis programs.
 
  • Like
Likes jedishrfu
  • #14
D H said:
I have to disagree. While that might have been the case a decade ago, now that we're well into the 21st century and the era of open source software, tools such as MATLAB that are massively expensive, ridiculously slow, and extremely out of date are best relegated to the 20th century from whence they came.

I disagree. First of all, MATLAB is reasonably fast if you know how to use it. The JIT compiler is actually quite good, most of the built-in functions are state of the art (the FFT is implemented using FFTW, etc.), and they are fast as long as you vectorise your code.
It also makes using e.g. CUDA very easy (we run a lot of our code on high-end gaming GPUs; we might buy a proper Tesla card), which can speed up your code quite a bit.

Yes, it is expensive, but no more expensive than other professional software (try buying a license for microwave simulation software: that is VERY expensive). It is also relative: we use MATLAB to control and analyse data from a measurement system that is worth about £2M or so. Compared to that, the cost of a MATLAB license is insignificant, and using it makes us more efficient than using e.g. LabVIEW and/or Python.
Don't get me wrong, I like and use Python, but it is still not nearly as powerful and user-friendly as MATLAB for handling data without a LOT of tweaking.
 
  • Like
Likes jedishrfu
  • #15
Interesting comments. I guess I forgot to mention that I am a retired engineer, so MATLAB is not the way to go. I was a manager for the last 20 years of my career, so I have been out of the loop on modern languages. It looks like Fortran is still an acceptable language, using something like .NET to plot/draw out the results.

I'm just looking to do some interesting simulations of some crazy ideas I had over the years but never had the time to pursue. I'm also looking to teach my grandkids real programming.
 
  • #16
jedishrfu said:
My guess is that most engineers would use MATLAB for any engineering simulations.

Why?

Because they are encouraged to buy the student edition when they are in college and then later have their company buy the full version once they start working.

After a while, though, the company may tire of MATLAB's cost and will look for alternatives like Octave, a numerical Python distribution, or, more recently, Julia. However, none of these has the ease of use of MATLAB.
MATLAB is well integrated with Simulink and can auto-generate C code. Unless other languages can keep up, they will not be adequate replacements.
 
  • Like
Likes jedishrfu
  • #17
I've been investigating Julia as an alternative. It has similar syntax to MATLAB but is orders of magnitude faster and integrates well with Python, C, R, and Fortran. The idea would be to replace production MATLAB code with it instead of doing a costly rewrite to C, etc., and win over the engineers later.
 
  • Like
Likes FactChecker
  • #18
jedishrfu said:
I've been investigating Julia as an alternative. It has similar syntax to MATLAB but is orders of magnitude faster and integrates well with Python, C, R, and Fortran. The idea would be to replace production MATLAB code with it instead of doing a costly rewrite to C, etc., and win over the engineers later.
If Julia works as advertised, it is certainly the way to go. I don't have any practical experience with it and would be interested in your results.
 
  • Like
Likes jedishrfu
  • #19
At companies I've worked at in the past, some of the engineers used Visual Basic because of its drag-and-drop interface for designing a screen layout, including graphs and charts; Visual Basic generates all the user interface code for the initial screen and can update its part of a code base when the interface is updated. Most of the hardware involved in these projects could communicate via a serial interface, which apparently was fairly easy to deal with in Visual Basic. I've only used VB a couple of times to help out with projects.

As for me, it's mostly C and C++, with some assembly.
 
  • #20
Is Julia a freeware package? Like I said, I'm retired and don't have an engineering budget.
 
  • #21
Yes; you can read more about it at julialang.org. It doesn't come with an IDE the way MATLAB does, but there are some alternatives, like the Juno editor and the IPython notebook, that make it fun to program.

For further details, you should consider opening a separate thread to discuss Julia.

Since this thread has run its course, it will now be closed. Thank you all for your contributions.
 

1. What is cognitive dissonance in relation to modern programming languages?

Cognitive dissonance refers to the mental discomfort or conflict that arises when there is a discrepancy between one's beliefs and one's actions. In the context of modern programming languages, it refers to the confusion or frustration experienced by programmers when learning and working with new languages that differ significantly from what they are used to.

2. Why do modern programming languages cause cognitive dissonance?

Modern programming languages often introduce new concepts, syntax, and paradigms that differ from traditional programming languages. This can cause cognitive dissonance as programmers need to unlearn old habits and adopt new ways of thinking and problem-solving.

3. How can cognitive dissonance affect a programmer's productivity?

Cognitive dissonance can hinder a programmer's productivity as it can lead to confusion, inefficiency, and mistakes. When a programmer is struggling to understand a new programming language, they may spend more time trying to figure things out rather than actually writing code.

4. What can be done to reduce cognitive dissonance when learning a new programming language?

To reduce cognitive dissonance when learning a new programming language, it is important to have a positive attitude and approach. It can also be helpful to break down the learning process into smaller steps, practice regularly, and seek guidance from experienced programmers or online resources.

5. Is cognitive dissonance a common experience among programmers when learning new languages?

Yes, cognitive dissonance is a common experience among programmers when learning new languages. As the field of programming continues to evolve and new languages are introduced, it is natural for programmers to experience some level of cognitive dissonance. However, with practice and persistence, it can be overcome.
