Revolutionary Changes in Programming: Past, Present and Future

AI Thread Summary
The discussion highlights the evolution of programming, noting that object-oriented programming was a significant revolution, with recent advancements like smart pointers in C++ and frequent updates in Python. Participants question whether these updates introduce substantial new functionalities or are merely maintenance improvements. Innovations such as Google TensorFlow for machine learning and Elm's time-traveling debugger are mentioned as noteworthy developments. The conversation also touches on the advantages of using JVM languages and the Processing IDE for interactive Java graphics. Overall, while some new languages and tools show promise, the search for a revolutionary change akin to object-oriented programming continues.
Hercuflea
I'm just a casual/academic programmer. The last major revolution in programming (that I know of) was object-oriented programming. But some of the latest developments (in C++) that can actually be used by everyday programmers are things like smart pointers, which were standardized in C++11 (with std::make_unique only arriving in C++14). Python gets updated to a new version every few months. Are these updates bringing new ideas and functionality that programmers can use, or are they mostly just housekeeping? Are there any big "revolutions" in the works with the same kind of magnitude as the change to object-oriented programming?
 
Another example: Google TensorFlow. It was recently released with a Python API. It is meant for machine learning, but it can actually be used to solve partial differential equations numerically in parallel.
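To make the PDE remark concrete, here is a minimal sketch (in plain Python, with no TensorFlow dependency) of the kind of stencil computation involved: one explicit finite-difference step for the 1D heat equation. In TensorFlow this loop would be expressed as tensor operations so the framework can run it in parallel on a GPU; the step sizes and coefficients below are just illustrative choices.

```python
# One explicit Euler step for the 1D heat equation u_t = alpha * u_xx.
# TensorFlow can express this stencil as tensor ops and parallelize it;
# this pure-Python version just shows the underlying arithmetic.

def heat_step(u, alpha=0.1, dx=1.0, dt=0.1):
    """Advance the interior points of u by one time step (fixed boundaries)."""
    new_u = u[:]  # copy; boundary values are held fixed
    for i in range(1, len(u) - 1):
        new_u[i] = u[i] + alpha * dt / dx**2 * (u[i-1] - 2*u[i] + u[i+1])
    return new_u

# A spike of heat in the middle of a cold rod diffuses outward.
u = [0.0, 0.0, 1.0, 0.0, 0.0]
u = heat_step(u)
```

Repeating the step in a loop marches the solution forward in time; with TensorFlow the same update becomes a handful of vectorized tensor operations.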
 
Elm is an interesting programming language which is primarily functional. It has a time-traveling debugger, which is quite cool. There are several videos on YouTube by the creator, Evan Czaplicki.

https://en.m.wikipedia.org/wiki/Elm_(programming_language)
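The time-traveling debugger works because all state changes in Elm flow through a pure update function, so replaying a recorded message log reproduces any past state exactly. Here is a conceptual sketch of that idea in Python (the counter, message names, and `replay` helper are invented for illustration, not Elm's actual API):

```python
# Conceptual sketch of Elm's architecture: a pure update(msg, model)
# function means a recorded message log fully determines every state.

def update(msg, model):
    """A pure update function in the Elm style: a counter with two messages."""
    if msg == "increment":
        return model + 1
    if msg == "decrement":
        return model - 1
    return model

def replay(messages, initial=0):
    """Recompute the model as it was after each recorded message."""
    states, model = [initial], initial
    for msg in messages:
        model = update(msg, model)
        states.append(model)
    return states

log = ["increment", "increment", "decrement"]
history = replay(log)  # "travel" to any index to inspect a past state
```

Because `update` has no side effects, stepping backward is just indexing into `history` — that determinism is what makes the debugger possible.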

There's Node.js, where you can develop the server side of web apps in JavaScript, so your web app can use one language for both client and server, which makes sharing data via the JSON format seamless.
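The "seamless sharing" point is that JSON is the common wire format between client and server. Sketched here with Python's stdlib for illustration; in a Node.js app both the browser and the server would do the same thing with `JSON.stringify` / `JSON.parse` on native JavaScript objects:

```python
# JSON round trip: the structure a client sends is exactly what the
# server reconstructs, with no translation layer between languages.
import json

message = {"user": "alice", "scores": [10, 20, 30], "active": True}
wire = json.dumps(message)    # what the client would put on the wire
received = json.loads(wire)   # what the server would reconstruct
assert received == message    # structure survives the round trip
```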

There are JVM languages like Groovy, Scala, Clojure, and Jython that interoperate with Java and can use its many third-party libraries. Groovy is like a super Java, Scala fixes many of Java's architectural issues, and Clojure and Jython are essentially a Lisp and a Python that can work with Java libraries.

There's the Processing IDE, which makes writing Java fun. It has a lot of cool interactive Java graphics examples, and it's also great for prototyping ideas. See processing.org for more info.

There's Julia from MIT, which seems on track to be a better MATLAB, with type annotations to speed up calculations. It also takes a page from OO through multiple dispatch: you can define different methods of the same function for different argument types, i.e. the function does one thing when an argument is an integer and another thing when it's a vector. See julialang.org.
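A rough Python analogue of that dispatch-on-type behavior uses `functools.singledispatch` (the function name `double` and its methods are invented for illustration). Note that Julia's multiple dispatch is more general: it selects a method based on the types of all arguments, whereas `singledispatch` only looks at the first.

```python
# Dispatch-on-type, Julia style: the same function name does different
# things depending on the argument's type.
from functools import singledispatch

@singledispatch
def double(x):
    raise TypeError(f"unsupported type: {type(x).__name__}")

@double.register
def _(x: int):
    return 2 * x                  # scalar case

@double.register
def _(x: list):
    return [2 * v for v in x]     # elementwise for a vector

double(3)       # takes the scalar path
double([1, 2])  # takes the vector path
```

In Julia you would instead write `double(x::Int)` and `double(x::Vector)` as two methods of one function, and the compiler picks the right one at the call site.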
 