What Could Be the Next Evolution in Programming After OOP?

  • Thread starter: Greg Bernhardt
  • Tags: Method, Programming
Summary
The discussion centers on the future evolution of programming beyond Object-Oriented Programming (OOP), with suggestions like Table Oriented Programming (TOP) and a need for standardized methods of program inheritance. Participants emphasize the importance of user-friendly interfaces that allow non-experts to easily integrate features from different applications, akin to a drag-and-drop functionality. There is a call for a more intuitive programming approach that minimizes the complexity of integrating disparate codebases, making programming accessible to everyday users. The conversation also touches on the potential for fuzzy logic and AI-assisted programming to simplify coding tasks. Overall, the consensus is that the next major leap in programming will focus on enhancing usability and standardization.
  • #31
I haven’t seen a stance which starts from a low level perspective.

The main corporate problems revolve around code maintenance and debugging, leaving very little impetus for a language change.

The compiler allows abstraction of the computer's hardware: instruction sets, cache alignment, memory bus contentions, and the interface to every possible piece of hardware. Yes, the ultimate abstraction is to be able to use natural language to tell the computer what to do and have it done… eventually.

I believe that is neither realistic nor a desirable goal. Even if it is possible, it really just sidesteps the issue: what do you write the compiler in?

It is not the language which needs the greatest change but the development environment.
Compilers presently communicate very little to the user. For debugging, one needs an abstraction that eases conceptualization of what the software is accomplishing. For optimization, one needs the code shown as the computer will see it. A tremendous amount of screen real estate and idle processing power is currently available to handle these tasks, yet it remains unused.

To selfAdjoint:
These languages can handle parallelism; it is up to the compiler to implement independent threads for each parallel operation, with memory access handled on the hardware side. The main problem is that many algorithms cannot be improved by parallel processing: the output of one stage is required for the next.
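The distinction above can be illustrated with a minimal sketch (all function names here are illustrative): a loop whose iterations are independent can be farmed out to a thread pool, while a recurrence whose each step consumes the previous step's output cannot be shortened by adding processors.

```python
from concurrent.futures import ThreadPoolExecutor

def parallel_friendly(xs):
    # Each element is independent of the others, so the work
    # can be split freely across threads/cores.
    with ThreadPoolExecutor() as pool:
        return list(pool.map(lambda x: x * x, xs))

def inherently_serial(x0, n):
    # Each step needs the previous step's output, so no amount
    # of extra processors shortens the dependency chain.
    x = x0
    history = [x]
    for _ in range(n):
        x = 3 * x + 1          # toy recurrence: x_{k+1} = 3*x_k + 1
        history.append(x)
    return history

print(parallel_friendly([1, 2, 3]))   # [1, 4, 9]
print(inherently_serial(1, 3))        # [1, 4, 13, 40]
```

The second function is the shape of the problem the post describes: the data dependency, not the language, is what blocks parallel speedup.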

To StarkyDe:
Despite biological self-replication's ability to produce enormous parallelism, its chemical thermodynamics require equilibrium stages, and each stage will therefore be much slower than its solid-state equivalent. (See above.)

To dduardo:
Custom macros or procedures in libraries are useful abstractions, although they may interfere with understanding. Many languages are defined by their standardized libraries, but there will always be a need for new task-specific tools. The problem which causes such chaotic use is that everyone holds copyright on their libraries, preventing anyone else from using the same code unless authorized. Even General Public License code will not be used by your employer if they are trying to sell their product.
 
  • #32
It's not "what is going to be the next programming language or method" but what is going to be the standard software engineering process as opposed to just hacking. The trend is toward graphic representation of software and standardized components. Just look at the electronics industry with electronic schematics and standardized components! Imagine software written by use of schematics and standard components, how much easier it would be to document, design and maintain.

A picture speaks a thousand words, and despite what many believe about vocal interfaces being the natural man-machine interface, they're really not. Sound is a serialized process in our brains and doesn't exploit our parallel processing capability as visualization does. So the real bottleneck is inputting information into the computer from human beings. There are military and even some consumer products that can use eye movement or even actual brain-tissue interfaces. So we have good graphic interfaces; we need better input interfaces to become more productive with machines.

The future is software schematics utilizing OOP and off-the-shelf components. A good source for learning more is the OMG (Object Management Group), which promotes UML.
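The schematic analogy above can be sketched in a few lines (all class and function names are hypothetical): standard components expose a common interface, and a "program" is just a wiring diagram that chains off-the-shelf parts together.

```python
# Schematic-style composition sketch: components behave like standard
# electronic parts, and wire() builds the circuit from the diagram.
class Component:
    def process(self, signal):
        raise NotImplementedError

class Amplifier(Component):
    def __init__(self, gain):
        self.gain = gain
    def process(self, signal):
        return [x * self.gain for x in signal]

class Clipper(Component):
    def __init__(self, limit):
        self.limit = limit
    def process(self, signal):
        return [max(-self.limit, min(self.limit, x)) for x in signal]

def wire(*components):
    # The "schematic": a left-to-right chain of standard parts.
    def circuit(signal):
        for c in components:
            signal = c.process(signal)
        return signal
    return circuit

circuit = wire(Amplifier(gain=3), Clipper(limit=5))
print(circuit([1, 2, -4]))   # [3, 5, -5]
```

The point is that documenting or modifying such a program means editing the wiring list, not the internals of each part, which is the maintainability win the post is arguing for.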
 
  • #33
Many posters have written here about standardization, and self-adjoint makes an excellent point about multiprocessing.

You can debate whether greater standardization is desirable because it makes programmers' jobs easier: whether it stimulates new development by letting them concentrate on end results without being distracted so much by details, or stifles it by locking them into "old" ways of thinking.

And it will be interesting to see to what extent programming will embrace multiprocessing, and, conversely, to what extent we will simply rely on the next generation of (single) processors to relieve us of having to make that extra effort.

Much can be said about these topics, but neither is a "method of programming" in the same sense that procedural, event-driven, and object-oriented programming are. Procedural languages can be standardized or non-standardized. Procedural programs can be designed to take advantage of multiprocessing, or not. The same can be said of event-driven or object-oriented programs.

What do y'all think of declarative programming

http://en.wikipedia.org/wiki/Declarative_programming

as a candidate for the "next" programming paradigm?
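For readers unfamiliar with the paradigm, the most widely deployed declarative language is arguably SQL, which ships with Python's standard library via `sqlite3`. A minimal sketch: you state *what* result you want, and the engine decides *how* to compute it.

```python
import sqlite3

# An in-memory database with a small example table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer TEXT, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [("alice", 30.0), ("bob", 20.0), ("alice", 50.0)])

# Declarative: no loops, no accumulators. The query planner
# chooses the execution strategy.
totals = conn.execute(
    "SELECT customer, SUM(amount) FROM orders "
    "GROUP BY customer ORDER BY customer").fetchall()
print(totals)   # [('alice', 80.0), ('bob', 20.0)]
```

Contrast this with the imperative equivalent, where you would write the grouping loop and the running sums yourself.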
 
  • #34
I was just reading a short tutorial on declarative programming last week, via Slashdot. It looks like it has some good features, including the ever-marketed modular capability. (You wouldn't want to know how far back I go with that idea. It's always the big selling point of any new method. Reusable code! Yeah!)

I always ask: how does it go with databases, with massaging and updating huge amounts of stored data? IIRC, that's what hobbled OOP.
 
  • #35
Speaking as a novice: is OOP "hobbled"?

Do you still have a link to that declarative programming tutorial? What little I've seen about it seems intriguing; I'd like to read more.
 
  • #36
rick1138 said:
There will never be a standardized language, by virtue of Gödel's incompleteness theorem
Don't you mean Murphy's Law?
 
  • #37
Ease of Programming

dduardo said:
Event-driven systems are a way to collect input in an OO environment. An event triggers an object to run, which then triggers other events, creating a dynamic system.

This methodology of thinking is not going to change. The next major leap in programming will be the way we enter programs into the computer.

What plagues programmers today is the constant reinvention of the wheel. Sure, there are functions, allowing repetitive tasks to be called from any point in the program, but that's not enough. We need to STANDARDIZE a method for program-to-program inheritance. It will be like the click-and-drag method of creating interfaces in VB, but more sophisticated.

Imagine being able to take the chat (function) from XYZ's instant messenger and dragging it to ABC's first person shooter.

With this type of capability, programming will not be just for the experts, but for the average Joe who wants to be productive.

You do realize that this will probably put a lot of programmers out of jobs. The whole aura of being a programmer will be lost, as well. It's one of those hard things that people take pride in knowing.

However, somebody will still have to mastermind that "chat function [in] XYZ" to allow it to be ported to various other projects regardless of their final intentions - a form of abstract coding, if you will.

Then again, imagine how flaws and exploits would spread. Suppose your average Joe copies an exploitable chat function from XYZ: now a billion programs carrying the same copied code will be exploitable as well. This would be a tremendous blunder!
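The event-driven pattern described in the quote can be sketched in a few lines (all names here are illustrative, not any real framework's API): handlers register interest in an event, and firing one event may itself fire others, producing the dynamic chain dduardo describes.

```python
# Minimal event-dispatch sketch: a registry mapping event names
# to lists of handler callbacks.
handlers = {}

def on(event, fn):
    handlers.setdefault(event, []).append(fn)

def emit(event, *args):
    for fn in handlers.get(event, []):
        fn(*args)

log = []
on("click", lambda: log.append("button clicked"))
on("click", lambda: emit("sound"))      # one event triggering another
on("sound", lambda: log.append("beep"))

emit("click")
print(log)   # ['button clicked', 'beep']
```

A reusable "chat component" in this world would simply ship its own handlers and events, which is also why a flaw in one such component would propagate to every program that wired it in.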
 
  • #38
Yah, but then it would be easy to drop in a new chat function. :smile:
 
  • #39
C++ & Java work fine. They would be better combined, i.e., add the preprocessor from C to Java, and use classes instead of structs in C.
 
  • #40
Microsoft C sharp

The programming methods are limited by what's available. Since the x86 isn't going away anytime soon, when it comes to a generic PC, C# is the next thing; everything one needs is there.
 
  • #41
reality check

I always thought of getting a bunch of checks printed out that were labeled "reality check". I could write them out to stupid people the world over.

But that's beside the point.

Today's reality check.

you don't have to make it free, you just have to make it cheap.

Free things are buzzwords, but they never last. The companies can't sustain the manufacture, and they are forgotten shortly after the fad fades.

Take Windows, for instance. It is cheap. Not by the standards of modern computer users, who know nothing of history and probably assume an atom and an electron are the same size. (A recent physicsweb article suggested sci-fi movies might be more educational to half of Americans than their school systems.)

However, if you remember the days of the $30,000 home personal computer, or the million-dollar budgets for room-sized mainframes, then you might also recall that, yes indeed, Windows is cheap.

The next thing a new programming system needs is to be "user friendly", also known as stupid-friendly. Sometimes user-friendly to a good programmer is user-unfriendly to everyone else, and vice versa. You don't actually need all the positive features possible in a program; you just need the ones that everyone in pop culture wants and understands.


So in truth, what would be a positive change in programming?

Intuitive programming. I recently demonstrated to a young friend of mine how you could back up a person's entire genetic code on one CD. (Although you would be hard pressed to fit more than one person on there, you could definitely fit multiple insects, prokaryotes, etc.)
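The CD claim checks out as a back-of-the-envelope calculation, using approximate public figures (a haploid human genome of roughly 3.1 billion base pairs, 2 bits per base since there are four bases); the compression ratio at the end is an assumption, not a measurement.

```python
# Rough arithmetic: can one human genome fit on one 700 MB CD?
BASE_PAIRS = 3_100_000_000        # approximate haploid genome size
BITS_PER_BASE = 2                 # A, C, G, T -> 2 bits each

bits = BASE_PAIRS * BITS_PER_BASE
megabytes = bits / 8 / 1_000_000
print(round(megabytes))           # 775 -> about 775 MB raw

# Raw 2-bit packing slightly overshoots a 700 MB CD, but DNA
# compresses well; even a modest assumed 1.5x ratio fits easily.
CD_MB = 700
fits_compressed = megabytes / 1.5 <= CD_MB
print(fits_compressed)            # True
```

So "one person per CD" is plausible, and anything with a genome a few times smaller (insects, prokaryotes) fits many times over.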

Despite being able to back up a human being biologically on a disk like this, you would need to remember that the human being is itself not data, but another dynamic, multipurpose tool.

Thus you get into the practicalities of simulations.

I've also noted that in the future, data is going to cease being "solid material" based and become "field" oriented, especially pushing toward the ideal of fractal designs.

When a monitor or VR interface begins to account for fields instead of nodes, and uses harmonics to blend optical octaves, the color combinations and visualizations will become unlimited.

Right now one of the biggest problems of video in movies is cloning versus bit-rate rendering. For instance, you could say one drop of rain is X number of bits at a given resolution, and copying drops through a 4D environment could take some kind of algorithm, but the data cost of such an approach becomes sickening.

If instead, the natural patterns were used, the processing power required would be greatly reduced, while, with built in fractal blending, the resolution would be greatly increased.
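The "natural patterns instead of stored samples" idea is the core of procedural fractal generation. A minimal sketch (one-dimensional midpoint displacement, all names illustrative): a handful of parameters generate hundreds of detailed samples, rather than storing every sample explicitly.

```python
import random

def midpoint_displace(levels, roughness=0.5, seed=42):
    # Start with a flat two-point profile and repeatedly insert a
    # randomly displaced midpoint into every segment; the displacement
    # shrinks at each level, giving fractal-like detail.
    random.seed(seed)
    points = [0.0, 0.0]
    scale = 1.0
    for _ in range(levels):
        new_points = []
        for a, b in zip(points, points[1:]):
            mid = (a + b) / 2 + random.uniform(-scale, scale)
            new_points += [a, mid]
        new_points.append(points[-1])
        points = new_points
        scale *= roughness        # finer scales get smaller bumps
    return points

profile = midpoint_displace(levels=8)
print(len(profile))   # 257 samples generated from just 3 parameters
```

Eight doublings turn one segment into 256, so three parameters (levels, roughness, seed) stand in for 257 stored values, which is the compression-by-generation the post is gesturing at.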
 
  • #42
That movie with Tom Cruise in it was called Minority Report...
 
  • #43
BTW, we did a version of Minority Report gloves.

http://dftuz.unizar.es/~rivero/alumnos/vmouse.html
 
  • #44
samwith said:
Boring, yes, that is very boring

Nobody says you have to read it. We're not here to entertain you.
 
  • #45
AI, if you can make it… Good luck.
 
