The next programming method?

In summary, the next step in programming might be something called TOP (Table Oriented Programming). The way humans develop ideas makes OOP very natural, and event-driven systems are a way to collect input in an OO environment.
  • #1
For you computer geeks out there. What do you think the next method of programming might be? We currently have Procedural, Event and OOP. OOP is the most popular, but it isn't the best. What is next?
  • #3
OOP is not a method of programming but rather a way of thinking. The way humans develop ideas makes OOP very natural. For example, suppose you want to build a cardboard box. Although the process for creating a box is linear, developing the process is very much OO. You have to figure out what the material is, the size, the color, etc. All of these attributes are easily contained in an OO structure such as a CLASS. The attributes can be built on top of each other, and this layering allows the developer to visualize the structure in a very precise manner.
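The box analogy could be sketched in C++ (the names here are purely illustrative, not from the post): the attributes live together in one class, and further attributes layer on through inheritance.

```cpp
#include <string>
#include <utility>

// Hypothetical Box class: material, size, and color grouped as attributes
// of one structure, as the post describes.
class Box {
public:
    Box(std::string material, double width, double height, std::string color)
        : material_(std::move(material)), width_(width), height_(height),
          color_(std::move(color)) {}

    double area() const { return width_ * height_; }
    const std::string& material() const { return material_; }

private:
    std::string material_;
    double width_;
    double height_;
    std::string color_;
};

// A GiftBox layers one more attribute on top of the base class.
class GiftBox : public Box {
public:
    explicit GiftBox(std::string ribbon)
        : Box("cardboard", 10.0, 5.0, "red"), ribbon_(std::move(ribbon)) {}
    const std::string& ribbon() const { return ribbon_; }

private:
    std::string ribbon_;
};
```

A `GiftBox("gold ribbon")` inherits everything the base `Box` knows, which is the "precise layering" the post is pointing at.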

Event-driven systems are a way to collect input in an OO environment. An event triggers an object to run, which then triggers other events, creating a dynamic system.
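That chain can be sketched with a minimal event bus (hypothetical names, for illustration only): handlers subscribe to named events, and a running handler may fire further events.

```cpp
#include <functional>
#include <map>
#include <string>
#include <vector>

// Toy event bus. Firing an event runs each subscribed handler; a handler
// may itself fire more events, producing the dynamic chain the post
// describes.
class EventBus {
public:
    void subscribe(const std::string& event,
                   std::function<void(EventBus&)> handler) {
        handlers_[event].push_back(std::move(handler));
    }

    void fire(const std::string& event) {
        for (auto& h : handlers_[event]) h(*this);
    }

private:
    std::map<std::string,
             std::vector<std::function<void(EventBus&)>>> handlers_;
};
```

A "click" handler that fires a "log" event in turn is exactly the event-triggers-object-triggers-event pattern in miniature.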

This methodology of thinking is not going to change. The next major leap in programming will be the way we enter programs into the computer.

What plagues programmers today is the constant reinvention of the wheel. Sure, there are functions, allowing repetitive tasks to be called from any point in the program, but that's not enough. We need to STANDARDIZE a method for program-to-program inheritance. It will be like the click-and-drag method of creating interfaces in VB, but more sophisticated.

Imagine being able to take the chat (function) from XYZ's instant messenger and dragging it to ABC's first person shooter.

With this type of capability, programming will not be just for the experts, but for the average joe that wants to be productive.
  • #4
I concur with dd; some sort of standardized abstraction is definitely the way to go. There needs to be a way to write powerful code with no loose ends which others can easily use and customize.
  • #5
Well, the thing is that ABC would have to pay XYZ for that. I mean, we already have code libraries and APIs and such so that you don't have to "reinvent the wheel." Also, people often like to create things for themselves and marvel at their own genius.

Although, it can be hard to integrate things from completely different codebases. For example, a game will have different structures for representing clients than will a chat program.
  • #6
That's pretty much the reason why I think a good standardization or abstraction scheme is where things need to go; so that the chat program can use the FPS's client object directly, or at least with a trivial wrapper.
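Such a "trivial wrapper" might look like this in C++; `ChatClient`, `FpsPlayer`, and the adapter are all hypothetical names for illustration, not real products.

```cpp
#include <string>

// Hypothetical standardized interface that a chat component expects.
struct ChatClient {
    virtual ~ChatClient() = default;
    virtual std::string name() const = 0;
};

// A game's own client type, with its own unrelated layout.
struct FpsPlayer {
    std::string handle;
    int health;
};

// The trivial wrapper: adapts FpsPlayer to the ChatClient interface
// without modifying either codebase.
class FpsPlayerAdapter : public ChatClient {
public:
    explicit FpsPlayerAdapter(const FpsPlayer& p) : player_(p) {}
    std::string name() const override { return player_.handle; }

private:
    const FpsPlayer& player_;
};
```

The chat code only ever sees `ChatClient`, so the FPS's data structure plugs in through the adapter unchanged.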
  • #7
Originally posted by Dissident Dan
Well, the thing is that ABC would have to pay XYZ for that. I mean, we already have code libraries and APIs and such so that you don't have to "reinvent the wheel." Also, people often like to create things for themselves and marvel at their own genius.

Although, it can be hard to integrate things from completely different codebases. For example, a game will have different structures for representing clients than will a chat program.

1) If you're a proprietary software maker like Microsoft, then yes, you probably would have to pay a licensing fee. But software development is changing, and IBM, Apple, Toshiba, etc. are realizing that open source is the way to go.

2) People just want software to work. Period. Sitting in front of my computer, I wish that some applications could take on the attributes of others. Everyday users aren't going to fire up their text editor to modify source code to add features.

An example: Average Joe likes the tabbed-browsing feature in Mozilla and would like to add it to OpenOffice Writer in order to be more productive. If every feature in Mozilla were a standardized template, then one could drag a tab from Mozilla into OpenOffice, and all the attributes of the tab would be inherited.

Ok, sure, code libraries and APIs exist so you don't have to "reinvent the wheel," BUT it takes time to find out whether a library is available that suits your needs. And on top of that, you need to figure out the construction of the library and how to make calls. I mean, if the feature you want is right in front of you, and already implemented, why can't you just drag that feature where you need it visually, rather than through a text editor?

If we have a standardized abstraction, as Hurkyl said, programmers will build the components and arrange them. The computer then figures out how to connect the components. If the user wants to add a component, all they have to do is point to where they want the new component, and the computer figures out how to connect it.

3) If you have a standardized template, it shouldn't matter what type of data you're dealing with. For instance, the STL (Standard Template Library) for C++ is very flexible and allows its data structures to contain any type of structure:

#include <queue>

struct bob {
    int x;
};

std::queue<bob> builder;
  • #8
I have to agree; a related development has been in GUI ("gooey") interfaces. The new Apple Aqua system is a case in point. It allows you to make any picture you want into an icon and just arrange things however you want. The ultimate interactive Etch A Sketch, so to speak, but without all the hassle of the two knobs and having to erase everything the minute you make a simple mistake.

In other words, two trends are at work here: expanding the parameters of the system and making them more user friendly. Microsoft already has patents on 3-D versions of windows, for example, but who the heck owns a 3-D monitor? The real rush, then, is not to predict the immediate future of the technology, but what makes things more user friendly. This has been Microsoft's focus all along, and it is why their modular architecture and bells and whistles are so infamous for being unstable.
  • #9
Originally posted by wuliheron
This has been Microsoft's focus all along, and it is why their modular architecture and bells and whistles are so infamous for being unstable.

Microsoft and Modular Architecture. HAHAHAHAHAHAHAHAHAHA

Removing Internet Explorer = Removing Windows
  • #10
Well, I am not sure I understand the question, but when programming multi-user systems (some games, some not), we found that a stable metaprogram with user functions, which allowed other authorized people to change the environments and code, was very effective.

This kind of system, though, is only as powerful as the metaprogram, but it allows otherwise uneducated users to revise a system with some measure of variability.

One of the systems I worked on had a rather surprised system admin when I was able, sitting at the bottom of the ranking system, to use mobile subroutines to delete his account as if he were the one doing it. I think this is an example of exactly how powerful online coding can be.

For a more radical approach, I think we need to devise anti-logic processors, which is possible with something akin to a profile system for each flag.

instead of "let red = 45"/"let red = random (1-300)"

we should have a whole list of possible meanings for subject matter as step 1
let red = 45; and/or random (21-69); and/or temperature code "hot"; and/or emotion code "angry"

This is followed by step 2 of the anti-logic code (this is the real step): a search-engine processor that matches a programmer's input structure and/or sentence structure to lists of flags with possible matching data.

A basic example: "Make the room hot."

I know this sounds weird, but in programming, if we spliced a fuzzy-logic search engine into a bunch of higher-language protocols, we would have a very real, synthetic artificial-intelligence programming aid, which even a six-year-old could toy with.
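The flag-profile idea above could be crudely sketched as keyword lookup; this is nowhere near real fuzzy logic, just a toy matcher with invented names to show the shape of step 1 (flags carrying several meanings) and step 2 (matching a request against them).

```cpp
#include <map>
#include <string>
#include <vector>

// Step 1: each flag lists several possible meanings
// (e.g. temperature code "hot", emotion code "angry").
struct Flag {
    std::string name;
    std::vector<std::string> meanings;
};

// Step 2: match a natural-language request against the flags'
// meanings by simple substring search, returning the first hit
// (or "" if nothing matches).
std::string match_flag(const std::string& request,
                       const std::vector<Flag>& flags) {
    for (const auto& f : flags)
        for (const auto& m : f.meanings)
            if (request.find(m) != std::string::npos) return f.name;
    return "";
}
```

Given a flag `red` with meanings {"hot", "angry"}, the request "Make the room hot" would resolve to `red`, which is the kind of lookup the post is gesturing at.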

I'm sure it wouldn't work well for a serious programmer, at least not until the AI/search-engine utility feature were highly attuned and possibly (through cookie subroutines) personalized or "watermarked" to the individual programmer.

Imagine a programmer named Keichi... and his AI search engine-meta utility named "May"

Keichi: " May, make the background blue please..."
May changes the background color to Azure
Keichi: " May, you know I hate this shade of blue... "
May changes the background color to Deep Blue
May: "Sorry sir, just kidding"
Keichi: " Thank you May"

The quirky thing about humans is the fact that they have favorites, and favorites can be set as defaults. Humans also splice things and mix and match them, something a computer could do, but that normally requires whole arrays of programs.

The above style of programming could be infinitely complex, or as simple as a search engine spliced into another programming platform, with commands like "name that line of code that I spent the most time on; now insert it here (cursor)."

Further down the road: I noticed that they built a very basic cybernetic eye for someone, which inputs information directly into the brain. I also saw a person in an intensive-care ward moving a mouse cursor with their mind; the wires in their brain were sensing nerve motions normally reserved for things like arms and hands.
I believe that this kind of technology, in the next 20 years, could be used for a whole new variety of virtual reality. And if there is indeed a field of sensory data that fires up somewhere in the brain when someone imagines, then it is possible that such an imagining could be recorded in a VR environment and replayed,
and if someone had a brilliant idea in a flash and then forgot it, it could be rewound frame by frame, paused, and studied. I have no idea about the implications of some kind of internet-based wireless mind reading/telepathy, but it is obvious that a person could, with technology today, project basic ASCII characters wirelessly from brain to brain, although the process would be very slow.
  • #11
Mind Brain Reality Software Applications

I know people do not like it when I post stand-alone posts, but the idea here is to follow the previously posted link to see how such an application might work.

The fifth-element idea sprang into existence from engaging with the superstringtheory board. I have called it the fifth element for specific reasons; it requires understanding not only how geometrical considerations connect under one umbrella (Felix Klein's order of geometries, linked previously) but also the unification of mind in the understanding of enveloped bubble technology.

I am open to corrections in my thinking.

  • #12
What Is Driving the Technology

This posted link is in answer to the "light of all things" (Einstein mysticism) and the motivation behind our quest.

The Planck-length measure becomes the signal for fuzzy language. Yet what is fuzzy language? You learn to see the progression of the logic that is taking place.

Venn logic and transactional analysis are specific here to the direction we have to go in terms of soft computer-language development. This is based on the understanding of the fifth element containing all geometries, enveloped in bubble technology.

  • #13
The Next Programming Method

An ongoing work in progress.

I am always open to corrections.

  • #14
Back on track

Programming has consistently evolved as a pyramid, with the previous "standard" code being used to construct the next generation. So it stands to reason that OOP will become the building blocks of the next language. I'm thinking along the lines of the internet and the way everything is interlinked and cross-referenced. I thought the article referenced in the TOP thread, which discusses the weaknesses of hierarchical tree structures, was a good indication of the way things are going. If standardization can become a universal term, then you will eliminate a huge duplication process which plagues programming. Interoperability is the key word, I think. Fortunes are made by people doing nothing more than creating the code to help programs and databases communicate with each other. I'm not a programmer, but I've worked with, and known, many programmers, and that is the greatest flaw: too many proprietary programs which can't communicate, because companies are so concerned with intellectual property rights and proprietorship.

Personally, I'd like to see a programming interface similar to the type we saw in that last Tom Cruise sci-fi flick (can't think of the name). Just point (with your hand) and select, or use verbal commands. This would speed the process up 500 times.


"Computer, take subroutine alpha and insert it at line 15."
"Computer, access MSDN July 2040 from the Microsoft programmer database and extract the basic verbal-interface array. Add it to line 73, then compile."

I think this is a very realistic scenario given Moore's law. Real-time verbal interfaces are only 10-15 years away, if that. They already have a rudimentary beginning. That's the way I see things headed.
  • #15
There will never be a standardized language, by virtue of Gödel's incompleteness theorem, nor is it desirable; different languages evolve largely because they are the most suitable for the task at hand, and for the abilities and prejudices of the programmers involved.
  • #16
There will never be a standardized language, by virtue of Gödel's incompleteness theorem, nor is it desirable; different languages evolve largely because they are the most suitable for the task at hand, and for the abilities and prejudices of the programmers involved.

I don't see how Gödel's incompleteness theorem applies. Also, it is certainly conceivable that a standardized language could be general enough to do any task.
  • #17
John McCarthy has just asserted on s.p.r. that all programming languages are universal: anything you can code in one, you can code an emulation of in another. Granting that McCarthy is more knowledgeable than I am (he is sort of the AI opposite number to Professor Kaku), I find this hard to believe. Emulate machine language in COBOL?
  • #18
I'm sure he'd be the first to admit that it wouldn't necessarily be easy, efficient, useful, or most other good adjectives. The 'all' seems rather sweeping though.
  • #19
I really don't think OO is where programming is headed. Let me speak generally: OO is fine for some things, but some things don't work well conceptually when you try to think of them as objects. The main benefit of OO is that it's supposed to be more in line with the way people think--but some things are conceptualized as processes, not objects. Hence procedural languages.

Take Java, for instance. It's taken OO to the extreme (not to mention all the buzzwords and BiCapitalizedMumboJumbo). Not good.

Also, there has yet to be a truly successful OO language. I mean, sure, you can talk about Java and C++ and even Objective-C if you really want to, but nothing's really caught on that's been as well designed as C (or my personal favorite, Lisp). The current object-oriented languages are all, at best, implemented mediocrely. Java's too slow; C++ is just... C++. It can't replace C. If OO is going to redefine the future of computing (doubtful), it needs to replace C with a viable alternative, a standard. That's not happening. Java's probably already on its way out with the advent of C#. C++ isn't going to survive Java and C#. OO can't succeed until there's a really well implemented model.
  • #20
My opinion (admittedly I've been away from the field a few years) is that OO got big because it could match the hierarchies of screens and windows in GUIs. All that built-in inheritance was a boon to front-end designers. But your point on process is very well taken, and OO has problems with saving and organizing its data "persistently", and the larger design issues get reduced to canned methodologies; it's hard to be original or creative above the simplest cases. But probably these criticisms are out of date.

Here's a thought you might want to respond on, maybe the whole programming thing will go away the way kerosene headlamps did in cars. Currently it's being outsourced to Bangladesh, and in the near future, automated programming - long foreseen - may finally arrive. Of course that's Vinge's singularity again!
  • #21
Originally posted by selfAdjoint
Here's a thought you might want to respond on, maybe the whole programming thing will go away the way kerosene headlamps did in cars. Currently it's being outsourced to Bangladesh, and in the near future, automated programming - long foreseen - may finally arrive. Of course that's Vinge's singularity again!

I suppose this is possible… But engineers will always need some form of language to construct from scratch with…

I think the next major advance in programming languages will include some form of AI. Possibly, this future language will have the ability to create its own functions to optimize code execution… That could, of course, lead to problems… lol

Also, I would like to point out how long it's taken programming languages to evolve.
Look how long it took to get a language that understands the difference between a string and a numeric data type.
  • #22
As biochemo-physicists have said for many years, reconstructing computer components to replicate the biochemical movements of living things will be the simplest and most realistic way to provide faster access to everything about a computer. Since everything we know about computers is broken down into binary, all programs could be written in one major code, but, as many of you have said, economics and business have separated these programs into many different ones because of money matters. What does the future hold for programming? I would say not only verbal reconstruction of programs, but also sight, heat, and movement, just as in that Tom Cruise movie (what the hell is the name of that?). I feel biological applets, the formation of electrical current through the use of bacterial synthesis, are the leading research for computers of the future. Can you imagine watching your computer grow as you develop and store info in its system? Far out, but not impossible!
  • #23
What does OOP stand for?

What is its meaning? I have done some basic programming, like Delphi and HTML. Which category do these fall into? Can someone explain to me what other types of programming there are?
  • #24
Object-oriented programming.
  • #25
Eventually, I think someone will develop an interface which can turn the electrical emissions of our thoughts into binary with enough exactness to translate our wishes into code. No doubt there will be many languages and paradigms, but I think they will all be generated in the compiler by this method.
  • #26
Ok, well, I am not sure what the next big kind of programming language will be, but from reading this forum's posts, some of you think it should be easy enough for the average Joe to change the source code. I totally disagree. Humans are getting smarter, so why not make more complex code, such as AI? Complexity is what you want. Screw the average Joe; if he wants to change source code, then he can go read a damn book or hire someone to do it.
  • #27
We're smarter on the intelligence side, perhaps, but we still lack wisdom. Maybe try to think of it as: is there a 'wiser' programming method that could be conceived in order to make more use of the talent that we have?

At the moment you could be the most intelligent person on the planet and read every single C++ book out there, but you'd still have no portability if you compiled on a particular machine.
Now, Java takes care of this: you're able to run Java programs on any machine you want, supposedly, but you have to take a drop in performance in order to pull it off, by creating a virtual machine that everything can be translated through.

How far down the road are we talking? Will quantum computing come into the picture, and will a language need to be created around that, or will we be in a photonic-computing environment?

The hardware has a lot to do with the development of programming, and until we figure out where the new evolution of computers is taking us, it will be hard to make any dramatic leaps or changes in languages.

  • #28
I've noticed lately that supercomputers are being built of hundreds or thousands of parallel CPUs. I think that as CPUs shrink, home computers will have more and more CPUs in parallel, and they will be linked over the net into many parallel efforts, as some of them already are.

Contemporary languages like C++ or Java are not well adapted to parallel computing. Indeed, taking advantage of parallelism has been a problem for language designers for many years.

I don't know how this problem will be solved, but I am sure it will be; the promise of parallelism is too great to fail because of bad code.
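As it happens, standard C++ did eventually grow such support; a minimal sketch using the later `std::async` facility (a C++11 feature, an assumption well beyond what existed at the time of this discussion) to split a sum across two concurrent tasks:

```cpp
#include <future>
#include <numeric>
#include <vector>

// Split a sum into two halves and run them as concurrent tasks,
// a minimal taste of exploiting parallel CPUs from the language itself.
long parallel_sum(const std::vector<int>& v) {
    auto mid = v.begin() + v.size() / 2;
    // Launch the first half on another thread...
    std::future<long> first = std::async(std::launch::async, [&] {
        return std::accumulate(v.begin(), mid, 0L);
    });
    // ...while this thread sums the second half.
    long second = std::accumulate(mid, v.end(), 0L);
    return first.get() + second;
}
```

The harder problem the thread raises, algorithms where one stage's output feeds the next, is untouched by this: only independent work can be split this way.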
  • #29
Not sure how much relevance this post has, just trying to add something to get things moving again...

I recently read about the new PCI Express slots that are going to be coming out in the near future. Apparently they use some kind of time stamp within each packet of information to make sure all the parallel signals are matched up, I guess you could say.

With quantum computing, are there any rumors or research anyone knows of as to how much of a factor time would be in programming, per se?

Or with photonic computing (not sure if it's a real field of study or more just a topic still), where the photons could pass through each other, would they need to be time-stamped to match up with the photons belonging to a particular stream of information?

  • #30
Originally posted by Greg Bernhardt
For you computer geeks out there. What do you think the next method of programming might be? We currently have Procedural, Event and OOP. OOP is the most popular, but it isn't the best. What is next?

The deciding factor on the next method of programming or designing programs
will, as it always has been, be largely "market driven".

Actually, I don't mean "market driven" in the sense everyone uses it, or even in the sense I usually use it.

By "market driven", I mean to say: whatever can drive the market to pay the most money to buy as much new software as possible in order to learn it, develop it, and put it into production.

The return, for the professionals that use it, or for the people they code for, will, as usual, be no better than the previous method by any significant degree, and it will be designed to make the purchaser more dependent upon whomever they purchased their stuff from.

Yes, all of a sudden, OOP and OOD were the greatest, and now that there are new things that we are forced to use or be left in the dust, OOP/OOD is no longer any good...

Supposedly, Web Services will be the future. Pure Garbage!

The design of systems to use the Web is only becoming WORSE than before. The mainframe mentality of the Web is only being surpassed by the idiotic idea of using the most inefficient, unreliable means possible to communicate and design systems.

Let's not forget! Longhorn is in the works, and that will only talk to Longhorn! LOL!

As (I) predicted many years ago, the bandwidth of the Web is ALREADY wide enough to download VERY sizable FAT-client programs and run them on PCs (which is the NEW, NEW, NEW model that you will be hearing about after they sell you on it).

The methods for the development of programs today, including UML, are quite unimpressive. The thinking behind much of it hasn't changed significantly during the past 20 years, and whereas everyone thought the Web was such a new thing, it is simple denial to think that Web design hasn't been set up, so far, just like a mainframe talking to a bunch of REALLY DUMB PCs. Next come Services... just to sell more junk and make for another unstable and even WORSE model, because the folks making software are going to sell everyone on the idea that Web Services represent THE WAY to REUSE software and get your systems up and running REAL fast, despite the fact that the systems will also be REAL SLOW and REAL UNRELIABLE, too. And there ARE ALREADY signs that, in the not too distant future, THE FAT CLIENT WILL BE BACK! (As I predicted.)

GEEEEZZZZZ! Watta field!
  • #31
I haven’t seen a stance which starts from a low level perspective.

The main corporate problems revolve around code maintenance and debugging leaving very little impetus for a language change.

The compiler allows abstraction of the computer's hardware: instruction sets, cache alignment, memory-bus contentions, and the interface to every possible piece of hardware. Yes, the ultimate abstraction is to be able to use natural language to tell the computer what to do and have it done… eventually.

I believe that this is neither realistic nor a desirable goal. It is possible, but really it just sidesteps the issue: what do you write the compiler in?

It is not the language which needs the greatest change but the development environment.
Compilers presently communicate very little to the user. For debugging purposes, one needs an abstraction to ease conceptualization of what the software is accomplishing. For optimization purposes, one needs the code shown as the computer will see it. A tremendous amount of screen real estate and idle processing power is currently available to handle these tasks, but they remain ignored.

To selfAdjoint:
These languages can handle parallelism; it is up to the compiler to implement independent threads for each parallel operation, and up to the hardware side to deal with memory access. The main problem is that many algorithms cannot be improved by parallel processing: the output of one stage is required for the next.

To StarkyDe:
Despite biological self-replication's ability to produce enormous parallelism, its chemical thermodynamics require equilibrium stages and will therefore be individually much slower than their solid-state equivalents. (See above.)

To: dduardo:
Custom macros or procedures in libraries are useful abstractions, although they may interfere with understanding. Many languages are defined by their standardized libraries, but there will always be the need for new task-specific tools. The problem which causes such chaotic use is that everyone holds copyright on their libraries, preventing anyone else from using the same code unless authorized. Even General Public License code will not be used by your employer if they are trying to sell their product.
  • #32
It's not "what is going to be the next programming language or method" but what is going to be the standard software-engineering process, as opposed to just hacking. The trend is toward graphic representation of software and standardized components. Just look at the electronics industry, with its electronic schematics and standardized components! Imagine software written by use of schematics and standard components: how much easier it would be to document, design, and maintain.

A picture speaks a thousand words, and despite what many believe about vocal interfaces being the natural man-machine interface, they're really not. Sound is a serialized process in our brains and doesn't exploit our parallel processing capability as visualization does. So the real bottleneck is inputting information to the computer from human beings. There are military and even some consumer products that can use eye movement or even actual brain-tissue interfaces. So we have good graphic interfaces; we need better input interfaces to become more productive with machines.

The future is software schematics utilizing OOP and off-the-shelf components. A good source for learning more is the OMG, the group that promotes UML.
  • #33
Many posters have written here about standardization, and selfAdjoint makes an excellent point about multiprocessing.

You can debate whether greater standardization is desirable because it makes programmers' jobs easier, and whether it stimulates new development by letting them concentrate more on end results without being distracted so much by details, or stifles new development by locking them into "old" ways of thinking.

And it will be interesting to see to what extent programming will embrace multiprocessing, and, conversely, to what extent we will simply rely on the next generation of (single) processors to relieve us of having to make that extra effort.

Much can be said about these topics but neither is a "method of programming" in the same sense that procedural, event and OOP are. Procedural languages can be standardized or non-standardized. Procedural-based programs can be designed to take advantage of multiprocessing, or not. The same can be said about event-driven, or object-oriented programs.

What do y'all think of declarative programming as a candidate for the "next" programming paradigm?
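For a taste of the style: the same computation can be written imperatively (spelling out the loop) or declaratively (stating what is wanted as a pipeline of standard algorithms). A small C++ sketch of the declarative flavor, with invented names, standing in for a true declarative language:

```cpp
#include <algorithm>
#include <iterator>
#include <numeric>
#include <vector>

// Declarative flavor in C++: describe *what* to compute (keep the evens,
// square them, sum the result) with standard algorithms instead of
// spelling out the loop step by step.
long sum_of_even_squares(const std::vector<int>& v) {
    std::vector<int> evens;
    std::copy_if(v.begin(), v.end(), std::back_inserter(evens),
                 [](int x) { return x % 2 == 0; });

    std::vector<long> squares(evens.size());
    std::transform(evens.begin(), evens.end(), squares.begin(),
                   [](int x) { return static_cast<long>(x) * x; });

    return std::accumulate(squares.begin(), squares.end(), 0L);
}
```

A genuinely declarative language (Prolog, SQL, and the like) goes much further, but even here the intent is read off the algorithm names rather than reconstructed from loop bodies.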
  • #34
I was just reading a short tutorial on declarative programming last week, via Slashdot. It looks like it has some good features, including the ever-marketed modular capability (you wouldn't want to know how far back I go with that idea; it's always the big selling point of any new method. Reusable code! Yeah!)

I always ask: how does it go with databases, with massaging and updating huge amounts of stored data? IIRC, that's what hobbled OOP.
  • #35
Speaking as a novice: is oop "hobbled"?

Do you still have a link to that declarative programming tutorial? What little I've seen about it seems intriguing; I'd like to read more.
