
DNA Computing: What Will Happen with C++?

Agisch
#1
Dec27-12, 05:38 AM
P: 7
Link: http://research.microsoft.com/en-us/projects/dna/

As I was experimenting with that simulator, I realized that its language has some similarities to assembly language, and I asked myself: "Will programming languages such as C++ (or at least some of their concepts) still survive in the post-silicon era?"

If not, all my knowledge about writing a "recipe" for a computer program called a compiler is going to be useless.
tahayassen
#2
Dec27-12, 07:14 AM
P: 273
Even if C++ dies, programming languages all have similar syntax and all require the same logic so your skills are highly transferable.
phinds
#3
Dec27-12, 07:57 AM
PF Gold
phinds's Avatar
P: 6,511
Your link points to a very specific form of computing used in and for DNA analysis. What does that have to do with whether or not other specific languages exist?

What is a "post silicon" era? I don't see one coming any time soon, whatEVER it is you mean.

What is a "computer program named compiler" ??? This seems to be just a string of words that don't hang together.

Agisch
#4
Dec27-12, 08:18 AM
P: 7

Quote Quote by phinds View Post
Your link points to a very specific form of computing used in and for DNA analysis. What does that have to do with whether or not other specific languages exist?

What is a "post silicon" era? I don't see one coming any time soon, whatEVER it is you mean.

What is a "computer program named compiler" ??? This seems to be just a string of words that don't hang together.
First, I didn't say that this simulator necessarily has anything to do with programming languages.

Second, the simulation simply prompted me to think that C++ might not survive into the future (so it was just a starting point).

Third, the term "post silicon" refers to an age in which no silicon-based computers are used.

EDIT: A compiler is a computer program, in my opinion, so the words do hang together.
wuliheron
#5
Dec27-12, 10:48 AM
P: 1,967
The semiconductor industry has been struggling for ten years to overcome the obstacles presented by multicore processing, and the next generation will use hardware-accelerated transactional memory. This is a sort of supercomputer approach to programming in which the computer does more of the work of deciding what hardware to use; ideally, it could incorporate neuromorphic circuitry to let it make even more complex decisions. That way the programmer doesn't have to worry too much about the hardware, and programming languages can be designed for the convenience of the programmers themselves.
256bits
#6
Dec27-12, 11:53 AM
P: 1,499
So all my knowledge about writing a "recipe" for a computer program named compiler is going to be useless.
Give yourself a timeframe for "useless" - 5, 10, 20, 50 years.
Agisch
#7
Dec27-12, 12:18 PM
P: 7
Quote Quote by wuliheron View Post
The semiconductor industry has been struggling for ten years to overcome the obstacles presented by multicore processing, and the next generation will use hardware-accelerated transactional memory. This is a sort of supercomputer approach to programming in which the computer does more of the work of deciding what hardware to use; ideally, it could incorporate neuromorphic circuitry to let it make even more complex decisions. That way the programmer doesn't have to worry too much about the hardware, and programming languages can be designed for the convenience of the programmers themselves.
Let me get this right: if computers increasingly become self-organizing or intelligent systems, then programming languages will have to be designed in an increasingly problem-oriented way, right?
As a consequence, programming languages will be more problem-oriented and abstract than ever before, and programmers might become jobless some day, because higher abstraction could make programming more intuitive for ordinary computer users.

But I agree with 256bits: these are unnecessary, time-wasting thoughts. It probably won't happen in the near future.
256bits
#8
Dec27-12, 07:44 PM
P: 1,499
But I agree with 256bits: these are unnecessary, time-wasting thoughts. It probably won't happen in the near future.
I would say it is worthwhile to think about how trends may play out in the future, but the farther one tries to take the possibilities, the more unpredictable they become.
jedishrfu
#9
Dec28-12, 01:06 AM
P: 3,100
There is a clear trend in programming away from simple application programming: in the early days, programmers used COBOL to write custom payroll applications and other business applications that are now available as canned software products. Many one-off programs have been replaced with spreadsheets.

Programmers are still needed, but now the work is more challenging: you must write the tools for non-programmers to use (word processors, spreadsheets, ...), the tools for programmers (IDEs like Eclipse), or the middleware used to connect heterogeneous computer systems together.

So while C++ may not drop away, you will need to know a collection of languages to get the job done.

My feeling on DNA computers is that our current computer technology will interface with and control the DNA computer while awaiting results, much as the IBM CELL processor and its like use satellite CPUs for deep algorithmic computations, all controlled by a general-purpose CPU that waits on answers from them.
Agisch
#10
Dec28-12, 07:04 AM
P: 7
@jedishrfu:

Yes, through higher abstraction programming could get more sophisticated than ever, but that wouldn't make sense at all (at least for the consumer).
(Why keep it complex when you can do it the easy way? Because the easy way is too complex to design or create? But once you have designed/created a system which is easy to use, and which does the complex stuff for you - a black box - then you would rather build that "helping system" than create the sophisticated ones directly, right? I hope you can follow me despite my weird multi-clause sentences.)

Some examples:

- If you compare today's technology with, say, technology from the seventies, you may notice that it is constantly becoming more intuitive/trivial for the masses. (Compare the user interface technologies/designs, for instance: GUI vs. CLI.)

- I once programmed in assembly language (simple microcontrollers), and I realized that you need to know a lot about hardware and electrical engineering even for trivial operations. (And yes, even in today's world you need to know how the machine/CPU/GPU works to get the most out of it.)

@wuliheron && D H:

Hmmm... automation is an intermediate result, even if it's bitter for the people it displaces.

Off topic (you can read it, but you can likewise skip it without missing anything):

I think this topic is actually a more general issue. The inner workings of a system are usually complex, but the user interface - the interface or tool which allows you to interact with that system in an easy manner - makes it possible for everyone to ignore those inner workings. As a consequence, we are continually receding away from the concrete (say, transistors) toward the abstract (say, ICs). So the question is how much complexity is enough, because complexity forcibly creates dependence. (What if some of the knowledge is lost, and the system has become so complex that it's hard to reverse engineer? Can we deal with EXTREME complexity at all?)

I'm back on track:
But anyway, I think programming languages such as C++ will remain for decades at least, though not centuries. (By then I will be more than half a century old, so why should I worry about it now? C is almost forty, and it's still in use. And even if it dies, most things mankind develops are based on something that came before. I've changed my mind, finally.)


