DNA Computing: What Will Happen with C++?

In summary, the trend is that programming languages will become more problem-oriented, and programmers may become jobless in the future.
  • #1
Link: http://research.microsoft.com/en-us/projects/dna/

As I was experimenting with that simulator, I realized that its language has some similarities with assembly language, and I asked myself: "Will programming languages such as C++ (or at least some of their concepts) still survive in the post-silicon era?"

So all my knowledge about writing a "recipe" for a computer program named compiler is going to be useless.
 
  • #2
Even if C++ dies, programming languages all have similar syntax and all require the same logic, so your skills are highly transferable.
 
  • #3
Your link points to a very specific form of computing used in and for DNA analysis. What does that have to do with whether or not other specific languages exist?

What is a "post silicon" era? I don't see one coming any time soon, whatEVER it is you mean.

What is a "computer program named compiler" ? This seems to be just a string of words that don't hang together.
 
  • #4
phinds said:
Your link points to a very specific form of computing used in and for DNA analysis. What does that have to do with whether or not other specific languages exist?

What is a "post silicon" era? I don't see one coming any time soon, whatEVER is is you mean.

What is a "computer program named compiler" ? This seems to be just a string of words that don't hang together.

First, I didn't say that this simulator necessarily has anything to do with programming languages.

Second, that simulation merely prompted me to think that C++ will not survive into the future (so it was just a starting point).

Third, the term "post silicon" refers to an age in which no silicon-based computers are used.

EDIT: A compiler is a computer program, in my opinion, so the words do hang together.
 
  • #5
The semiconductor industry has been struggling for ten years to overcome the obstacles presented by multicore processing and the next generation will use hardware accelerated transactional memory. This is a sort of supercomputer approach to programming where the computer does more of the work deciding what hardware to use and, ideally, the computer could incorporate neuromorphic circuitry to allow it to make even more complex decisions. That way the programmer doesn't have to worry too much about the hardware and programming languages can be written for the convenience of the programmers themselves.
 
  • #6
So all my knowledge about writing a "recipe" for a computer program named compiler is going to be useless.
Give yourself a timeframe for "useless" - 5, 10, 20, 50 years.
 
  • #7
wuliheron said:
The semiconductor industry has been struggling for ten years to overcome the obstacles presented by multicore processing and the next generation will use hardware accelerated transactional memory. This is a sort of supercomputer approach to programming where the computer does more of the work deciding what hardware to use and, ideally, the computer could incorporate neuromorphic circuitry to allow it to make even more complex decisions. That way the programmer doesn't have to worry too much about the hardware and programming languages can be written for the convenience of the programmers themselves.

Let me get this right: if computers are increasingly becoming self-organizing or intelligent systems, then programming languages will have to be designed to be increasingly problem-oriented, right?
As a consequence, programming languages will be more problem-oriented and abstract than ever before, and programmers might become jobless some day, because higher abstraction could make programming more intuitive for ordinary computer users.

But I agree with 256bits; these are unnecessary as well as time-wasting thoughts. It probably won't happen in the near future.
 
  • #8
But I agree with 256bits; these are unnecessary as well as time-wasting thoughts. It probably won't happen in the near future.

I would say it is worthwhile to think about how trends may play out in the future, but the farther one tries to take the possibilities, the more unpredictable they become.
 
  • #9
There is a clear trend in programming away from simple application programming. In the early days, programmers used COBOL to write custom payroll applications and other business applications that are now available as canned software products. Many one-off programs have been replaced with spreadsheets.

Programmers are still needed, but now the work is more challenging: you must write the tools for non-programmers to use (word processors, spreadsheets...), the tools for programmers (IDEs like Eclipse), or the middleware used to connect heterogeneous computer systems together.

So while C++ may not drop away, you will need to know a collection of languages to get the job done.
My feeling on DNA computers is that our current computer technology will interface with and control the DNA computer while awaiting results, much as the IBM Cell processor and its like use satellite CPUs for deep algorithmic computations, all controlled by a general-purpose CPU that waits on answers from them.
 
  • #10
@jedishrfu:

Yes, through higher abstraction programming could become more sophisticated than ever, but that wouldn't make sense at all (at least for the consumer).
(Why keep it complex when you can do it the easy way? Because the easy way is too complex to design or create? But once you have designed and created a system that is easy to use and does the complex stuff for you - a black box - then you would rather build on that "helping system" than create the sophisticated ones yourself, right? I hope you can follow me despite my convoluted sentences.)

Some examples:

- If you compare today's technology with, say, technology from the seventies, you may notice that it is constantly becoming more intuitive for the masses. (Compare the user interface technologies/designs, for instance: GUI versus CLI.)

- I once programmed in assembly language (simple microcontrollers), and I realized that you need to know a lot about hardware and electrical engineering even for trivial operations. (And yes, even today you need to know how the machine/CPU/GPU works to get the most out of it.)

@wuliheron && D H:

Hmmm... automation is an intermediate result, even if it's bitter for the people it displaces.

Off topic (you can read it, but you can likewise skip it without missing anything):

I think this is actually a more general issue. The inner workings of a system are usually complex, but the user interface - the interface or tool that allows you to interact with that system in an easy manner - makes it possible for everyone to ignore those inner workings. As a consequence, we are continually receding from the concrete (say, transistors) toward the abstract (say, ICs). So the question is how much complexity is enough, because complexity necessarily causes dependence. (What if some of the knowledge is lost, and the system has become so complex that it's hard to reverse engineer? Can we deal with EXTREME complexity at all?)

I'm back on track:
Anyway, I think programming languages such as C++ will remain for decades at least, though not centuries. (By then I'll be more than half a century old, so why should I worry about it now? C is almost forty, and it's still in use. And even if it dies, much of what mankind develops is based on something that came before. I've changed my mind, finally.)
 

1. What is DNA computing?

DNA computing is a form of computing that uses DNA molecules as a medium for storing and processing data. It involves using the principles of biochemistry and molecular biology to manipulate and process information.

2. How does DNA computing work?

DNA computing works by using strands of DNA as the basic unit of information storage. These strands can be manipulated and combined with enzymes and other molecules to perform calculations and store data. The process involves encoding data into DNA sequences and then using specific techniques to manipulate and read the data.

3. What is the role of C++ in DNA computing?

C++ is a programming language that is often used in DNA computing. It allows scientists to write algorithms and programs that can manipulate DNA sequences and perform calculations. C++ is a useful language for DNA computing because it is fast, efficient, and has a variety of built-in functions and libraries that are helpful for working with DNA.

4. What are the potential applications of DNA computing?

DNA computing has the potential to revolutionize fields such as medicine, data storage, and cryptography. It could be used to create more efficient and powerful computers, develop personalized medicine treatments based on an individual's DNA, and create highly secure data storage systems.

5. What are the challenges facing DNA computing?

One of the main challenges facing DNA computing is the high cost and complexity of the technology. It also requires specialized knowledge and equipment, making it less accessible to the general public. Additionally, there are ethical considerations surrounding the use of DNA, as well as potential safety risks associated with manipulating genetic material.
