What is there left to advance in the field of computing?

  • Thread starter: GreatEscapist
  • Tags: Computer Field
AI Thread Summary
The discussion centers around the perceived stagnation in computer and software engineering, with concerns that the field has become focused on superficial advancements rather than groundbreaking innovations. Participants express skepticism about the future of technology, questioning what significant developments remain. Key areas identified for future work include parallel processing, algorithm improvement, and the need for a unified programming language in robotics. The conversation highlights ongoing challenges in hardware, such as heat management and the limitations of current transistor technology, while also mentioning the potential of quantum computing and alternative materials for processors. The importance of open-source projects and the development of user-friendly programming tools is emphasized, alongside the need for advanced algorithms in various fields like climate modeling and biochemistry. Overall, the dialogue suggests that while the landscape may seem saturated, there are still vast opportunities for innovation and problem-solving in computing.
GreatEscapist
I'm going into computer and software engineering.
The problem is, I feel like there isn't anything new left to do. Ten years ago, there was much left to be done. Now it just feels like "Who can make the fanciest computer" kind of deal. Nothing really monstrous to cover. In the med field, we still have cancer and stuff to cure.

What's left here?
 
Parallel processing is a big deal, dude.
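To make that concrete, here is a minimal Python sketch of the idea; the worker count and the toy workload are arbitrary illustration choices, not anything from the thread. The hard, open part of the field is doing this for problems that don't split into independent chunks so neatly.

```python
# Toy parallelism: the same sum computed serially, then split
# across worker processes and recombined.
from multiprocessing import Pool

def partial_sum(bounds):
    lo, hi = bounds
    return sum(i * i for i in range(lo, hi))

if __name__ == "__main__":
    n = 10_000_000
    # Serial version.
    serial = sum(i * i for i in range(n))

    # Parallel version: split the range into 4 chunks and combine.
    chunks = [(k * n // 4, (k + 1) * n // 4) for k in range(4)]
    with Pool(processes=4) as pool:
        parallel = sum(pool.map(partial_sum, chunks))

    assert serial == parallel
    print(serial)
```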
 
ZoomStreak said:
Parallel processing is a big deal, dude.

Yeah, but that's just for comfort; it's not something that really needs to be worked on, the way things did ten years ago.

Does that make sense?
 
Some people going into the field ten years ago must have said the same thing.

Do you think computer technology will look the way it does now in a hundred years?

The question you should be asking is "what will computers be like in 30 years" followed by "what can I do to make it happen in 15?"

I assure you that you will look back at this question in 20 years and marvel at how you could have thought such a thing.

Now go to work inventing the future and be snappy about it!
 
Antiphon said:
Some people going into the field ten years ago must have said the same thing.

Do you think computer technology will look the way it does now in a hundred years?

The question you should be asking is "what will computers be like in 30 years" followed by "what can I do to make it happen in 15?"

I assure you that you will look back at this question in 20 years and marvel at how you could have thought such a thing.

Now go to work inventing the future and be snappy about it!

I needed that.
Thanks. :smile:
 
For the software part, there's always some new application or improvement to an existing process. Some of this stuff is somewhat obscure and niche-oriented, like encryption, compression, error correction codes, modeling of weather, and so on. Others are more mainstream, like improving the graphics engines used in gaming software, or adding more feature-creep aspects to the various generations of Windows.
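As a flavor of what that compression niche works on, here is a toy run-length encoder and decoder in Python; real codecs are far smarter, and this example is invented purely for illustration.

```python
# Run-length encoding: collapse repeated characters into (char, count)
# pairs. Only useful on data with long runs -- a toy, not a real codec.

def rle_encode(data: str) -> list[tuple[str, int]]:
    out = []
    for ch in data:
        if out and out[-1][0] == ch:
            out[-1] = (ch, out[-1][1] + 1)  # extend the current run
        else:
            out.append((ch, 1))             # start a new run
    return out

def rle_decode(pairs: list[tuple[str, int]]) -> str:
    return "".join(ch * count for ch, count in pairs)

text = "aaaabbbcca"
encoded = rle_encode(text)        # [('a', 4), ('b', 3), ('c', 2), ('a', 1)]
assert rle_decode(encoded) == text
```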

On the hardware side, until they resolve the heat issue caused by voltage versus switch (transistor) size in order to significantly exceed 4 GHz operation, the hardware guys are stuck with increasing the parallelism and branch prediction in CPUs.
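For context on that heat wall, the standard first-order textbook model for CMOS switching power (not something from this thread, but a well-known relation) is

$$P_{\text{dynamic}} \approx \alpha \, C \, V_{dd}^{2} \, f$$

where $\alpha$ is the activity factor, $C$ the switched capacitance, $V_{dd}$ the supply voltage, and $f$ the clock frequency. Raising $f$ raises power linearly, but lowering $V_{dd}$ to compensate slows transistor switching, which is exactly the voltage-versus-switch-size bind described above.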
 
I remember reading that the field of robotics is in need of a unified programming language. Artificial intelligence is also an ongoing task.
 
GreatEscapist said:
I'm going into computer and software engineering.
The problem is, I feel like there isn't anything new left to do. Ten years ago, there was much left to be done. Now it just feels like "Who can make the fanciest computer" kind of deal. Nothing really monstrous to cover. In the med field, we still have cancer and stuff to cure.

What's left here?

What about all the things that, come 2020, you'll have no idea how you ever lived without, even though they don't exist in 2010?

More seriously - there are still huge tasks that need incredible number crunching power and/or better algorithms. Think simulation of weather (already mentioned), climate, fluid dynamics, biochemistry and so on.
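To give a flavor of what those simulations look like at their very simplest, here is a toy explicit finite-difference solver for the 1D heat equation in Python; the grid size, diffusivity, and step counts are arbitrary illustration choices. Real climate and fluid codes are this idea scaled up by many orders of magnitude, which is where the number-crunching demand comes from.

```python
# Explicit finite differences for the 1D heat equation u_t = alpha * u_xx,
# with fixed (Dirichlet) endpoints. A toy cousin of the big simulations.

alpha = 0.01       # diffusivity
dx, dt = 0.1, 0.1  # grid spacing and time step
r = alpha * dt / dx**2   # stability requires r <= 0.5 (here r = 0.1)

n = 50
u = [0.0] * n
u[n // 2] = 100.0  # initial hot spot in the middle

for _ in range(1000):
    # New value at each interior point from its two neighbors.
    u = [u[0]] + [
        u[i] + r * (u[i + 1] - 2 * u[i] + u[i - 1])
        for i in range(1, n - 1)
    ] + [u[-1]]

print(max(u))  # the spike diffuses outward over time
```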
 
I'm by no means an expert at computing, but from what I've seen, better algorithms are a REALLY big deal (see the sketch at the end of this post).

Also, open source. Open source, my friend.
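On the algorithms point above, a minimal sketch of why complexity matters, using a made-up duplicate-detection task; the data and sizes are invented for illustration.

```python
# Two ways to ask "does this list contain a duplicate?" The first is
# O(n^2), the second O(n); on a million items, that is roughly the
# difference between hours and milliseconds in Python.

def has_duplicate_slow(items):       # O(n^2): compare every pair
    return any(
        items[i] == items[j]
        for i in range(len(items))
        for j in range(i + 1, len(items))
    )

def has_duplicate_fast(items):       # O(n): one pass with a set
    seen = set()
    for x in items:
        if x in seen:
            return True
        seen.add(x)
    return False

data = list(range(100_000)) + [42]   # one duplicate hidden at the end
print(has_duplicate_fast(data))      # True, almost instantly
```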
 
  • #10
I'd be willing to pay a vast sum of money to anyone who can develop a reinforcement learning algorithm that lets me track an equity index to within three sigma.
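For what it's worth, the simplest reinforcement-learning building block looks like the sketch below: an epsilon-greedy multi-armed bandit. The payout probabilities are invented, and this is nowhere near an index-tracking model; it is just the scaffolding the technique starts from.

```python
# Epsilon-greedy multi-armed bandit: the "hello world" of reinforcement
# learning. Arm payout probabilities are made up for illustration.
import random

probs = [0.3, 0.5, 0.7]          # hidden payout probability per arm
counts = [0] * len(probs)
values = [0.0] * len(probs)      # running average reward per arm
epsilon = 0.1

for _ in range(10_000):
    if random.random() < epsilon:             # explore a random arm
        arm = random.randrange(len(probs))
    else:                                     # exploit the best estimate
        arm = max(range(len(probs)), key=lambda a: values[a])
    reward = 1.0 if random.random() < probs[arm] else 0.0
    counts[arm] += 1
    values[arm] += (reward - values[arm]) / counts[arm]  # incremental mean

print(values)  # rough estimates of the hidden probabilities
```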
 
  • #11
A program that helps you make programs would be nice, imo. If you could remove the need to learn a programming language, you'd be on to something.
 
  • #12
magpies said:
A program that helps you make programs would be nice, imo. If you could remove the need to learn a programming language, you'd be on to something.

Well, not a new idea. It's been around since the 1970s, in fact, in the idea of "Fourth-Generation Languages". It hasn't really panned out, despite many, many attempts. (I think they've moved on to a 'fifth generation' now, too.)
 
  • #13
I'm currently not smart enough to do that kind of stuff.

:frown:

I can barely use C#... Can't wait for college. Stupid, stupid, stupid high school. >_>
 
  • #14
For the hardware, quantum computing will be the future, but no one knows if it's theoretically possible. Realistically, improving clock speed is getting difficult due to RC delay, clock skew, power, etc. The next logical step I can think of is a chip that works completely asynchronously. I have seen a self-timed divider on some CPUs, but the same idea might be extended to the entire chip.
 
  • #15
What was going on ten years ago except trying to make chips that had more memory and were faster? I don't know what else you can do with a computer except increase capacity and speed. So, maybe that means coming up with another kind of material to use for processors besides silicon based stuff. I just saw a video where they incorporated a rat cortex into a computer, so that the cortex was essentially the processor and it learned how to fly a flight simulator, so live neurons are the future, I guess... but then "what is a computer?" that chip part or the chip and the cortex? I mean, if you invent a brain in the course of making more advanced computers, you've invented something that will go out and think, "gee, a computer sure would make this job easier..." I dunno.
 
  • #16
lostinxlation said:
quantum computing will be the future, but no one knows if it's theoretically possible

If it is not possible, it will not be the future, so you are contradicting yourself.
 