What is there left to advance in the field of computing?

  • Thread starter: GreatEscapist
  • Tags: Computer Field

SUMMARY

The discussion centers on the perceived stagnation in computer and software engineering advancements, with participants highlighting the importance of parallel processing, artificial intelligence, and the need for better algorithms. Key areas identified for future exploration include the development of unified programming languages for robotics, improvements in graphics engines, and tackling challenges in quantum computing. The conversation emphasizes the necessity for innovation in algorithms and hardware to address complex problems such as climate modeling and biochemistry simulations.

PREREQUISITES
  • Understanding of parallel processing techniques
  • Familiarity with artificial intelligence concepts
  • Knowledge of quantum computing fundamentals
  • Basic programming skills, particularly in languages like C#
NEXT STEPS
  • Research advancements in quantum computing technologies
  • Explore the development of unified programming languages for robotics
  • Investigate improved algorithms for complex simulations, such as climate modeling
  • Learn about the latest trends in graphics engine development for gaming
USEFUL FOR

Computer science students, software engineers, hardware developers, and anyone interested in the future of technology and innovation in computing.

GreatEscapist
I'm going into computer and software engineering.
The problem is, I feel like there isn't anything new left to do. Ten years ago, there was much left to be done. Now it just feels like a "who can make the fanciest computer" kind of deal. Nothing really monstrous left to cover. In the med field, we still have cancer and stuff to cure.

What's left here?
 
Parallel processing is a big deal, dude.
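For a taste of what that looks like in code, here's a minimal Python sketch (a toy example, not anything from a real codebase; the function and numbers are made up for illustration) that splits a CPU-bound job across cores:

    import math
    from multiprocessing import Pool

    def count_primes(bounds):
        """Count primes in [lo, hi) by trial division -- deliberately CPU-bound."""
        lo, hi = bounds
        return sum(
            all(n % d for d in range(2, math.isqrt(n) + 1))
            for n in range(max(lo, 2), hi)
        )

    if __name__ == "__main__":
        # Split one big range into four chunks and farm them out to four cores.
        chunks = [(i * 50_000, (i + 1) * 50_000) for i in range(4)]
        with Pool(processes=4) as pool:
            print(sum(pool.map(count_primes, chunks)))

The hard part isn't code like this; it's that most interesting programs don't decompose into independent chunks this neatly.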
 
ZoomStreak said:
Parallel processing is a big deal, dude.

Yeah, but that's just for comfort; it's not something that really needs to be worked on, the way things did ten years ago.

Does that make sense?
 
Some people going into the field ten years ago must have said the same thing.

Do you think computer technology will look the way it does now in a hundred years?

The question you should be asking is "what will computers be like in 30 years" followed by "what can I do to make it happen in 15?"

I assure you that you will look back at this question in 20 years and marvel at how you could have thought such a thing.

Now go to work inventing the future and be snappy about it!
 
Antiphon said:
Some people going into the field ten years ago must have said the same thing.

Do you think computer technology will look the way it does now in a hundred years?

The question you should be asking is "what will computers be like in 30 years" followed by "what can I do to make it happen in 15?"

I assure you that you will look back at this question in 20 years and marvel at how you could have thought such a thing.

Now go to work inventing the future and be snappy about it!

I needed that.
Thanks. :smile:
 
For the software part, there's always some new application or improvement to an existing process. Some of this stuff is somewhat obscure and niche-oriented, like encryption, compression, error correction codes, modeling of weather, ... . Others are more mainstream, like improving the graphics engines used in gaming software, or adding more feature-creep aspects to the various generations of Windows.

On the hardware side, until they resolve the heat issue caused by voltage versus switch (transistor) size in order to significantly exceed 4 GHz operation, the hardware guys are stuck with increasing the parallelism and path prediction in CPUs.
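And there's a hard ceiling on what that parallelism buys you, usually framed as Amdahl's law: if a fraction p of a program can run in parallel, the speedup on n cores is at most 1 / ((1 - p) + p/n). A quick back-of-the-envelope sketch in Python (the 95% figure is just an illustrative assumption):

    def amdahl_speedup(p, n):
        """Amdahl's law: best-case speedup with parallel fraction p on n cores."""
        return 1.0 / ((1.0 - p) + p / n)

    # Even with 95% of the work parallelized, piling on cores hits a wall:
    for cores in (2, 8, 64, 1024):
        print(f"{cores:5d} cores -> {amdahl_speedup(0.95, cores):5.2f}x")

With 95% of the work parallel, 1024 cores still give you less than a 20x speedup; the serial 5% dominates.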
 
I remember reading that the field of robotics is in need of a unified programming language. Artificial intelligence is also an ongoing task.
 
GreatEscapist said:
I'm going into computer and software engineering.
The problem is, I feel like there isn't anything new left to do. Ten years ago, there was much left to be done. Now it just feels like "Who can make the fanciest computer" kind of deal. Nothing really monstrous to cover. In the med field, we still have cancer and stuff to cure.

What's left here?

Think of all the things that in 2020 you will wonder how you ever lived without, none of which exist in 2010.

More seriously: there are still huge tasks that need incredible number-crunching power and/or better algorithms. Think simulation of weather (already mentioned), climate, fluid dynamics, biochemistry, and so on.
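For a toy flavor of that kind of number crunching, here is a minimal explicit finite-difference solver for 1-D heat diffusion in Python. Every number in it is an illustrative assumption, and real climate or fluid codes are vastly more sophisticated:

    # Toy 1-D heat equation u_t = alpha * u_xx, explicit finite differences.
    alpha, dx, dt = 0.01, 0.1, 0.1    # illustrative values only
    r = alpha * dt / dx ** 2          # stability needs r <= 0.5; here r = 0.1
    u = [0.0] * 101
    u[50] = 100.0                     # initial hot spot in the middle of a rod

    for _ in range(1000):             # march forward in time
        new = u[:]                    # endpoints stay fixed (cold boundaries)
        for i in range(1, 100):
            new[i] = u[i] + r * (u[i - 1] - 2 * u[i] + u[i + 1])
        u = new

    print(f"peak temperature after diffusion: {max(u):.2f}")

Now scale that from 101 grid points in one dimension to billions of cells in three dimensions, coupled across physics, and you see why both hardware and algorithms still matter.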
 
I'm by no means an expert at computing, but from what I've seen, better algorithms are a REALLY big deal (small illustration below).

Also, open source. Open source, my friend.
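To make the algorithms point concrete, here's a standard textbook example (not something from this thread): checking whether any two numbers in a list sum to a target, once the slow way and once the fast way, in Python:

    def has_pair_quadratic(nums, target):
        """O(n^2): try every pair."""
        return any(nums[i] + nums[j] == target
                   for i in range(len(nums))
                   for j in range(i + 1, len(nums)))

    def has_pair_linear(nums, target):
        """O(n): one pass, remembering values seen so far in a hash set."""
        seen = set()
        for x in nums:
            if target - x in seen:
                return True
            seen.add(x)
        return False

    nums = list(range(2_000))
    # Same answer either way, but one does ~n^2 comparisons and one does ~n.
    assert has_pair_quadratic(nums, 3_997)
    assert has_pair_linear(nums, 3_997)

Same problem, same answer, but one approach does millions of comparisons where the other does a few thousand; that gap only widens as inputs grow.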
 
I'd be willing to pay a vast sum of money to anyone who can develop a reinforcement learning algorithm to allow me to track an equity index to within three sigma.
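For the curious, the bare skeleton of that kind of method looks something like the tabular Q-learning sketch below, here on a made-up five-state toy problem. It is nowhere near a system that could track an index, and every constant in it is an assumption for illustration:

    import random

    # Bare tabular Q-learning on a made-up 5-state corridor: start in the
    # middle, reward +1 for reaching the right end. Purely illustrative.
    STATES, ACTIONS = 5, (-1, +1)            # actions: step left or step right
    alpha, gamma, epsilon = 0.1, 0.9, 0.1    # learning rate, discount, exploration
    Q = {(s, a): 0.0 for s in range(STATES) for a in ACTIONS}

    for _ in range(2000):                    # training episodes
        s = 2
        while s != 4:
            if random.random() < epsilon:    # explore occasionally...
                a = random.choice(ACTIONS)
            else:                            # ...otherwise act greedily
                a = max(ACTIONS, key=lambda act: Q[(s, act)])
            s2 = min(max(s + a, 0), STATES - 1)
            reward = 1.0 if s2 == 4 else 0.0
            target = reward + gamma * max(Q[(s2, b)] for b in ACTIONS)
            Q[(s, a)] += alpha * (target - Q[(s, a)])
            s = s2

    # The learned greedy policy should now step right from every state.
    print([max(ACTIONS, key=lambda act: Q[(s, act)]) for s in range(4)])

The gap between this skeleton and a profitable trading system is, of course, exactly where the vast sum of money comes in.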
 
A program that helps you make programs would be nice, imo. If you can remove the need for having to learn a programming language, you'd be on to something.
 
magpies said:
A program that helps you make programs would be nice, imo. If you can remove the need for having to learn a programming language, you'd be on to something.

Well, not a new idea. It's been around since the 1970s, in fact, in the form of "Fourth-Generation Languages". It hasn't really panned out, despite many, many attempts. (I think they've moved on to a 'fifth generation' now, too.)
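The kernel of the idea does live on as ordinary code generation, though. A deliberately trivial, made-up Python illustration of a "program that writes a program":

    # A toy "program that writes a program": generate the source for a small
    # class from a plain field list, then execute it. A made-up illustration
    # of the code-generation idea, not any real 4GL tool.
    fields = ["name", "age", "email"]

    source = "class Record:\n"
    source += "    def __init__(self, " + ", ".join(fields) + "):\n"
    for f in fields:
        source += f"        self.{f} = {f}\n"

    namespace = {}
    exec(source, namespace)                      # run the generated class body
    r = namespace["Record"]("Ada", 36, "ada@example.com")
    print(r.name, r.age, r.email)

Generating code from a description is easy; the part that never panned out is describing what you want without, in effect, learning a programming language anyway.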
 
I'm currently not smart enough to do that kind of stuff.

:frown:

I can barely use C#... Can't wait for college. Stupid, stupid, stupid high school. >_>
 
For the hardware, quantum computers will be the future, but no one knows if they're theoretically possible. Realistically, improvement in clock speed is getting difficult due to RC delay, clock skew, power, etc. The next logical step I can think of is a chip that works completely asynchronously. I have seen a self-timed divider on some CPUs, but the same idea might be extended to the entire chip.
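To make "no global clock" concrete, here is a rough software analogy in Python (just an analogy, not real hardware): two pipeline stages coordinating through a request/acknowledge-style handshake instead of a shared clock.

    import threading, queue

    # Rough software analogy of a self-timed pipeline: each stage starts work
    # as soon as data arrives, and the one-slot channel acts like a
    # request/acknowledge handshake -- no shared clock paces the stages.
    channel = queue.Queue(maxsize=1)      # one-slot channel between stages

    def stage1(values):
        for v in values:
            channel.put(v * v)            # blocks while the slot is still full
        channel.put(None)                 # done marker

    def stage2():
        while (v := channel.get()) is not None:
            print("stage2 received", v)

    t1 = threading.Thread(target=stage1, args=([1, 2, 3, 4],))
    t2 = threading.Thread(target=stage2)
    t1.start(); t2.start()
    t1.join(); t2.join()

Each stage runs exactly as fast as its data allows, which is the appeal of self-timed logic: no cycle is wasted waiting for the slowest path on the chip.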
 
What was going on ten years ago except trying to make chips that had more memory and were faster? I don't know what else you can do with a computer except increase capacity and speed. So maybe that means coming up with another kind of material to use for processors besides silicon-based stuff. I just saw a video where they incorporated a rat cortex into a computer, so that the cortex was essentially the processor, and it learned how to fly a flight simulator. So live neurons are the future, I guess... but then "what is a computer?" The chip part, or the chip and the cortex? I mean, if you invent a brain in the course of making more advanced computers, you've invented something that will go out and think, "Gee, a computer sure would make this job easier..." I dunno.
 
lostinxlation said:
quantum computers will be the future, but no one knows if they're theoretically possible

If they are not possible, they will not be the future, so you are contradicting yourself.
 
