
What is there left to advance in the field of computers?

  1. May 29, 2010 #1
    I'm going into computer and software engineering.
    The problem is, I feel like there isn't anything new left to do. Ten years ago, there was much left to be done. Now it just feels like a "who can make the fanciest computer" kind of deal. Nothing really big left to tackle. In the medical field, we still have cancer and other diseases to cure.

    What's left here?
     
  3. May 29, 2010 #2
    Parallel processing is a big deal, dude.
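    To make that concrete, here is a minimal sketch of the idea in Python's standard concurrent.futures module; the prime-counting workload is made up purely for illustration:

```python
# Minimal illustration of parallel processing: the same CPU-bound work
# split across worker processes instead of run one piece at a time.
from concurrent.futures import ProcessPoolExecutor

def count_primes_below(n):
    # Deliberately naive CPU-bound work: count primes < n by trial division.
    count = 0
    for candidate in range(2, n):
        if all(candidate % d for d in range(2, int(candidate ** 0.5) + 1)):
            count += 1
    return count

if __name__ == "__main__":
    inputs = [10_000, 20_000, 30_000, 40_000]
    # Each input is handled by a separate worker process, in parallel.
    with ProcessPoolExecutor() as pool:
        results = list(pool.map(count_primes_below, inputs))
    print(results)
```

    The point is that the four calls run concurrently on separate cores, which is exactly the kind of speedup that stops being free once clock rates stall.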
     
  4. May 29, 2010 #3
    Yeah, but that's just for comfort; it's not something that really needs to be worked on, the way things did ten years ago.

    Does that make sense?
     
  5. May 29, 2010 #4
    Some people going into the field ten years ago must have said the same thing.

    Do you think computer technology will look the way it does now in a hundred years?

    The question you should be asking is "what will computers be like in 30 years" followed by "what can I do to make it happen in 15?"

    I assure you that you will look back at this question in 20 years and marvel at how you could have thought such a thing.

    Now go to work inventing the future and be snappy about it!
     
  6. May 29, 2010 #5
    I needed that.
    Thanks. :smile:
     
  7. May 30, 2010 #6

    rcgldr

    User Avatar
    Homework Helper

    For the software part, there's always some new application or improvement to an existing process. Some of this stuff is somewhat obscure and niche-oriented, like encryption, compression, error correction codes, modeling of weather, ... . Others are more mainstream, like improving the graphics engines used in gaming software, or adding more feature-creep aspects to the various generations of Windows.
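    As a small taste of the error-correction niche mentioned above, here is a toy Hamming(7,4) code in Python; it encodes 4 data bits into 7 bits and can correct any single flipped bit (the function names are made up for this sketch):

```python
# Toy single-error-correcting Hamming(7,4) code.
# Codeword layout (1-based positions): p1 p2 d1 p3 d2 d3 d4,
# where p1 covers positions 1,3,5,7; p2 covers 2,3,6,7; p3 covers 4,5,6,7.
def hamming74_encode(d):
    # d: list of 4 data bits [d1, d2, d3, d4]
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_decode(c):
    # c: 7-bit codeword; returns the 4 data bits, fixing one bit error.
    c = list(c)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3  # 1-based position of the bad bit, 0 if none
    if syndrome:
        c[syndrome - 1] ^= 1
    return [c[2], c[4], c[5], c[6]]
```

    Real codes used on disks and radio links (Reed-Solomon, LDPC) are far more sophisticated, but the syndrome idea is the same.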

    On the hardware side, until they resolve the heat issue caused by voltage versus switch (transistor) size in order to significantly exceed 4 GHz operation, the hardware guys are stuck with increasing the parallelism and path prediction in CPUs.
     
  8. May 31, 2010 #7

    Borg

    User Avatar
    Science Advisor
    Gold Member
    2017 Award

    I remember reading that the field of robotics is in need of a unified programming language. Artificial intelligence is also an ongoing task.
     
  9. May 31, 2010 #8

    Borek

    User Avatar

    Staff: Mentor

    All those things that in 2020 you will have no idea how you could have lived without back in 2010?

    More seriously - there are still huge tasks that need incredible number crunching power and/or better algorithms. Think simulation of weather (already mentioned), climate, fluid dynamics, biochemistry and so on.
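    A tiny taste of what that number crunching looks like: a naive finite-difference step for the 1-D heat equation in Python. Real climate and fluid codes do this in 3-D over millions of cells, which is why the demand for compute never runs out (the function here is a made-up minimal sketch):

```python
# Explicit finite-difference integration of the 1-D heat equation
# u_t = alpha * u_xx on a rod with fixed (held) endpoints.
def diffuse(u, alpha, steps):
    # u: list of temperatures at grid points; endpoints stay fixed.
    # alpha: dt * k / dx^2; must be < 0.5 for numerical stability.
    u = list(u)
    for _ in range(steps):
        new = u[:]
        for i in range(1, len(u) - 1):
            # Each point relaxes toward the average of its neighbors.
            new[i] = u[i] + alpha * (u[i - 1] - 2 * u[i] + u[i + 1])
        u = new
    return u
```

    Every refinement of the grid multiplies the work, so both faster hardware and better algorithms (implicit solvers, multigrid) stay in demand.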
     
  10. May 31, 2010 #9
    I'm by no means an expert at computing, but from what I've seen better algorithms is a REALLY big deal.
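    A minimal illustration of why algorithms matter, with two Python functions that answer the same question ("do any two list entries sum to a target?") at very different cost; the names are made up for this sketch:

```python
# Same problem, two algorithms: O(n^2) pair checking vs. O(n) hashing.
def has_pair_quadratic(nums, target):
    # Checks every pair: roughly n^2 / 2 comparisons.
    for i in range(len(nums)):
        for j in range(i + 1, len(nums)):
            if nums[i] + nums[j] == target:
                return True
    return False

def has_pair_linear(nums, target):
    # One pass with a hash set: roughly n lookups.
    seen = set()
    for x in nums:
        if target - x in seen:
            return True
        seen.add(x)
    return False
```

    At a million entries the first needs on the order of half a trillion comparisons and the second about a million; no amount of faster silicon closes that gap.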

    Also, open source. Open source, my friend.
     
  11. May 31, 2010 #10
    I'd be willing to pay a vast sum of money to anyone who can develop a reinforcement learning algorithm to allow me to track an equity index to within three sigma.
     
  12. May 31, 2010 #11
    A program that helps you make programs would be nice, imo. If you could remove the need to learn a programming language, you'd be on to something.
     
  13. May 31, 2010 #12

    alxm

    User Avatar
    Science Advisor

    Well, it's not a new idea. It's been around since the 1970s, in fact, in the form of "Fourth-Generation Languages". It hasn't really panned out, despite many, many attempts. (I think they've moved on to a 'fifth generation' now, too.)
     
  14. May 31, 2010 #13
    Re: What is there left to advance in the field of computers?

    I'm currently not smart enough to do that kind of stuff.

    :frown:

    I can barely use C#... Can't wait for college. Stupid, stupid, stupid high school. >_>
     
  15. Jun 2, 2010 #14
    For the hardware, quantum computing will be the future, but no one knows if it's theoretically possible. Realistically, improving clock speed is getting difficult due to RC delay, clock skew, power, etc. The next logical step I can think of is a chip that works completely asynchronously. I have seen a self-timed divider on some CPUs, and the same idea might be extended to an entire chip.
     
    Last edited: Jun 2, 2010
  16. Jun 9, 2010 #15
    What was going on ten years ago except trying to make chips that had more memory and were faster? I don't know what else you can do with a computer except increase capacity and speed. So maybe that means coming up with another kind of material to use for processors besides silicon-based stuff.

    I just saw a video where they incorporated a rat cortex into a computer, so that the cortex was essentially the processor, and it learned how to fly a flight simulator. So live neurons are the future, I guess... but then, "what is a computer?" The chip part, or the chip plus the cortex? I mean, if you invent a brain in the course of making more advanced computers, you've invented something that will go out and think, "gee, a computer sure would make this job easier..." I dunno.
     
  17. Jun 10, 2010 #16

    Borek

    User Avatar

    Staff: Mentor

    If it is not possible, it will not be the future, so you are contradicting yourself.
     