
Is majoring in computer engineering stupid?

  1. Apr 9, 2013 #1
    We're hitting the silicon barrier so we might jump to a new type of computing, right?

    So would my knowledge all go to waste if we move to optical computing for example?
     
  3. Apr 9, 2013 #2
    I don't think it's ever a waste, necessarily. There may still be some things to discover in silicon-based computers, but I do agree that optical computing may be more lucrative, or maybe developing quantum computers would be a good way to go if you have the dedication.
     
  4. Apr 9, 2013 #3

    AlephZero

    Science Advisor
    Homework Helper

    Whatever you study, some of your knowledge will get out of date or become irrelevant if you work for, say, 30 years before you retire. So don't worry about it; everybody else is in the same situation.

    Of course if you have superpowers and you can really predict the future, that would be different :smile:
     
  5. Apr 9, 2013 #4

    SteamKing

    Staff Emeritus
    Science Advisor
    Homework Helper

    It would only be stupid if you planned to work in planar IC design exclusively. I think computer engineering involves more than that. While newer technologies might appear to be on the horizon, who is to say whether they will ever be mature enough to replace silicon?
     
  6. Apr 9, 2013 #5
    Planar IC design isn't going anywhere for the foreseeable future. Most 3D technologies are looking to connect planar ICs with through silicon vias or multi-wafer bonding. The skills will still be used.

    Besides, computer engineering is more abstract than silicon engineering. The concepts are valid whether we are using silicon, light, bacteria, or an abacus as our computation medium.

    Majoring in computer engineering isn't dumb at all. If you look at an introductory computer textbook from the 1960s, you'll see that it is still mostly relevant today (some things are obsolete) even though it is 50 years old. The basics don't change much... the details do.
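    Just to make that concrete, here's a rough Python sketch (the names full_adder and ripple_add are mine, not from any textbook): a full adder is pure Boolean logic, and that logic doesn't care whether the gates are CMOS, optics, or marbles rolling down ramps.

    Code (Python):
        # Sketch of a 1-bit full adder described purely as Boolean logic.
        # The same truth table holds no matter what physically implements the gates.
        def full_adder(a, b, carry_in):
            s = a ^ b ^ carry_in                        # sum bit
            carry_out = (a & b) | (carry_in & (a ^ b))  # carry bit
            return s, carry_out

        # Chain the adders bit by bit, ripple-carry style.
        def ripple_add(x, y, width=8):
            carry, result = 0, 0
            for i in range(width):
                bit, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
                result |= bit << i
            return result

        assert ripple_add(37, 58) == 95   # same answer an abacus gives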
     
  7. Apr 9, 2013 #6
    Math knowledge doesn't become out of date.
     
  8. Apr 9, 2013 #7
    Sure it does. At the very least it can become irrelevant.

    80 years ago a lot of engineering and math training involved hand calculating approximate solutions to differential equations. Not a very useful skill these days.

    If you were an expert in that during the '60s and '70s, when computers started taking over that role (and enabled much more accurate iterative solutions to diff eqs), you would have had to update your skills; otherwise *you* would have become out of date.

    Math is such a big field it is impossible to know all of it. The right tool for the job changes over time because the pressing problems change.
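    To illustrate (just a toy Python sketch, not anything from a particular course): the step-by-step approximation people once tabulated by hand is now a few lines of code.

    Code (Python):
        # Toy example: forward Euler for dy/dt = -2*y with y(0) = 1,
        # i.e. the kind of hand tabulation mentioned above, done by machine.
        import math

        def euler(f, y0, t_end, steps):
            t, y, h = 0.0, y0, t_end / steps
            for _ in range(steps):
                y += h * f(t, y)   # one small "hand-calculation" step
                t += h
            return y

        approx = euler(lambda t, y: -2.0 * y, 1.0, 1.0, 1000)
        exact = math.exp(-2.0)     # the analytic answer
        print(approx, exact)       # even this crude method lands close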
     
  9. Apr 9, 2013 #8
    Thank you all.
     
  10. Apr 9, 2013 #9

    phyzguy

    Science Advisor

    Right, I still use my slide rule every day (NOT!)
     
  11. Apr 9, 2013 #10

    AlephZero

    Science Advisor
    Homework Helper

    Some of the math methods that I use every day at work hadn't even been invented when I was at university.

    Hmm... computer science in the 1960s is going back to before classics like "Dijkstra, Hoare, and Dahl" (which shouldn't need the book title, any more than "K&R" does). But the principle of what you are saying is right, even if the timeline is a bit off.
     
  12. Apr 9, 2013 #11
    I was referring to computer architecture rather than software (based on the OP's question). Most of the earth-shattering developments in hardware design, for example the IAS machine (the von Neumann architecture), the Harvard architecture, Stretch (whose instruction pipelining and lookahead anticipated later supercomputers), superscalar-style execution (the CDC 6600), and the IBM 360, had been built by the mid-60s. With the exception of RISC (and a few others, of course), I think most of the big paradigm shifts in computer hardware had happened by 1970.

    For software I agree with you. Interesting stuff, at any rate.
     
  13. Apr 9, 2013 #12

    Vanadium 50

    Staff Emeritus
    Science Advisor
    Education Advisor

    And RISC was 1975.
     
  14. Apr 9, 2013 #13
    Never a waste. You may not use everything you learned in school when you get into the workforce. Hell, you may not remember much from school either. I think the whole point of a major is to get some exposure and to demonstrate to a future employer that you're capable of learning.

    Also, companies are always looking for people with experience in the 'old'. I know some still rely on old machines and old languages, and the freshly trained college graduate generally doesn't have exposure to that mess.

    Now, I've always felt that an EE major is a better route than a CS major, but that doesn't mean a CS major is useless.
     
  15. Apr 10, 2013 #14
    Very few computer engineers need to know the specific details of the underlying process that implements the circuit elements. For the most part, designing an optical computer would be very similar to designing a silicon one, until you got to the point of putting actual circuits together.

    Now if we move to quantum computing, *that* would be more of a paradigm shift in how to design computers.

    I wouldn't worry about the entire field becoming obsolete, although you certainly have to work to keep your skills up to date.
     
  16. Apr 11, 2013 #15
    Using a calculator, slide rule, computer or paper and pencil shouldn't affect your knowledge of math.

    Can you give an example?
     
  17. Apr 11, 2013 #16

    AlephZero

    Science Advisor
    Homework Helper

  18. Apr 11, 2013 #17
    He's talking about knowing math, not about particular techniques for cranking out answers. Differential equations may have been solved by hand in the past, and that has changed, but knowledge of differential equations is still very useful today. Similarly with logarithms: we used to have to find values of logarithms by hand, back when calculators weren't around, but logarithms have been and will continue to be important.

    Math involves absolute truths; the truth can't be outdated.
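    To put that distinction concretely (a toy Python sketch, names my own): the by-hand technique for getting a value of a logarithm is obsolete, but the logarithm itself is as useful as ever.

    Code (Python):
        import math

        # The laborious way: ln(x) = 2 * sum of z**k / k over odd k, z = (x-1)/(x+1),
        # roughly what table makers once ground out term by term.
        def ln_by_series(x, terms=50):
            z = (x - 1.0) / (x + 1.0)
            return 2.0 * sum(z**k / k for k in range(1, 2 * terms, 2))

        print(ln_by_series(2.0))   # ~0.6931, the hard way
        print(math.log(2.0))       # ~0.6931, the modern one-liner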
     
  19. Apr 11, 2013 #18
    And when new truths are discovered, do people just know them without going to school?

    Just because we might switch to optical computing doesn't mean that all my knowledge of silicon suddenly becomes wrong. It just means that I would have to go to school to learn about optical computing, just like a mathematician would have to go to school to learn new math.
     
  20. Apr 11, 2013 #19
    But the axioms that those "truths" rest on can be changed or dropped altogether.
     
  21. Apr 11, 2013 #20
    Spare me the "absolute truth" stuff. The OP was asking whether a degree in computer engineering would still be relevant if there are significant technological changes in the future. An engineering degree can only teach you so much, and there really isn't time to teach EVERYTHING. Of course differential equations are still useful, but the many, many hours spent in the 1980s learning tricks to solve them by hand are now obsolete. It would have been better to spend that time learning more fundamentals, but what is fundamental and what is a time-saving trick isn't always clear except in hindsight.

    All in all, the vast majority of an engineering degree from a few decades ago is still germane. Some of it, though, is obsolete.
     