Computer science major = waste of youth?

  • #1
I'm a computer science major, and I feel like I'm wasting my youth learning things that will soon no longer be useful. IT is evolving at such an explosive rate that what I learn today may change tomorrow. For example, the textbooks we use right now would be of no use to us in a decade, but engineering majors (and all other majors) may use their textbooks decades from now. Heck, to save money, one could actually use engineering books from the 1960s.
Is there merit in this way of thinking? I feel depressed that I'm wasting my youth accruing skills and knowledge that will soon no longer be used, whether professionally, economically, personally, or otherwise.
I know that we should be learning constantly during our professional lives, but unlike engineers, whose post-graduation learning deepens their understanding of the physical world in the process, learning computer science after graduation means keeping up with arbitrary changes in arbitrary things: man-made programming languages.

Is this way of thinking a sign that computer science is not the right major for me? Would engineering be a better fit?
 

Answers and Replies

  • #2
I understand the pressure of a rapidly evolving field; however, I think your perspective is off. First off, it's not progressing as fast as you make it seem. Computer science textbooks from 2005 would still be very relevant and useful. Even the "new kid on the block," the Ruby language, was around then.

For the most part, undergrad work is designed to teach you how to think rather than what to think. Once you build the foundations, the various related technologies are more easily grasped and build on each other. Take programming languages: yes, there are dozens of relevant languages and more coming out each year, but a person with a background in a primary language like C, Java, or Python will have little trouble picking up a different one, because they all do the same things, just in different ways. I had a friend in college with decent C experience who literally learned PHP over a weekend.

If you are not excited about learning new technologies, then perhaps computer science is not for you, but I wouldn't panic that the field is progressing at a rate that makes it impossible for you to succeed. If you enjoy it, do it and stop worrying.
 
  • #3
jtbell
Mentor
Specific languages and technologies come and go. Fundamental stuff like algorithms, data structures, and good software engineering practice sticks around and evolves much more slowly.
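As an illustration of that point (my example, not from the thread): binary search has been in textbooks for decades, and the version below compiles unchanged on any modern C++ toolchain. The function name is mine.

```cpp
#include <vector>
#include <cassert>

// Classic binary search: returns the index of `target` in the sorted
// vector `v`, or -1 if it is absent. The algorithm long predates every
// language mentioned in this thread and looks essentially the same in
// all of them.
int binary_search_index(const std::vector<int>& v, int target) {
    int lo = 0, hi = static_cast<int>(v.size()) - 1;
    while (lo <= hi) {
        int mid = lo + (hi - lo) / 2;  // avoids overflow of (lo + hi)
        if (v[mid] == target) return mid;
        if (v[mid] < target) lo = mid + 1;
        else hi = mid - 1;
    }
    return -1;
}
```

The language syntax is incidental; what a course teaches here is the invariant (the target, if present, always lies in `[lo, hi]`), and that idea transfers to any language you pick up later.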
 
  • #4
Vanadium 50
Staff Emeritus
Science Advisor
Education Advisor
jtbell is absolutely correct. However, there are colleges whose CS programs don't teach these fundamentals; instead, the curriculum is a laundry list of languages, sprinkled with whatever is trendy ("Let's write an app!"). If you are in one of these programs, you might think about finding a better one.
 
  • #5
jtbell
Mentor
And some programs probably front-load the trendy stuff at the beginning in order to attract students, then shift to the serious theory in upper-level classes. 10-15 years ago, many CS programs (including the AP CS courses) switched from C++ to Java for the introductory programming courses, because Java was seen as more "relevant" and marketable. Now it's robots and mobile apps. So if you're still near the beginning of your CS major, look at the upper-division courses you'll be taking, and see whether they're more of the same, or become more serious.
 
  • #6
WWGD
Science Advisor
Gold Member
Isn't CS something you can learn on your own, for the most part? My CS friends would chuckle every time I brought up getting a CS degree for any reason other than just "having that piece of paper." A PC or Mac, a compiler, maybe one of those Dummies books or other such books. What else do you need?

EDIT: I mean that you could get a degree in something else --maybe engineering-- where you will use programming, and then get some sort of accreditation as a programmer to convince an employer that you know how to program. Just a thought.
 
  • #7
Isn't CS something you can learn on your own for the most part?
I don't understand this. There are plenty of extremely advanced topics in CS just like any other science.

For the most part anyone can start working through a textbook in any subject. Schools are about discipline, connections and insight/support from staff.
 
  • #8
WWGD
Science Advisor
Gold Member
I don't understand this. There are plenty of extremely advanced topics in CS just like any other science.

For the most part anyone can start working through a textbook in any subject. Schools are about discipline, connections and insight/support from staff.
EDIT: True, but CS gives you instant feedback: your program runs or it doesn't, and the error messages tell you (albeit in a way that is not always easy to understand) what is wrong with your code. That is not the case in many other areas. There are also large support communities in CS, though other areas have them now too, e.g., PF itself and the Stack Exchange sites.
 
  • #9
18,363
8,215
True, but CS gives you instant feedback: your program runs or it doesn't, and you are told (albeit in a way that is not always easy to understand) what is wrong with your code. That is not the case in many other areas.
Computer science is a lot more than just crunching code, though. However, it is probably true that CS is a more accessible field than most in terms of self-learning.
 
  • #10
WWGD
Science Advisor
Gold Member
Computer Science is a lot more than just crunching code though. However it is probably true that CS is a more accessible field than most in terms of self learning.
You're right, I guess I was thinking more about programming itself than about CS in general.
 
  • #11
donpacino
Gold Member
EDIT: True, but CS gives you instant feedback: your program runs or it doesn't, and the error messages tell you (albeit in a way that is not always easy to understand) what is wrong with your code. That is not the case in many other areas. There are also large support communities in CS, though other areas have them now too, e.g., PF itself and the Stack Exchange sites.
It is not hard to write an algorithm.
It is hard to look at your end product, see that it is not working correctly, and determine why.

Also, when you get a job, you will most likely not be able to post your problem online for the world to view.
 
  • #12
analogdesign
Science Advisor
Computer Science is a lot more than just crunching code though. However it is probably true that CS is a more accessible field than most in terms of self learning.
For better or for worse, professional programming has changed a lot in the last 10-15 years. The days of people writing their own function libraries, database wrappers, threading code, and so on are over for 95% of programming out there. Now the name of the game is to play connect-the-dots between this framework and that API.

There are still people who try to actually understand what they are doing. Sadly, by the time these people figure out what the Qt bindings actually mean, their competitors who cut and pasted everything from Google and asked a few questions on Stack Exchange are already shipping their code (and reaping the rewards).

This does have long-term consequences, and it speaks to the general lack of quality in code today, but it is a reality in a lot of places.

We have a thread on here now and then about the youth fetish prevalent in programming circles. I think this is one place it comes from. Learning Python or Java is pretty easy if you understand programming. What makes you useful isn't the language; it's the frameworks and APIs you know.
 
  • #13
I'm a computer science major, and I feel like I'm wasting my youth learning things that will soon no longer be useful. IT is evolving at such an explosive rate that what I learn today may change tomorrow. For example, the textbooks we use right now would be of no use to us in a decade, but engineering majors (and all other majors) may use their textbooks decades from now. Heck, to save money, one could actually use engineering books from the 1960s.
Is there merit in this way of thinking? I feel depressed that I'm wasting my youth accruing skills and knowledge that will soon no longer be used, whether professionally, economically, personally, or otherwise.
I know that we should be learning constantly during our professional lives, but unlike engineers, whose post-graduation learning deepens their understanding of the physical world in the process, learning computer science after graduation means keeping up with arbitrary changes in arbitrary things: man-made programming languages.

Is this way of thinking a sign that computer science is not the right major for me? Would engineering be a better fit?
Before I go on: I know that in some schools the computer science department is part of the engineering school, while in others it is part of the arts and sciences school. Is there a difference between these? Engineering is a professional-degree school, so I imagine that studying computer science there is more likely to keep you in touch with industry.

On the other hand, most of my friends studied in arts and sciences schools. In addition to studying hard during school, they did internships, then graduated and got jobs. That is what you should do. Just because your school is teaching you X does not mean you cannot learn Y. Depending on the company, they won't care if you don't know Y in the beginning, but they will work on finding out whether you can learn. Are you self-motivated and independent, and can you also be a team player?

If you do not enjoy your comp sci courses, and if you are not interested in teaching yourself things to prepare for industry, then you should probably not major in computer science.
 
  • #14
WWGD
Science Advisor
Gold Member
Sorry, I don't mean to belittle either programming or programmers, especially considering that my best moment in programming came when I sweated for around an hour just to get a "Hello World". It just seems, as Greg said, to have somewhat lower barriers to entry. But yes, unfortunately there are always hacks, as in any other area.
 
  • #15
A lot of topics in computer science don't seem that easy to learn on your own. I can teach myself most math subjects, unless it's something really obscure, advanced, and not well documented, like, say, gauge theory in topology (it's documented, but the literature is so intimidating and extensive that you don't know how to prioritize it; what you really need is someone who works in the field to give you a high-level overview). But subjects like compiler design or operating systems seem hard to teach yourself. I gave up on teaching myself those things, not because they are too hard, but because I realized they aren't a high priority for me, so the difficulty is above the effort I'm willing to put in at the moment, given my time constraints. Also, just learning C++ to write some simple programs is easy enough, but the software-engineering principles of how not to write terrible code are not so easy to learn.

I think the whole idea of things getting outdated is overstated, unless it's pretty specific stuff that you're spending a lot of time on. I learned C++ in my programming classes ages ago, around 2000-2001, and it's very, very similar to how it is today. I'm only aware of a couple of differences, like the range-based for loop in the latest versions. But that's hardly an earth-shattering development. Maybe the more detailed stuff like the Standard Template Library has changed more, but I think the fundamentals are pretty much the same.

Other languages might be less established, so maybe they change more, but I don't think you're going to have a really hard time learning the latest version of Java if you knew the Java of 10 years ago. There might be a few things to get used to, but nothing too major, unless it's some obscure library that's just not used anymore. You might have to learn some stuff that's going to be outdated, but keep in mind that it may be a means to an end: getting an idea of how to program computers to solve problems. You may use some soon-to-be-outdated thing to do the task you're given, but the point is not those little details; it's learning how to solve the problem.
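The "for each loop" referred to above is C++11's range-based for. As a small illustration (the function names are mine, not from the thread), the new syntax is a convenience over the classic indexed loop, not a change to the fundamentals:

```cpp
#include <vector>
#include <cassert>

// Pre-C++11 style: an explicit index over the container.
int sum_classic(const std::vector<int>& v) {
    int total = 0;
    for (std::size_t i = 0; i < v.size(); ++i) total += v[i];
    return total;
}

// C++11 range-based for: same semantics, less ceremony.
int sum_range_for(const std::vector<int>& v) {
    int total = 0;
    for (int x : v) total += x;
    return total;
}
```

Someone who learned the first form in 2001 reads the second at a glance, which is the poster's point about the language's stability.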
 
  • #16
Even some engineering fields are rapidly evolving and thus require frequent textbook changes. I'm thinking particularly of parts of communications engineering, given how much the internet is invading our lives nowadays.
 
  • #17
WWGD
Science Advisor
Gold Member
Actually, even nostalgia is changing nowadays --nostalgia is not what it used to be.
 
  • #18
Even some engineering fields are rapidly-evolving and thus require textbook changes often. I'm thinking particularly of parts of communications engineering, seeing as how much the internet is invading our lives nowadays.
Wouldn't "communications engineering" be a subfield of software development, and hence computer science?
 
  • #19
wouldn't "communications engineering" be a subfield of software development, and hence computer science?
It's much more of an electrical engineering discipline than computer science. Not all of it is software development.
 
  • #20
I spent much of my career as a young adult figuring out how analog microwave radio systems worked. I made them perform at levels better than specification. Today, that analog microwave technology is gone. Few, if any, such installations are still in service. Do I regret it? Nope. I learned new stuff. And if you aren't always learning new stuff, you've missed one of the primary lessons from college: You need to learn how to teach yourself, absorb new abstract ideas, and apply them.

You ask if you're learning something that will soon be obsolete. Maybe you are. So what? Learn what you need to educate yourself. You are not some automaton, programmed in college, to do the same damned thing over and over in the real world. If you think you'll do better by studying Engineering, note that the engineering I do today is very different from the engineering I did 20 years ago. There has been a lot of study since then. Unless you think flipping burgers is a career, get used to the idea that you'll always have to be learning new things.
 
  • #21
It is not hard to write an algorithm.
It is hard to look at your end product, seeing that it is not working correctly and determining why.

Also when you get a job you will most likely not be able to post your problem online for the world to view
LOL :D
 
  • #22
Vanadium 50
Staff Emeritus
Science Advisor
Education Advisor
...get used to the idea that you'll always have to be learning new things.
One common theme of messages in the AG section is "I learned something (or will learn something) that I might not absolutely have to know!" I don't understand this attitude, and it certainly won't serve you well in the real world.
 
  • #23
WWGD
Science Advisor
Gold Member
I spent much of my career as a young adult figuring out how analog microwave radio systems worked. I made them perform at levels better than specification. Today, that analog microwave technology is gone. Few, if any, such installations are still in service. Do I regret it? Nope. I learned new stuff. And if you aren't always learning new stuff, you've missed one of the primary lessons from college: You need to learn how to teach yourself, absorb new abstract ideas, and apply them.

You ask if you're learning something that will soon be obsolete. Maybe you are. So what? Learn what you need to educate yourself. You are not some automaton, programmed in college, to do the same damned thing over and over in the real world. If you think you'll do better by studying Engineering, note that the engineering I do today is very different from the engineering I did 20 years ago. There has been a lot of study since then. Unless you think flipping burgers is a career, get used to the idea that you'll always have to be learning new things.
Actually, I know of restaurants nowadays that use induction cooking, so even very basic cooking requires learning new things in some cases. Though I don't know if this Nu Wave method is used for flipping burgers.

And I would go further and say that there are few if any "non-career jobs" nowadays, where you work, punch the clock, and then forget about everything until the next shift. And the ones that do exist are transient.
 
  • #24
One common theme of messages in the AG section is "I learned something (or will learn something) that I might not absolutely have to know!" I don't understand this attitude, and it certainly won't serve you well in the real world.
It's a matter of degree. I once made the point that it's not the end of the world if you take one class or two classes that you never end up using, but it is the end of the world if you do what I did and get a whole PhD you never end up using. I was exaggerating, of course. I don't regret learning general math, even though there's a very good chance I'll only use the vast majority of it as a hobby, but I do regret spending so much time on the more specialized stuff that doesn't seem to contribute very much to my general understanding of the world. If I had studied real physics, I probably wouldn't feel as much regret because I would feel like I understood something real and tangible. In fact, the math that I can relate to physics or other applications is generally what I don't regret knowing, even if I never use it. There's a big difference between being a physicist who is doing research, but doesn't use half the other physics he learned, and someone who was a physics major, but then became an actuary. Or pizza delivery man.

That kind of puts it in perspective, though. I think I have a right to be upset, because I am 33 years old and could have accomplished so much if I had not drowned myself in grad school; maybe I should have left with a master's, or transferred to get a PhD or second master's in a more marketable subject. But a computer science major isn't making the same time commitment, and they will find it significantly easier to get a job, so I don't think they have much to complain about, unless it's a really sub-par program.

There's another aspect to this, which is ending up defenseless in the job market because you learned stuff with little practical value, or only very obscure niche value. But computer science is one of the better subjects in that respect. You may never use compiler design directly, but you will use it indirectly, because it made you a better programmer.
 
  • #25
Vanadium 50
Staff Emeritus
Science Advisor
Education Advisor
Homeomorphic, I don't think your problems arose from having learned too much.
 
