Computer science major = waste of youth?

AI Thread Summary
The discussion centers around concerns that a computer science major may lead to wasted youth due to the rapid evolution of technology and the perceived obsolescence of learned skills. Participants argue that while technology changes quickly, foundational knowledge in computer science, such as algorithms and data structures, remains relevant and applicable across various programming languages. Many emphasize the importance of adaptability and continuous learning in the field, suggesting that a genuine interest in technology is crucial for success. Additionally, some point out that the structure of computer science programs can vary significantly, with some focusing too heavily on trendy topics rather than core principles. Ultimately, the consensus is that if one is not passionate about learning in this dynamic field, they may want to reconsider their choice of major.
annoyinggirl
I'm a computer science major, and I feel like I'm wasting my youth learning things that soon will no longer be useful. IT is evolving at such an explosive rate that what I learn today may change tomorrow. For example, the textbooks we use right now would be of no use to us in a decade, but the engineering majors (and all other majors) may use their textbooks decades from now. Heck, to save money, one could actually use engineering books from the 1960s.
Is there merit in this way of thinking? I feel depressed that I'm wasting my youth accruing skills/knowledge that will soon no longer be used, either professionally, economically, personally, or otherwise.
I know that we should be learning constantly during our professional lives, but unlike the engineers, whose post-graduation learning would deepen their understanding of the physical world in the process, learning computer science after graduation requires one to learn arbitrary changes in arbitrary things: man-made programming languages.

Does this way of thinking seem to be a sign that computer science is not the right major for me? Does engineering seem like a better fit?
 
I understand the pressure of a rapidly evolving field; however, I think your perspective is off. First off, it's not progressing as fast as you make it seem. Computer Science textbooks from 2005 would still be very relevant and useful. Even the "new kid on the block" Ruby language was around then.

For the most part, undergrad work is designed to teach you how to think rather than what to think. Once you build foundations, the various related technologies are more easily grasped and build off each other. For example, take programming languages. Yes, there are dozens of relevant languages and more coming out each year, but a person with a background in a primary language like C, Java, or Python will be well able to learn a different language, because they all do the same thing, just in different ways. I had a friend in college who had decent experience with C, and he literally learned PHP over a weekend.

If you are not excited about learning new technologies then perhaps Computer Science is not for you, but I wouldn't panic that it's progressing at a rate that is impossible for you to succeed. If you enjoy it, do it and stop worrying.
 
Specific languages and technologies come and go. Fundamental stuff like algorithms, data structures, and good practices in software engineering sticks around and evolves much more slowly.
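By way of illustration, here is a minimal, hypothetical sketch (not tied to any particular textbook or course) of one of those fundamentals, binary search, written in C++. The algorithm predates C++ and looks essentially the same in whatever language you end up using; only the surface syntax changes.

#include <iostream>
#include <vector>

// Classic binary search over a sorted vector.
// Returns the index of 'target', or -1 if it is not present.
int binarySearch(const std::vector<int>& sorted, int target) {
    int lo = 0;
    int hi = static_cast<int>(sorted.size()) - 1;
    while (lo <= hi) {
        int mid = lo + (hi - lo) / 2;  // written this way to avoid overflow
        if (sorted[mid] == target) {
            return mid;
        } else if (sorted[mid] < target) {
            lo = mid + 1;              // discard the lower half
        } else {
            hi = mid - 1;              // discard the upper half
        }
    }
    return -1;
}

int main() {
    std::vector<int> data = {2, 3, 5, 7, 11, 13, 17};
    std::cout << binarySearch(data, 11) << '\n';  // prints 4
    std::cout << binarySearch(data, 4) << '\n';   // prints -1
    return 0;
}

Nothing in that logic depends on which decade's compiler you use.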
 
jtbell is absolutely correct. However, there are colleges with CS programs that don't teach these; instead it's a laundry list of languages, sprinkled with whatever is trendy ("Let's write an app!"). If you are in one of these programs, you might think about finding a better one.
 
And some programs probably front-load the trendy stuff at the beginning in order to attract students, then shift to the serious theory in upper-level classes. 10-15 years ago, many CS programs (including the AP CS courses) switched from C++ to Java for the introductory programming courses, because Java was seen as more "relevant" and marketable. Now it's robots and mobile apps. So if you're still near the beginning of your CS major, look at the upper-division courses you'll be taking, and see whether they're more of the same, or become more serious.
 
Isn't CS something you can learn on your own for the most part? My CS friends would chuckle every time I brought up getting a CS degree for any reason other than to just "have that piece of paper". A PC or Mac, a compiler, maybe one of those Dummies books or other types of books. What else?

EDIT: I mean that you could get a degree in something else -- maybe engineering -- where you will use programming, and then get some sort of accreditation as a programmer to convince an employer that you know how to program. Just a thought.
 
WWGD said:
Isn't CS something you can learn on your own for the most part?
I don't understand this. There are plenty of extremely advanced topics in CS just like any other science.

For the most part anyone can start working through a textbook in any subject. Schools are about discipline, connections and insight/support from staff.
 
Greg Bernhardt said:
I don't understand this. There are plenty of extremely advanced topics in CS just like any other science.

For the most part anyone can start working through a textbook in any subject. Schools are about discipline, connections and insight/support from staff.

True, but CS gives you instant feedback: your program either runs or it doesn't, and you are told (albeit in a way that is not always easy to understand, through the error messages) what is wrong with your code. That is not the case in many other areas. There are also large support communities in CS, though now in other areas too, e.g., PF itself and the Stack Exchange sites.
 
WWGD said:
True, but CS gives you instant feedback: your program either runs or it doesn't, and you are told (albeit in a way that is not always easy to understand) what is wrong with your code. That is not the case in many other areas.
Computer Science is a lot more than just crunching code, though. However, it is probably true that CS is a more accessible field than most in terms of self-learning.
 
  • #10
Greg Bernhardt said:
Computer Science is a lot more than just crunching code, though. However, it is probably true that CS is a more accessible field than most in terms of self-learning.

You're right, I guess I was thinking more about programming itself than about CS in general.
 
  • #11
WWGD said:
True, but CS gives you instant feedback: your program either runs or it doesn't, and you are told (albeit in a way that is not always easy to understand, through the error messages) what is wrong with your code. That is not the case in many other areas. There are also large support communities in CS, though now in other areas too, e.g., PF itself and the Stack Exchange sites.
It is not hard to write an algorithm.
It is hard to look at your end product, see that it is not working correctly, and determine why.

Also, when you get a job, you will most likely not be able to post your problem online for the world to view.
 
  • #12
Greg Bernhardt said:
Computer Science is a lot more than just crunching code, though. However, it is probably true that CS is a more accessible field than most in terms of self-learning.

For better or for worse, professional programming has changed a lot in the last 10-15 years. The days of people writing their own function libraries, database wrappers, threading code, and so on are over for 95% of programming out there. Now the name of the game is to play connect-the-dots between this framework and that API.

There are still people who try to actually understand what they are doing. Sadly, by the time these people figure out what the Qt bindings actually mean, their competitors who cut and pasted everything from Google and asked a few questions on Stack Exchange are shipping their code (and getting the rewards).

This does have long-term consequences and speaks to the general lack of quality in code today but it is a reality in a lot of places.

We have a thread on here now and then about the current youth fetish prevalent in programming circles. I think this is one place where it comes from. Learning Python or Java is pretty easy if you understand programming. To become useful, it isn't the language that matters; it's the frameworks and APIs you know.
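As a small, hedged illustration of that shift (using the C++ standard library as a stand-in for the frameworks and APIs in question, since the actual stack varies from shop to shop): nobody hand-rolls the basic machinery anymore; you wire together what the library already provides.

#include <algorithm>
#include <iostream>
#include <string>
#include <vector>

int main() {
    std::vector<std::string> names = {"Lovelace", "Hopper", "Turing", "Knuth"};

    // Sorting: one library call instead of a hand-written quicksort.
    std::sort(names.begin(), names.end());

    // Searching: reuse the library's binary search on the sorted data.
    bool found = std::binary_search(names.begin(), names.end(), std::string("Turing"));

    for (const std::string& n : names) {
        std::cout << n << '\n';
    }
    std::cout << (found ? "Turing is in the list" : "Turing is missing") << '\n';
    return 0;
}

Knowing that such pieces exist, and how to glue them together, is most of the day-to-day job being described here.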
 
  • #13
annoyinggirl said:
I'm a computer science major, and I feel like I'm wasting my youth learning things that soon will no longer be useful. IT is evolving at such an explosive rate that what I learn today may change tomorrow. For example, the textbooks we use right now would be of no use to us in a decade, but the engineering majors (and all other majors) may use their textbooks decades from now. Heck, to save money, one could actually use engineering books from the 1960s.
Is there merit in this way of thinking? I feel depressed that I'm wasting my youth accruing skills/knowledge that will soon no longer be used, either professionally, economically, personally, or otherwise.
I know that we should be learning constantly during our professional lives, but unlike the engineers, whose post-graduation learning would deepen their understanding of the physical world in the process, learning computer science after graduation requires one to learn arbitrary changes in arbitrary things: man-made programming languages.

Does this way of thinking seem to be a sign that computer science is not the right major for me? Does engineering seem like a better fit?

Before I go on, I know that at some schools the computer science department is part of the engineering school, while at others it is part of the arts and sciences school. Is there a difference between these? Engineering schools grant professional degrees, so I can imagine that studying computer science there is more likely to keep you in touch with industry.

On the other hand, most of my friends studied in arts and sciences schools. In addition to studying hard during school, they did internships and then graduated and got jobs. That is what you should do. Just because your school is teaching you X does not mean you cannot learn Y. Depending on the company, they won't care if you don't know Y in the beginning, but they will work on finding out whether you can learn. Are you self-motivated, independent, and can you also be a team player?

If you do not enjoy your comp sci courses, and if you are not interested in teaching yourself anything to prepare for the industry, then you should probably not major in computer science.
 
  • #14
Sorry, I don't mean to belittle either programming or programmers, especially considering that my best moment at programming came when I sweated for around an hour just to get a "Hello World". It just seems, as Greg said, to have somewhat lower barriers to entry. But yes, unfortunately there are always hacks, like in any other area.
 
  • #15
A lot of topics in computer science don't seem that easy to learn on your own. I can teach myself most math subjects on my own, unless it's something really obscure, advanced, and not very well documented, like, say, gauge theory in topology (it's documented, but the literature is so intimidating and extensive that you don't really know how to prioritize it, and what you really need is for someone who works in the field to give you a high-level overview). But stuff like compiler design or operating systems seems kind of hard to teach yourself. I gave up on teaching myself those things, not because they are too hard, but because I realized they aren't a high priority for me to learn, so the difficulty is above what I am willing to put the effort into at the moment, given my time constraints. Also, just learning C++ to write some simple programs is easy enough, but it's the software engineering principles of how not to write terrible code that are not so easy to learn.

I think the whole idea of things getting outdated is overstated, unless it's pretty specific stuff that you're spending a lot of time on. I learned C++ in my programming classes ages ago, around 2000-2001, and it's very, very similar to how it is today. I'm only aware of a couple of differences; for example, the latest versions now have a range-based for loop. But that's hardly an earth-shattering development. Maybe more detailed stuff like the standard library might have changed more, but I think the fundamentals are pretty much the same.

Other languages might be less established, so maybe they change more, but I don't think you're going to have a really hard time learning the latest version of Java if you knew the Java of 10 years ago. There might be a few things to get used to, but I don't think it's anything too major, unless it's some obscure library that just isn't really used anymore, or something like that. You might have to learn some stuff that's going to be outdated, but if you keep in mind that it may be a means to an end -- which is just to get an idea of how to program computers to solve problems -- I think it's not so bad. You may use some obscure thing that is going to be outdated in order to do the task that's given to you, but the point might not be to learn those little details, but how to solve the problem.
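For what it's worth, here is a small, illustrative C++ sketch of the kind of change being described (assuming a compiler with C++11 support): the counted loop taught around 2000 still compiles and works today, and the range-based for loop added in C++11 is just a more convenient spelling of the same idea.

#include <cstddef>
#include <iostream>
#include <vector>

int main() {
    std::vector<int> values = {10, 20, 30};

    // The style taught around 2000-2001: an explicit index. Still valid today.
    for (std::size_t i = 0; i < values.size(); i++) {
        std::cout << values[i] << '\n';
    }

    // The range-based for loop added in C++11: same concept, less ceremony.
    for (int v : values) {
        std::cout << v << '\n';
    }
    return 0;
}

Both loops print the same three numbers; the fundamentals underneath have not moved.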
 
  • #16
Even some engineering fields are rapidly-evolving and thus require textbook changes often. I'm thinking particularly of parts of communications engineering, seeing as how much the internet is invading our lives nowadays.
 
  • #17
Actually, even nostalgia is changing nowadays --nostalgia is not what it used to be.
 
  • #18
axmls said:
Even some engineering fields are rapidly-evolving and thus require textbook changes often. I'm thinking particularly of parts of communications engineering, seeing as how much the internet is invading our lives nowadays.
wouldn't "communications engineering" be a subfield of software development, and hence computer science?
 
  • #19
wouldn't "communications engineering" be a subfield of software development, and hence computer science?

It's much more of an electrical engineering discipline than computer science. Not all of it is software development.
 
  • #20
I spent much of my career as a young adult figuring out how analog microwave radio systems worked. I made them perform at levels better than specification. Today, that analog microwave technology is gone. Few, if any, such installations are still in service. Do I regret it? Nope. I learned new stuff. And if you aren't always learning new stuff, you've missed one of the primary lessons from college: You need to learn how to teach yourself, absorb new abstract ideas, and apply them.

You ask if you're learning something that will soon be obsolete. Maybe you are. So what? Learn what you need to educate yourself. You are not some automaton, programmed in college, to do the same damned thing over and over in the real world. If you think you'll do better by studying Engineering, note that the engineering I do today is very different from the engineering I did 20 years ago. There has been a lot of study since then. Unless you think flipping burgers is a career, get used to the idea that you'll always have to be learning new things.
 
  • #21
donpacino said:
It is not hard to write an algorithm.
It is hard to look at your end product, see that it is not working correctly, and determine why.

Also, when you get a job, you will most likely not be able to post your problem online for the world to view.
LOL :D
 
  • #22
JakeBrodskyPE said:
... get used to the idea that you'll always have to be learning new things.

One common theme of messages in the AG section is "I learned something (or will learn something) that I might not absolutely have to know!". I don't understand this attitude, and it certainly won't go over well in the real world.
 
  • #23
JakeBrodskyPE said:
I spent much of my career as a young adult figuring out how analog microwave radio systems worked. I made them perform at levels better than specification. Today, that analog microwave technology is gone. Few, if any, such installations are still in service. Do I regret it? Nope. I learned new stuff. And if you aren't always learning new stuff, you've missed one of the primary lessons from college: You need to learn how to teach yourself, absorb new abstract ideas, and apply them.

You ask if you're learning something that will soon be obsolete. Maybe you are. So what? Learn what you need to educate yourself. You are not some automaton, programmed in college, to do the same damned thing over and over in the real world. If you think you'll do better by studying Engineering, note that the engineering I do today is very different from the engineering I did 20 years ago. There has been a lot of study since then. Unless you think flipping burgers is a career, get used to the idea that you'll always have to be learning new things.

Actually, I know of restaurants nowadays that use induction cooking, so even very basic cooking requires learning new things in some cases. Though I don't know if this Nu Wave method is used for flipping burgers.

And I would go further and state that there are few if any "non-career jobs" nowadays, where you work, punch the clock, and then forget about everything until the next shift. And the ones that may exist are transient.
 
  • #24
One common theme of messages in the AG section is "I learned something (or will learn something) that I might not absolutely have to know!". I don't understand this attitude, and it certainly won't go over well in the real world.

It's a matter of degree. I once made the point that it's not the end of the world if you take one class or two classes that you never end up using, but it is the end of the world if you do what I did and get a whole PhD you never end up using. I was exaggerating, of course. I don't regret learning general math, even though there's a very good chance I'll only use the vast majority of it as a hobby, but I do regret spending so much time on the more specialized stuff that doesn't seem to contribute very much to my general understanding of the world. If I had studied real physics, I probably wouldn't feel as much regret because I would feel like I understood something real and tangible. In fact, the math that I can relate to physics or other applications is generally what I don't regret knowing, even if I never use it. There's a big difference between being a physicist who is doing research, but doesn't use half the other physics he learned, and someone who was a physics major, but then became an actuary. Or pizza delivery man.

That kind of puts it in perspective, though. I think I have a right to be upset, because I am 33 years old and I could have accomplished so much if I had not drowned myself in grad school and had instead left with a master's and transferred to get a PhD or a second master's in a more marketable subject. But a computer science major isn't making the same time commitment, and they will find it significantly easier to get a job, so I don't think they have much to complain about, unless it's a really sub-par program.

There's another aspect to this, which is ending up defenseless in the job market because you learned stuff that has little or no practical value, or only very obscure niche value. But computer science is one of the better fields in that respect. You may never use compiler design directly, but it will still pay off because it made you a better programmer.
 
  • #25
Homeomorphic, I don't think your problems arose from having learned too much.
 
  • #26
Homeomorphic, I don't think your problems arose from having learned too much.

Not learning too much, but learning the wrong stuff.
 
  • #27
@homeomorphic:
I have read many posts where you talk about wanting to break away from your mathematical past. Just curious: why, then, do you keep using the handle 'homeomorphic'? It screams math nerd, or die-hard math fanatic.
 
  • #28
I joined physics forums back when I was still in grad school and not really suspecting it would turn out that I hate topology beyond a certain point. I just don't consider it that important to change the name. It is what I got my PhD in, so it's still somewhat accurate. I didn't really stop liking the topology that I used to like--I just never started liking the research-level stuff when I got there, and I also was disappointed that I wasn't able to make the connection to physics as well as I'd hoped. Plus, there's the other point I make about the 10,001st theorem you learn not adding very much. There's just a certain level of complexity that starts triggering my gag reflex, especially if there are no applications anywhere in sight. I always thought of myself more as a mathematical physicist, even from the beginning, which is one reason I was so disappointed with grad school in math. So, in some ways, I haven't changed that much. The issue was that I didn't know what I was signing up for, more than that I have changed, although I did change quite a bit, too.

I'm not completely breaking with my mathematical past. It's just that I'm a very hardcore applied guy now. As one of my professors from undergrad said, "I was a topology student, but I recovered."

It was a lot like getting into drugs or something. At first, it draws you in, but then after a few years, you're in this hell on earth, and you wonder how you could have been so stupid to do something like that to yourself.
 
  • #29
homeomorphic said:
It was a lot like getting into drugs or something. At first, it draws you in, but then after a few years, you're in this hell on earth, and you wonder how you could have been so stupid to do something like that to yourself.

working in physics ≅ serious drug addiction
 
  • #30
donpacino said:
working in physics ≅ serious drug addiction
Correction: what homeomorphic meant was that learning excessively theoretical MATH (with little to no application) = serious drug addiction. Anything that feels rewarding to the brain can turn into addiction - food, romantic love, video games, math, and even physics. It is only when it has negative consequences that addiction warrants concern, but neurologically, even in the absence of negative consequences, it is still addiction.
 
  • #31
homeomorphic said:
I joined physics forums back when I was still in grad school and not really suspecting it would turn out that I hate topology beyond a certain point. I just don't consider it that important to change the name. It is what I got my PhD in, so it's still somewhat accurate. I didn't really stop liking the topology that I used to like--I just never started liking the research-level stuff when I got there, and I also was disappointed that I wasn't able to make the connection to physics as well as I'd hoped. Plus, there's the other point I make about the 10,001st theorem you learn not adding very much. There's just a certain level of complexity that starts triggering my gag reflex, especially if there are no applications anywhere in sight. I always thought of myself more as a mathematical physicist, even from the beginning, which is one reason I was so disappointed with grad school in math. So, in some ways, I haven't changed that much. The issue was that I didn't know what I was signing up for, more than that I have changed, although I did change quite a bit, too.

I'm not completely breaking with my mathematical past. It's just that I'm a very hardcore applied guy now. As one of my professors from undergrad said, "I was a topology student, but I recovered."

It was a lot like getting into drugs or something. At first, it draws you in, but then after a few years, you're in this hell on earth, and you wonder how you could have been so stupid to do something like that to yourself.

Why didn't you go to grad school for physics if you knew from the start that you were a mathematical physicist? I mean, it's one thing to enter grad school with a passion for pure maths and then graduate with disdain for how little application there is. It is another to know that you were a mathematical physicist from the "beginning", enter maths grad school, and graduate disappointed that there was no application. I don't mean to disrespect you or to ridicule your decision; I am just curious as to why you chose maths grad school when you knew you were a mathematical physicist.
 
  • #32
JakeBrodskyPE said:
I spent much of my career as a young adult figuring out how analog microwave radio systems worked. I made them perform at levels better than specification. Today, that analog microwave technology is gone. Few, if any, such installations are still in service. Do I regret it? Nope. I learned new stuff. And if you aren't always learning new stuff, you've missed one of the primary lessons from college: You need to learn how to teach yourself, absorb new abstract ideas, and apply them.

You ask if you're learning something that will soon be obsolete. Maybe you are. So what? Learn what you need to educate yourself. You are not some automaton, programmed in college, to do the same damned thing over and over in the real world. If you think you'll do better by studying Engineering, note that the engineering I do today is very different from the engineering I did 20 years ago. There has been a lot of study since then. Unless you think flipping burgers is a career, get used to the idea that you'll always have to be learning new things.

Thank you for sharing your experiences with me. However, I feel that it is not exactly a valid analogy. This is because what you learned about analog microwave technology is still grounded in physics. Your engineering degree taught you physics, which would NEVER change. Also, because what you learned for analog microwave technology is physics, it could be applied to newer technologies as well. At the very least, it gave you a better understanding of the physical world. The same cannot be said for programming, where the languages are man-made and arbitrary. New programming languages come and go, and unlike physics, they will always change.
 
  • #33
homeomorphic said:
It's a matter of degree. I once made the point that it's not the end of the world if you take one class or two classes that you never end up using, but it is the end of the world if you do what I did and get a whole PhD you never end up using. I was exaggerating, of course. I don't regret learning general math, even though there's a very good chance I'll only use the vast majority of it as a hobby, but I do regret spending so much time on the more specialized stuff that doesn't seem to contribute very much to my general understanding of the world. If I had studied real physics, I probably wouldn't feel as much regret because I would feel like I understood something real and tangible. In fact, the math that I can relate to physics or other applications is generally what I don't regret knowing, even if I never use it. There's a big difference between being a physicist who is doing research, but doesn't use half the other physics he learned, and someone who was a physics major, but then became an actuary. Or pizza delivery man.

That kind of puts it in perspective, though. I think I have a right to be upset, because I am 33 years old and I could have accomplished so much if I had not drowned myself in grad school and had instead left with a master's and transferred to get a PhD or a second master's in a more marketable subject. But a computer science major isn't making the same time commitment, and they will find it significantly easier to get a job, so I don't think they have much to complain about, unless it's a really sub-par program.

There's another aspect to this, which is ending up defenseless in the job market because you learned stuff that has little or no practical value, or only very obscure niche value. But computer science is one of the better fields in that respect. You may never use compiler design directly, but it will still pay off because it made you a better programmer.

However inapplicable pure maths may be, it will never change. Plus, a PhD in maths would allow you to enter academia, where you would age like wine as an employee (tenure = unfirable). Programmers age like fruit: from age 35 onwards, your value as an employee drops like a rock, because of something called "temporary knowledge capital", a term coined by the author of one of the articles linked below.

http://www.halfsigma.com/2007/03/why_a_career_in.html
http://heather.cs.ucdavis.edu/h1b.html
http://thedailywtf.com/articles/up-or-out-solving-the-IT-turnover-crisis
http://www.techrepublic.com/blog/career-management/is-engineering-now-a-young-mans-game/
 
  • #34
annoyinggirl said:
The same cannot be said for programming, where the languages are man-made and arbitrary. New programming languages come and go, and unlike physics, they will always change.

I'd just like to add that while this is technically true, most of the concepts behind the programming languages have not changed.
 
  • #35
Why didn't you go to grad school for physics if you knew from the start that you were a mathematical physicist?

The key word there is "mathematical". I was afraid things would not be sufficiently mathematical if I went to grad school in physics. Back then I used to be really big on rigor, for one thing, but also on having really good intuition and conceptual understanding and not just blindly calculating stuff all over the place, as some physicists seem to do. I know Lee Smolin mentioned how he was disappointed when he studied physics for these kinds of reasons. In retrospect, the disconnect from reality in a math department turned out to be a bigger thorn in my side than some of the ugly calculations I would have had to put up with if I had studied physics, and I'm not concerned quite as much with rigor as I used to be. Still, it probably wouldn't have worked out well. I really should have stuck with electrical engineering or computer science.

I mean, it's one thing to enter grad school with a passion for pure maths and then graduate with disdain for how little application there is. It is another to know that you were a mathematical physicist from the "beginning", enter maths grad school, and graduate disappointed that there was no application. I don't mean to disrespect you or to ridicule your decision; I am just curious as to why you chose maths grad school when you knew you were a mathematical physicist.

I got the impression that there were closer ties between math and physics than was actually the case. As it turned out, the math department basically had no communication with the physics department to speak of, except for, ironically, my adviser, who served a little bit as a mathematical consultant for one of the physics profs. Yet he discouraged me from straying too far from topology. I'm not as bothered by lack of application as you might think. I don't really have a problem with studying a few books that are motivated more from the point of view of studying math internally. What I have a problem with is spending significant portions of my life on it. It is conceivable that math needs to be that way to a certain extent, but I'm not sure what that extent is. I do think the current level of concern for applications is pretty clearly less than it should be. Even if someone doesn't work on applications, they ought to at least have more awareness of them. There should be a class, or at least somebody should make a documentary or book that every mathematician is required to watch or read, that talks about what sophisticated math can contribute to science and technology -- not in a wishy-washy way, but showing the actual practical results that were achieved. I'm okay with the purpose of math not primarily being applications for some people. But I do think the justification for paying people to do research is the possibility that they will hit on something that will one day have practical value, even if it's because it improves our ability to crack certain kinds of problems rather than through direct application.
 
  • #36
However inapplicable pure maths may be, it will never change.

If it's really inapplicable, then what's the point of not changing? Its worth is zero and since it can't change, its worth will always be zero. I actually don't completely dismiss the value of math for math's sake, but I think that applies more to simple things than the kind of baroque, ultra-complicated stuff I see mathematicians working on today. On the one hand, you could look at something like a proof by pictures of the Pythagorean theorem that doesn't require a ton of background to appreciate. I once explained it to a graduate student in music, and I could tell he was really excited. Ignoring the immense practicality of the Pythagorean theorem, that kind of mathematical beauty comes quite cheaply, and I would say it's worthwhile, just purely for the fun of it! But as you get deeper and deeper, the price of admission gets bigger and bigger. You could take a theorem like the spectral theorem of functional analysis that's pretty deep and not really accessible to laymen, but you don't exactly need to devote your life to understanding it. If you go much further than that, it gets to some unknown territory for me. Maybe I'd see more value in more math if I understood more of it, but I do have the distinct impression of having diminishing returns, as far as I did get. I don't think it's worth all the intense competition and all the insanely hard work. Not even by a long shot. Not for me, anyway. Unless it can make a difference in people's lives.

I think you're not seeing the forest for the trees when it comes to programming. Programming is very logical. The particular details of syntax are not that important. The concept of a for loop makes sense, independent of the details of how it has been implemented in various languages. It's DRY: don't repeat yourself. If you didn't have loops, you'd have to type the same code over and over, possibly hundreds of times. The necessity of iteration can therefore be viewed almost as if it were a timeless mathematical truth. When you are stuck looking at the trees, you don't see this forest. Don't worry about the fact that it has to be

for (int i = 0; i < 100; i++)
{
    // do stuff
}

Who cares about that? If you learn it in C++, it's the same concept in Python, even if the exact way that you write it out is slightly different.

It's a form of engineering. Do you think that when someone designs a car, they had to make every piece exactly the way they did, or was there some wiggle room? Obviously, there was some wiggle room--just look at all the different kinds of cars out there. And math is that way to some degree, too. A proof doesn't have to be done exactly the way they did it. There may be 50 different ways to prove the same thing. Which way do you choose? As it turns out, the way the same piece of mathematics is proved can evolve over time.

Plus, a PhD in maths would allow you to enter academia, where you would age like wine as an employee (tenure = unfirable). Programmers age like fruit - at 40+, your value as an employee drops like a rock.

I have a PhD in math and to put it in Biblical terms, it would be easier for a camel to pass through the eye of a needle than for me to enter academia. Except as an adjunct, but I'm not sure that's any better than flipping burgers, prestige aside--there's definitely no aging like wine there.

I'm not sure it's really true that programmers age like fruit. Especially for someone like me: if I break into programming, my math expertise will open a lot more doors when combined with being able to say I worked professionally using Java or C++ for X years.
 
  • #37
jtbell said:
Specific languages and technologies come and go. Fundamental stuff like algorithms, data structures, and good practices in software engineering sticks around and evolves much more slowly.

This is what I was going to say.

Many people say that the book "Structure and Interpretation of Computer Programs" by Gerald Jay Sussman and Hal Abelson revolutionized the way CS is taught. Instead of spending all the time learning the specifics of some programming language, you spend more time learning fundamental things that never change, like algorithms, data structures, maths, etc.

It didn't revolutionize it at all universities, though; many universities still use the old methods, in which students spend four years learning the specifics of languages, and when they graduate, they can't program in any language other than the ones they learned.

Try to take as many subjects as you can about fundamental things and fewer subjects about specific languages or specific technologies. You can always learn those on your own; for example, if you know the C language, you can pick up web development fairly easily because of the similar syntax.

What I did was a double major in EE and CS, and I love the EE stuff. Can you do that? CS goes well with many majors, like maths, biology, physics, economics, etc., and it makes your experience at university more interesting.
 
  • #38
annoyinggirl said:
Thank you for sharing your experiences with me. However, I feel that it is not exactly a valid analogy. This is because what you learned about analog microwave technology is still grounded in physics. Your engineering degree taught you physics, which would NEVER change. Also, because what you learned for analog microwave technology is physics, it could be applied to newer technologies as well. At the very least, it gave you a better understanding of the physical world. The same cannot be said for programming, where the languages are man-made and arbitrary. New programming languages come and go, and unlike physics, they will always change.

I see why you adopted the name of annoyinggirl :)

At some level, everything boils down to math and physics. And you're right, it doesn't change. But the methods, materials, applications, and design practices DO change. Driving a truck uses math and physics too. So what?

You could also argue the same thing about ethics. One could say that ethics all boil down to the Ten Commandments. While that may even be true, it doesn't tell you how to handle everyday situations you are likely to encounter.

This is not about learning math and physics. At the end of the day, while those concepts are almost timeless, the applications do change. And we're not talking about small changes; we're talking about enormous transformations that cause social and even ethical changes.

Schools spend a lot of time teaching those practical things as well. Even the examples they show are examples of practices from everyday life. The examples I learned when I was in school are irrelevant today because of newer designs, practices, and materials. Thankfully, I'm always learning new things, and I adapt.

You should too.
 