The State of Being a Professor - an insider's view

  • #1
Norman
Izabella Laba has an interesting article up about the rigors of being a professor: The state of the profession.

I think it should be required reading for every graduate student thinking about an academic career path.

The article is interesting in that it ponders the feasibility of creating a research track and a teaching track at all levels of the university system. It also asks the question of why one is required to be an innovative teacher to be a successful academic researcher. Those are two seemingly disparate things. What does everyone here think?

Personally, I would love to see more options for the researcher and teacher to be more independent paths. At least some flexibility would be a welcome change. As someone trying to enter the tenure track, I often doubt my desire to take on this extreme profession. I enjoy spending time with my kids and wife. How much of that will I have to give up in order to achieve some stability as an academic? Is it worth it for the stability it brings - because having to jump from postdoc to postdoc to research faculty position is stressful. Very stressful.

I look forward to the discussion.
 
  • #2
Thanks for the link. The only part I disagree with is:

"The university professors of old were not evaluated on their ability to inspire interest in otherwise indifferent students, nor did they have to teach the addition and multiplication of fractions. They were not under constant pressure, either, to use clickers, classroom technology, or innovative teaching techniques. "

To me, that's a bit like saying "music was so much better in the 1960s, when people cared about making music instead of making money". Both comments are drastic oversimplifications.

For whatever reason, teaching is not prestigious. I'm not sure it ever was, and most likely, it never will be. Academia has been trending toward a business model for several decades now, and in business, whatever brings in money or 'adds value to the brand' is more important than anything. If I want teaching to add value, I need to devise a curriculum or course that is unique- that's possible to some degree, and is reflected in specialized degrees/course offerings that sporadically occur- getting an MS in nanotechnology, for example.

What is missing from her post (and others) is an objective comparison between academia and industry. Guess what- if you want to excel in industry you also have to be willing to give up evenings and weekends. You will also have a boss who tells you what to do, who probably doesn't understand what you do, and isn't particularly interested in what you do as long as your work helps him/her to please his/her boss. From that perspective, the only difference between industry and academia is that I exchange some financial stability (a portion of my salary comes from grants) for the privilege of choosing what I want to do with my time- I don't have to get approval from a supervisor.

The blog post should be read in terms of balancing priorities, rather than issues specific to her profession.
 
  • #3
Andy Resnick said:
Thanks for the link. The only part I disagree with is:

"The university professors of old were not evaluated on their ability to inspire interest in otherwise indifferent students, nor did they have to teach the addition and multiplication of fractions. They were not under constant pressure, either, to use clickers, classroom technology, or innovative teaching techniques. "

To me, that's a bit like saying "music was so much better in the 1960s, when people cared about making music instead of making money". Both comments are drastic oversimplifications.
This is spot on.

Though I think there is an interesting discussion to be had about the lowered level of mathematical preparedness of the average incoming freshman. All my data is anecdotal and through a very skewed lens (my own, since I don't think comparing a large body of students to a single person is very scientific). I also wouldn't be surprised if the preparedness of the average college student is actually higher now compared to 30 years ago, but our expectations have grown more rapidly. I still tend to fall into the camp that believes students today are less prepared for college mathematics (and therefore physics), however irrational that may be.

Does anyone know of any apples-to-apples comparisons of average incoming college students' mathematical preparedness? I am being lazy and haven't really started looking. I will start looking into it later next week, but if anyone has anything handy, it would be helpful in giving me a place to start.

Andy Resnick said:
For whatever reason, teaching is not prestigious. I'm not sure it ever was, and most likely, it never will be. Academia has been trending toward a business model for several decades now, and in business, whatever brings in money or 'adds value to the brand' is more important than anything. If I want teaching to add value, I need to devise a curriculum or course that is unique- that's possible to some degree, and is reflected in specialized degrees/course offerings that sporadically occur- getting an MS in nanotechnology, for example.
But aren't those two notions sort of at odds with each other? Uniqueness is not the only way to bring value. Doing it better than the next provider is another way. Creating a better product so to speak. Wouldn't treating students as consumers drive up the prestige of teachers and therefore create an atmosphere where teaching is more valued? That just doesn't seem to be the case (in my experience). Now, I really have zero understanding of business, so I may be completely off base here. Feel free to say so and educate me. I would appreciate it.



Andy Resnick said:
What is missing from her post (and others) is an objective comparison between academia and industry. Guess what- if you want to excel in industry you also have to be willing to give up evenings and weekends. You will also have a boss who tells you what to do, who probably doesn't understand what you do, and isn't particularly interested in what you do as long as your work helps him/her to please his/her boss. From that perspective, the only difference between industry and academia is that I exchange some financial stability (a portion of my salary comes from grants) for the privilege of choosing what I want to do with my time- I don't have to get approval from a supervisor.

The blog post should be read in terms of balancing priorities, rather than issues specific to her profession.
Andy, I am very glad you responded in this thread. As someone who has done both industry and academia, I was especially looking forward to what you had to say about this.

I do find it interesting that in a setting known for creativity, expression, and independence, there seems to be such a lack of flexibility in academia when it comes to the academic career.

We also have not touched on the notion put forth by Dr. Laba that cutting-edge math (and science by extension) is missing out on some brilliant minds simply because they do not want to deal with teaching. In physics, I am not sure this is the case. With the small, but available, number of research-only institutions (national labs, NASA, and research centers (are there others?)), the opportunity exists to do research without having to teach.
 
  • #4
Norman said:
<snip>
But aren't those two notions sort of at odds with each other? Uniqueness is not the only way to bring value. Doing it better than the next provider is another way. Creating a better product so to speak. Wouldn't treating students as consumers drive up the prestige of teachers and therefore create an atmosphere where teaching is more valued?

First problem: the difficulty of quantitatively defining 'better' in the context of a standard curriculum. By contrast, I can easily create metrics for research productivity. Second: I am totally against treating students as consumers. It's an increasingly common point of view, but I feel it is totally inappropriate for a variety of reasons I have written about on other threads.
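For instance (purely a hypothetical illustration - not a metric I'm endorsing or one my department actually uses), a citation-based number like the h-index is completely mechanical to compute, whereas there is no comparable formula for "taught the standard curriculum better":

Code:
def h_index(citations):
    """Largest h such that h of the papers have at least h citations each."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, count in enumerate(counts, start=1):
        if count >= rank:
            h = rank
        else:
            break
    return h

# Made-up example: five papers with these citation counts give an h-index of 3.
print(h_index([10, 8, 5, 2, 1]))  # prints 3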

Norman said:
I do find it interesting that in a setting known for creativity, expression, and independence, there seems to be such a lack of flexibility in academia when it comes to the academic career.

We also have not touched on the notion put forth by Dr. Laba that cutting-edge math (and science by extension) is missing out on some brilliant minds simply because they do not want to deal with teaching. In physics, I am not sure this is the case. With the small, but available, number of research-only institutions (national labs, NASA, and research centers (are there others?)), the opportunity exists to do research without having to teach.

Academia is a very conservative environment: the only thing that is valued is *success*. Innovation is only rewarded if that person is *successful* being innovative.

As for the final paragraph, I'm not sure that's fair either. At least, she didn't offer any substantive examples. In any case, I can buy out my teaching time with grant dollars, so not wanting to teach isn't much of a barrier to being productive in academia.
 
  • #5
For whatever reason, teaching is not prestigious. I'm not sure it ever was, and most likely, it never will be. Academia has been trending toward a business model for several decades now, and in business, whatever brings in money or 'adds value to the brand' is more important than anything. If I want teaching to add value, I need to devise a curriculum or course that is unique- that's possible to some degree, and is reflected in specialized degrees/course offerings that sporadically occur- getting an MS in nanotechnology, for example.

Isn't this a bit dangerous, though? Most people go to a University to gain an education. At the moment, an undergraduate education is a truly expensive investment. Wouldn't treating academia like a business make it more damaging to a student's bottom line? Let's imagine that a radical new method of educating people, introduced by someone particularly charismatic, were to enter the market. Suppose that it does a better job than traditional academia of teaching the skills that are necessary in the modern world. What if this method of education is also significantly cheaper for the students? Add to that the supposition that the combined demand for research output and higher-quality teaching piles more stress onto the traditional professor, diluting their efforts in both areas. My hypothesis is that the traditional academia which we all know and love, which is already under strain, might collapse.
 
Last edited:
  • #6
FourierFaux said:
Most people go to a University to gain an education.

... except that much of what passes for "university education" isn't education, but high level vocational training. For evidence, pick a random sample of posts on PF and analyse the questions being asked.

Of course there is nothing at all wrong with high level vocational training, but calling things by their right names is the first step towards understanding - even if politicians and marketing consultants usually disagree.

It seems entirely possible that somebody will invent a more effective way to deliver training than the current model, which was invented centuries ago and often only uses 21st century technology as a replacement for 16th century technology (e.g. give a 16th century style lecture, but with a 21st century video camera pointing at the lecturer.)
 
  • #7
FourierFaux said:
Isn't this a bit dangerous, though? Most people go to a University to gain an education. At the moment, an undergraduate education is a truly expensive investment. Wouldn't treating academia like a business make it more damaging to a student's bottom line? <snip>

Treating academia like a business *is* damaging to both the student and the institution, I agree with that. How then to deal with the pervasive pressure by upper administration to commodify learning?

Certainly, shared governance is an essential part of this, at the top level. It also requires faculty committed to experimentation with teaching and the curriculum. Students should also play a part- for example, our Society of Physics Students chapter is highly active in recruiting and outreach activities, and sponsors regular informal interactions between the students and faculty. Students should also get involved with faculty research projects.

Students are increasingly being made responsible for their own educational outcomes- by taking the initiative on research projects, scholarship activities, and semiprofessional social activities.
 
  • #8
We also have not touched on the notion put forth by Dr. Laba that cutting-edge math (and science by extension) is missing out on some brilliant minds simply because they do not want to deal with teaching. In physics, I am not sure this is the case. With the small, but available, number of research-only institutions (national labs, NASA, and research centers (are there others?)), the opportunity exists to do research without having to teach.

The far bigger threat to missing out on brilliant minds is the tremendous career uncertainty, the long training period and low payoff, etc. The labor market for science works like any other- and it should be clear that many brilliant minds turn their talents toward better prospects.

The people who do remain in science do it despite sizable hurdles- one more hurdle (the need to teach) isn't going to dissuade the people who have already signed up for the career.
 
  • #9
Andy Resnick said:
Guess what- if you want to excel in industry you also have to be willing to give up evenings and weekends.

Depends on the industry. I don't have to give up evenings and weekends, although I'm usually too dead tired to do anything research-y.

From that perspective, the only difference between industry and academia is that I exchange some financial stability (a portion of my salary comes from grants) for the privilege of choosing what I want to do with my time- I don't have to get approval from a supervisor.

From my perspective, the main difference is that industry jobs exist for what I want to do whereas academic ones don't.

The original poster asks ...

And why not combine a part-time research appointment with a non-academic job? We could make our living working elsewhere, say at a financial institution, in the tech industry, or at a start-up, and combine that with a university affiliation through which we could supervise graduate students or apply for grants. No, really. Think about it. Many employers are already used to part-time work arrangements,

1) In fact, I know people that do that. The issue here is that the people that do that tend to be reasonably senior managers that the company "trusts" to do this sort of work. The problem with anyone junior doing this sort of thing is that the pressures to produce are going to prevent you from moonlighting.

2) Employers are *NOT* used to part-time work arrangements. Employers that I know hate part-time work arrangements because the effort of coordinating two people working 20 hours each is greater than that of managing one person working 40 hours a week.

The blog post should be read in terms of balancing priorities, rather than issues specific to her profession.

There is one thing that makes academics different. Academics are supposed to think about general, abstract issues and to be "thought leaders" of society.
 
  • #10
Andy Resnick said:
Treating academia like a business *is* damaging to both the student and the institution, I agree with that. How then to deal with the pervasive pressure by upper administration to commodify learning?

The first step is to realize that upper administration really isn't the problem since they are being driven by social pressures. People are willing to spend money on a university because they think that they can make that money back. This is true for government grants, research funding, and undergraduate students.

If you had people do things for "pure learning" I don't think you'd get enough money to fund professors' salaries. Now, you might argue that society shouldn't be based on money, but at that point it's such a big, radical change that my imagination fails me as to how things would be run.

As long as you expect professors to get paid, it's going to be a business.

Certainly, shared governance is an essential part of this, at the top level.

The problem with shared governance is that it becomes difficult once you realize that people want different things. What happens in most universities (and in fact most institutions) is that you put real power in the hands of people that can be "culturally selected" to "make the right decisions." If you really share power, it gets interesting when you find that the people that you share it with *don't* have the same beliefs.

If you give graduate students or undergraduates real power (i.e. power to make budgeting and personnel decisions) that will radically change things, but I don't see this happening (since frankly I don't trust most undergraduates to make those decisions wisely).

Students are increasingly being made responsible for their own educational outcomes- by taking the initiative on research projects, scholarship activities, and semiprofessional social activities.

The problem I have here is that it's throwing out responsibility without giving up any real power. This is not shared governance. Shared governance would be having students elect professors (not that I think it's a good thing).
 
  • #11
One other thing is that I get suspicious when people talk about the "good old days". As far as I can tell, academia has been a rat race since at least 1970, and we've *never* had a situation in which there were enough jobs in academia to absorb most of the people coming out.
 
  • #12
twofish-quant said:
<snip>
The problem I have here is that it's throwing out responsibility without giving up any real power. This is not shared governance. Shared governance would be having students elect professors (not that I think it's a good thing).

That is not what shared governance is:

http://chronicle.com/article/Exactly-What-Is-Shared/47065/

""Shared" governance has come to connote two complementary and sometimes overlapping concepts: giving various groups of people a share in key decision-making processes, often through elected representation; and allowing certain groups to exercise primary responsibility for specific areas of decision making."

Shared governance concerns the relationship between faculty and administration. There is student representation as well, via the elected student government officers.
 
  • #13
If you had people do things for "pure learning" I don't think you'd get enough money to fund professors' salaries. Now, you might argue that society shouldn't be based on money, but at that point it's such a big, radical change that my imagination fails me as to how things would be run.

As long as you expect professors to get paid, it's going to be a business.

This makes sense, and something that is all too easy to deny is that the research and education worlds seem to be tied together because universities can bring in money by getting their teaching work done as cheaply as possible.

If you hire a researcher full time, you have to pay the researcher enough to at least scrape by, and if on top of that you have to pay a separate teacher enough to scrape by, that's simply a bad deal.

As for funding pure research by itself, it exists. One can read Feynman or know just from basic knowledge of a career in academia that there are celebrated positions where no teaching is required, but not surprisingly, they're also near-impossible positions to secure, even more so than traditional professorships.

An additional issue which twofish addresses is something that I've felt is a bit unfortunate -- when one really thinks about it, it's really kind of random that the universities are supporting postdocs and such on the basis of their being cheap teaching labor in addition to the benefit of what they produce as academic members (why else would there be so many dirt cheap postdocs and so little desire to give positions with more security?). Why couldn't the postdocs do something else with their time to get funding? The answer seems to be exactly that part-time arrangements are not something most employers will deal with. In relation to the point about coordinating two 20-hour jobs as opposed to one 40-hour job, perhaps teaching is the exception because the semester's work is pretty much left up to the individual running the course, who needn't be accountable to anyone else beyond making sure the TAs submit their grades and such on time. Half the time the TA won't even know where the lecture is headed, and the instructor lectures according to some pre-made schedule, or occasionally based on student requests.

The far bigger threat to missing out on brilliant minds is the tremendous career uncertainty, the long training period and low payoff, etc.

This is also a really good point; as far as I can tell, the PhDs who DO continue on to take postdoc positions and don't leave academia aren't always clearly the brightest minds or the ones with the most interesting ideas. They're the ones who will put up with the system.

Sure, if you're talking about MIT computer science or the like, then the people taking positions there are most likely there on the basis of amazing contributions.

At some of the less competitive schools, you can have people just as great, and everyone they hire is probably quite good. But I get the feeling the people "who make it" in academia are not necessarily the "best" in any meaningful sense. A lot of extraordinarily bright people who don't want to maintain a publish or perish lifestyle amidst poor pay and no geographic certainty leave the academic job market, something I wasn't as aware of until recent years.
 
  • #14
I am particularly interested in hearing comments on this from the article:

Most of the possible objections from our individual point of view – the other job has little to do with our research specialization, it would take time and effort to learn it, and so on – apply equally well to teaching.

It's pretty easy to teach basic quantum mechanics if one is researching advanced topics related to that, but preparing a lecture still takes time! Modern research isn't conducted on the same wavelength or format as the textbook.

I imagine someone who has to constantly learn new things and think about them in various ways can learn something else that is simpler and still provides benefits to someone else.

Adding a non-teaching involvement would still mean an exceptionally overcrowded day, conceivably making it hard to spend time on family, etc.
 
  • #15
People are willing to spend money on a university because they think that they can make that money back. This is true for government grants, research funding

Ah, now you hit a point I'm really curious about. How do those who provide research funding expect to profit from doing so? I never really understood that.

I can understand if X company put money into a research group that develops algorithms that can be used in several years to maybe come up with an amazing product that will make the company a ton of money. But why fund say, pure math research, or string theory research?

There are two sources of money that I can think of: government, and university money, which fund such disciplines that have no immediate connection to nearly all of industry. I already find a gap in my understanding when I ask how the university exactly benefits from having such smart researchers - what exactly does it buy them apart from name and fame?
 
  • #16
deRham said:
Ah, now you hit a point I'm really curious about. How do those who provide research funding expect to profit from doing so? I never really understood that.

1) Back in the cold war it was simple. If we don't spend money on physicists and those EEEEVVVVIIILLLLL Russkies build better bombs and toasters, we will all be waving red flags and quoting Lenin. Hence lots of cash for physicists.

2) There is the argument that if you spend cash on physicists, you create new industries and ultimately that means more jobs, tax revenue, good stuff.

Also the fact that people ***don't*** see the connection is why basic research has gotten cut. What I'm hoping to see in the next decade is some "friendly competition" between the US/China/India over who can spend more money on science for the purpose of economic growth.

But why fund say, pure math research, or string theory research?

Because all that weird stuff turns out to be useful in a few years or decades.

I already find a gap in my understanding when I ask how the university exactly benefits from having such smart researchers - what exactly does it buy them apart from name and fame?

Name and fame -> cold hard cash -> more name and fame

There is a lot in common between the Hollywood system and academia, in that you have stars. If I go to Congress and say, you must give X several tens of billions of dollars or else something really bad will happen, no one is going to listen to me. If a Nobel prize winner does it, then there is a good chance that they will get the cash.
 
  • #17
deRham said:
This makes sense, and something that is all too easy to deny is that the research and education worlds seem to be tied together because universities can bring in money by getting their teaching work done as cheaply as possible.

This is called "cross-subsidization." Technology is going to make this much more difficult. MIT can force people to pay huge amounts of money for a basic calculus course in order to support research, but if you put everything on the internet, that's going to be difficult/impossible.

it's really kind of random that the universities are supporting postdocs and such on the basis of their being cheap teaching labor in addition to the benefit of what they produce as academic members (why else would there be so many dirt cheap postdocs and so little desire to give positions with more security?)

Postdocs don't have a huge amount of teaching duties, but they are cheap researchers. One reason I react badly to terms like "shared governance" is that a university looks a lot more like a factory (or even a sweat shop) if you think of graduate students and post-docs as workers and tenured faculty as management. There is this ideal of the university as the "community of the mind" in which people have freedom and social status isn't that important. This only works if you ignore graduate students and post-docs and adjuncts.

The basic problem is one that mankind has faced for thousands of years which is that the "life of the mind" is available only for the wealthy since someone has to work the fields. You'd think with technology that there would be more equality, but it doesn't seem to be working out that way.

But I get the feeling the people "who make it" in academia are not necessarily the "best" in any meaningful sense.

It gets tautological. People that make it in academia are the best in the sense that they make it in academia. Now whether or not this correlates with being "better" in some other sense is another question.
 
  • #18
deRham said:
It's pretty easy to teach basic quantum mechanics if one is researching advanced topics related to that, but preparing a lecture still takes time! Modern research isn't conducted on the same wavelength or format as the textbook.

The other thing is that it's relatively easy to prepare a class if all you are doing is to teach things the same way that you've taught it in the last decade. If you want to do something new and different, at that point it takes a huge amount of time and effort.

Also the thing about textbooks is that someone has to write the textbook. Also someone has to think about what a "textbook" means with all of this technology that is coming down the pipe.

Something that is interesting is that it wasn't until the 1960's that people even tried to teach quantum mechanics to undergraduates. One good thing about MIT is that there is a lot of interesting "cutting edge" stuff that's going on there as far as physics education. It's now "standard" for undergraduates to be involved in research, but this was unheard of in 1970, and someone had to come up with that idea (and I was lucky to have known the person that invented that.)

I imagine someone who has to constantly learn new things and think about them in various ways can learn something else that is simpler and still provides benefits to someone else.

I don't think that's true. I found that out teaching Algebra. One problem with me teaching Algebra is that Algebra is trivially easy for me, which means that it's often difficult and frustrating to put myself in the shoes of someone for whom it doesn't make sense.

Adding a non-teaching involvement would still mean an exceptionally overcrowded day, conceivably making it hard to spend time on family, etc.

And family is important. Most of my science/math teaching actually involves helping my kids with their homework. The other thing is that I'm going to have 100000x more impact on how my kids view science and technology than I will on someone else's kids.
 
  • #19
One problem with me teaching Algebra is that Algebra is trivially easy for me, which means that it's often difficult and frustrating to put myself in the shoes of someone for whom it doesn't make sense.

Well, that's the whole thing - one doesn't have to provide a benefit to everyone. Even if the student you're teaching stuff to is unhappy, perhaps the university you work for is happy that they kill 2 birds with 1 stone, namely they get cutting edge research done along with teaching their students for nearly nothing.

When you're teaching your kids, it's different of course, because you're teaching not out of profession but out of good will towards the kid. Ideally, the two aren't distinct, but are these things ever ideal...

What I wish is that teaching wouldn't be the one and only major thing you can do "part-time" where the rest of the time you're doing research. Realistically, only a few people seem lucky enough to get positions where they research all day doing something interesting to them. I'd think the next best thing would be to do part of each, but it looks like it's all or nothing.

it's relatively easy to prepare a class if all you are doing is to teach things the same way that you've taught it in the last decade.

One of the unfortunate things is that postdocs have to keep switching schools, which means their non-research responsibilities keep changing (both what they teach for the university and the adjustment to a new area).

If there were one thing I could point to that I wish could be eliminated, it isn't even having to teach - it's being forced out of a certain location by default after a short period.
 
  • #20
ParticleGrl said:
The far bigger threat to missing out on brilliant minds is the tremendous career uncertainty, the long training period and low payoff, etc. The labor market for science works like any other- and it should be clear that many brilliant minds turn their talents toward better prospects.

deRham said:
At some of the less competitive schools, you can have people just as great, and everyone they hire is probably quite good. But I get the feeling the people "who make it" in academia are not necessarily the "best" in any meaningful sense. A lot of extraordinarily bright people who don't want to maintain a publish or perish lifestyle amidst poor pay and no geographic certainty leave the academic job market, something I wasn't as aware of until recent years.

Not a bad thing, is it? There are many sorts of intelligence and many difficult problems that require different personalities outside of academia. "If people do not believe that mathematics is simple, it is only because they do not realize how complicated life is." – John von Neumann

The only problem is that all the good physicists went to finance and crashed the world economy :tongue2:
 
  • #21
Not a bad thing, is it? There are many sorts of intelligence and many difficult problems that require different personalities outside of academia.

It depends on what we want out of the system- if we want the absolute best physicists, it's probably a bad thing. Low wages, long training periods, and tremendous career uncertainty drive people away from a field. Science isn't immune to job market pressures, and few high-skill job markets have been worse over the last few decades.

If we want a productive system, it's probably a bad thing. The current aging professoriate/young postdoc disparity means that the overwhelming majority of science is done by the least experienced scientists. After getting a PhD under their belt and a few years' research experience, instead of hitting their stride people are forced out of their field.

If we want to avoid burning human capital, it's probably a bad thing- we are training people for jobs that largely don't exist, which is highly inefficient. Spending a decade learning physics for a career that lasts less than 5 years is enormously wasteful- we have programmers, data-miners, finance quants etc. who had to pick up a different skill set on the job and now are slowly forgetting all the physics they learned.

Now, if we believe that learning quantum field theory makes you a better computer programmer, learning thermodynamics makes you better able to price financial derivatives, or learning quantum mechanics makes you better able to predict which lower back pain injuries will cost an insurance company money, then our system makes some sense (I find all of these somewhat dubious)- but then we perhaps ought to do more to make the connections more obvious, both for the companies and the students. For whatever reason, companies want to hire programmers who spent a decade programming, not people who spent a decade learning how to calculate correlation functions.

If we want a system that produces science on the cheap, it's probably a great system. Grad students are highly skilled and only make about what a minimum-wage full-time employee would. Postdocs are advanced degree holders and make less than a median bachelor's degree holder.
 
Last edited:
  • #22
ParticleGrl said:
If we want to avoid burning human capital, it's probably a bad thing- we are training people for jobs that largely don't exist, which is highly inefficient. Spending a decade learning physics for a career that lasts less than 5 years is enormously wasteful- we have programmers, data-miners, finance quants etc. who had to pick up a different skill set on the job and now are slowly forgetting all the physics they learned.

I may be weird here, but I thought that both my undergraduate and graduate programs were *excellent* at training me for the jobs that I ended up doing. The important thing that I learned in graduate school involved very, very quickly learning a new set of skills when the situation calls for it, and that's ended up being really useful.

Most of the specific facts and theories that I learned in graduate school are now rather outdated and obsolete, but if space aliens threatened to invade, I'm pretty sure that you could parachute me to research something and I could come up with something decent very quickly.

Also one thing that I like about my job is that I really have no idea what it's going to be like in two years.

We perhaps ought to do more to make the connections more obvious, both for the companies and the students.

It's already obvious for a lot of companies. One thing that I figured out pretty quickly is that trying to convince an employer that a physics Ph.D. is useful is not useful. If they aren't already convinced, you are not going to make a difference. Now connecting Ph.D. students with companies that do like hiring Ph.D.'s is something else.

For whatever reason, companies want to hire programmers who spent a decade programming, not people who spent a decade learning how to calculate correlation functions.

Since I spent most of my Ph.D. dissertation in front of a computer trying to get astrophysical CFD models to work, that worked out well for me. YMMV. I started programming at age six with my TRS-80 Model I, so by the time I was looking for work, I could point to two decades of programming experience.
 
  • #23
AlephZero said:
... except that much of what passes for "university education" isn't education, but high level vocational training. For evidence, pick a random sample of posts on PF and analyse the questions being asked.

That's a philosophical issue, but at my alma mater the philosophy was that a deep "university education" *was* high level vocational training. Mind and Hands.

Of course there is nothing at all wrong with high level vocational training, but calling things by their right names is the first step towards understanding - even if politicians and marketing consultants usually disagree.

You'll find that "naming" is quite tricky. One thing that Chinese philosophers figured out several thousand years ago was that the power to define words is one of the core aspects of power. Also, I get this weird sense when people say "of course there is nothing wrong." If there wasn't anything wrong, then why make the point that there is nothing wrong?

It seems entirely possible that somebody will invent a more effective way to deliver training than the current model, which was invented centuries ago and often only uses 21st century technology as a replacement for 16th century technology (e.g. give a 16th century style lecture, but with a 21st century video camera pointing at the lecturer.)

The current university model in the United States only dates back to the 1950's-1960's at the earliest. A lot of the way that universities worked in 1800 would be totally unrecognizable. The concept of "grades", "curricula", and "degrees" is a 19th century invention, and the idea that most people should go to university didn't hit until the 1950's.

Also, lectures originated not because they were particularly good ways of presenting information, but because if you had one big book and no printing press, the best you could do was have people sit around copying as the reader read aloud.
 
  • #24
twofish-quant said:
That's a philosophical issue, but at my alma mater the philosophy was that a deep "university education" *was* high level vocational training. Mind and Hands.

Hand, I think?
 
  • #25
atyy said:
The only problem is that all the good physicists went to finance and crashed the world economy :tongue2:

There were a large number of physicists who were involved in *saving* the world economy. Things could have been a lot, lot worse than they were.

I knew of one astrophysics Ph.D. who was a senior manager, and before the crisis he was always talking about how important risk was, and how important it was to *think* about what we were doing and to use *common sense* to figure out what was really going on. The result of that is that when everything went pear-shaped, his company managed to be part of the solution rather than part of the problem.

People outside the industry probably have never heard of him, but I believe that he did much, much, much more good being where he was and doing what he did than if he had been an astrophysics professor.

Also there are a lot of interesting physics-like problems in finance. Most people think of money as a physical thing, and that's just wrong. Money is not a physical object, and thinking about the dynamics of money gets you into a lot of physics-like problems.

For example, suppose you have a room full of gold. Someone shows up delivering pizza. Since they won't take gold, you have to do some exchanges to get the pizza. If those mechanisms stop working then you have a problem. Quantifying this situation is something that people are very interested in right now, and if you think of money as a "weird force" or as "weird energy" then you basically use the same thought processes that you would use in thinking about physics problems.
 
  • #26
twofish-quant said:
I knew of one astrophysics Ph.D. who was a senior manager, and before the crisis he was always talking about how important risk was, and how important it was to *think* about what we were doing and to use *common sense* to figure out what was really going on. The result of that is that when everything went pear-shaped, his company managed to be part of the solution rather than part of the problem.

Ok, we excuse the astrophysicists. Have you anything good to say about the string theorists? (just kidding;)

Actually, why weren't the physicists with common sense able to prevent this in the first place? Was common sense not heeded or rewarded? Did no one predict it? Or was it predictable and unavoidable?
 
  • #27
There's an interesting Dutch documentary on the "contribution" of scientists to the financial industry: "Quants: The Alchemists of Wall Street".
You can easily find it on YouTube... No need to predict and avoid anything, they made it and they recognize it... Mad models for a mad economy...
To incriminate their background would be a bit harsh though... They used the possibilities deliberately offered by policies and regulations...
 
Last edited:
  • #29
atyy said:
Actually, why weren't the physicists with common sense able to prevent this in the first place? Was common sense not heeded or rewarded? Did no one predict it? Or was it predictable and unavoidable?

Depends on the company. If you worked in the trenches and, in 2005, came to the conclusion that your employer was being idiotic, the only thing that you could reasonably do was resign and work for someone else. As of 2005, the regulators weren't going to listen to you and neither was the press.

Unfortunately, this meant that there was a "reverse Darwin" effect as people with clue left some places which made it even easier for them to do stupid things. One problem was that the incentives were wildly skewed. The fewer annoying risk managers you hired, the more money you could make by doing stupid things.

One good thing that has come out of this is that regulators care more about what's going on now, so if you have a problem with risk management at a bank today, the regulators *will* listen to you.
 
  • #30
nazarbaz said:
There's an interesting Dutch documentary on the "contribution" of scientists to the financial industry: "Quants: The Alchemists of Wall Street".

A lot of the discussion about what physics Ph.D.'s actually do on Wall Street is seriously misinformed and wildly out of date. Also, if you build a bridge that blows up, everyone notices, whereas if you build a bridge and it just works, no one knows who you are.

In any case, I looked at the documentary, and it's a decent snapshot of what the industry looked like a few years ago, but it's painfully, painfully outdated for what people are doing today. The name of the game today is simplicity and risk management, and a lot of what a Ph.D. would do involves writing reports for the banking regulators. Now the data that the regulators want involves some horrendously nasty math. For example, you might have someone from the Federal Reserve who wants to know what happens if the stock market drops 50%, and giving him or her the answer to that question involves a ton of computational and mathematical modelling.
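To give a flavor of what that kind of scenario question involves (a toy sketch only - the portfolio, the numbers, and the single risk factor here are all made up; a real stress test covers many correlated risk factors and nonlinear instruments):

Code:
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical portfolio: 50 made-up equity positions plus a cash buffer (values in $MM).
equity_positions = rng.uniform(10, 100, size=50)
cash = 500.0

def portfolio_value(equity_shock):
    """Toy portfolio value after applying a uniform shock to every equity position."""
    return equity_positions.sum() * (1.0 + equity_shock) + cash

base = portfolio_value(0.0)
stressed = portfolio_value(-0.50)  # the "market drops 50%" scenario

print(f"Base value:     {base:10.1f}")
print(f"Stressed value: {stressed:10.1f}")
print(f"Scenario loss:  {base - stressed:10.1f}")

The real work is in deciding what the risk factors are, how they move together, and how every instrument on the books reprices under the scenario - that's where the nasty math lives.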

No need to predict and avoid anything, they made it and they recognize it... Mad models for a mad economy...

Which is utter nonsense for what people really do. If your equations are bad, then *bad things will happen*. If you model things incorrectly then *bad things will happen*.

To incriminate their background would be a bit harsh though... They used the possibilities deliberately offered by policies and regulations...

That doesn't get you off the hook. One thing that you will be involved in, directly or indirectly, if you go into finance is helping to design the regulatory structure.
 
Last edited:
  • #31
twofish-quant said:
That doesn't get you off the hook. One thing that you will be involved in, directly or indirectly, if you go into finance is helping to design the regulatory structure.

That raises a question. When derivatives were changed from being regulated under gambling laws to being unregulated, what did people in positions like yours and your senior manager's think and say when this kind of thing happened?
 
  • #32
chiro said:
That raises a question. When derivatives were changed from being regulated under gambling laws to being unregulated

Most of that happened in the late-19th century before my time.

However, unless you are very careful about regulations, they tend to be irrelevant. Suppose every country in the world outlaws derivatives except for South Elbonia. At that point what you can do (and what people do do) is to write all of the contracts in South Elbonia. One thing about money is that since everything is in cyberspace, if you want to do something that can't be done under US law, you just virtually move the money to somewhere else in the world, and do it there.

In fact, the US has (and has always had) extremely strict restrictions on derivatives. England doesn't, which is why most derivative transactions end up happening in London. Thanks to the internet, it doesn't matter, since you can physically be in NYC and then "virtually" have all of the transactions take place in London, subject to English law.

The current set of regulations are a bit different because they come from the Basel Committee. Basically all of the world's major regulators have gotten together and hammered out some principles which everyone needs to comply with. Banking regulation *has* to be done globally and with consensus from *all* of the major nations.

The technical term for this sort of thing is "regulatory arbitrage."

What did people in positions like yours and your senior manager's think and say when this kind of thing happened?

Nothing to think about, since it turns out to be largely irrelevant. US law doesn't affect transactions in London.

Funny story: London stayed a financial center after the British Empire fell specifically so that US banks could get around US banking restrictions in the 1960's. One reason that the US had to get rid of a lot of banking regulations in the 1970's was that they were largely irrelevant, since people were doing stuff in London anyway. One of the big drivers of the London banking industry was, curiously, the Soviet Union, which wanted to keep its dollars in London banks since (for obvious reasons) they didn't think it would be a good idea to have Soviet money in the US. Hong Kong became a financial center for similar reasons (i.e. Chinese Communists would rather keep their dollars there in the 1960's).

One thing about finance is that you quickly find interesting cultural and legal differences. For example, the US has a strongly moralistic culture that considers gambling (whether in a casino or a stock market) to be "sinful" whereas the English don't have that attitude. So the US and English banking systems are different in some pretty fundamental ways. The US prohibits most types of gambling, whereas English bookmakers will let you put bets on just about anything. The US prohibits selling derivatives to small investors, whereas this is done routinely in Europe where derivative securities play the same sort of role as mutual funds.

So how US law treats derivative securities turns out to be rather unimportant, because everything happens in London, and the English aren't about to change their laws just to make the US happy. Also, you could theoretically impose restrictions on moving money between the US and other countries, but that turns out to be too messy. China can do that, but if you make it hard to move dollars between different countries, then people will stop using dollars.

Also, gambling is a state issue, which means that as long as it is o.k. in New York, it doesn't matter. If NY state got suicidal and passed laws that really made it impossible to do business in NYC, then Las Vegas is going to become a financial center, and good luck convincing Nevada to ban gambling.

(Ever wonder why credit card companies are all located in South Dakota?)

Finally, international regulation is effective in some things. One reason Al-Qaeda hasn't been able to do anything is that *no one* will take AQ money. Similarly, the North Koreans got a lot nicer once it was clear that the international banking system would squeeze them. So if you are a terrorist or building nukes, then no one will do business with you. If you want to get around gambling laws, then lots of people will.
 
Last edited:

1. What is the typical day like for a professor?

A typical day for a professor varies depending on their field of study and their specific responsibilities. Generally, a professor's day is filled with a mix of teaching, research, and administrative tasks. They may have lectures or seminars to teach, meetings with students or colleagues, and time dedicated to conducting research and writing publications. They may also have administrative duties such as serving on committees or attending departmental meetings.

2. How does one become a professor?

Becoming a professor typically requires obtaining a doctoral degree in a specific field of study. This involves completing a bachelor's degree, followed by a master's degree, and then a doctoral program. After obtaining a doctoral degree, aspiring professors often gain teaching experience as a teaching assistant or adjunct professor before applying for tenure-track positions. They must also have a strong research background and a track record of publications in their field.

3. What are the benefits of being a professor?

Being a professor comes with many benefits, including job security, a competitive salary, and the opportunity to make a positive impact through teaching and research. Professors also have a flexible schedule, allowing them to balance their work and personal life. They also have access to resources and funding for their research projects and the opportunity to collaborate with other experts in their field.

4. What are the challenges of being a professor?

Being a professor also comes with its own set of challenges. The workload can be demanding, with the pressure to balance teaching, research, and administrative duties. The competition for tenure-track positions can also be intense, and the process of obtaining tenure can be stressful. Additionally, professors may face challenges with funding for their research and the pressure to publish in high-impact journals.

5. What qualities make a successful professor?

Successful professors possess a combination of academic expertise, strong communication skills, and a passion for teaching and research. They must also have excellent time management and organizational skills to balance their various responsibilities. Additionally, successful professors are dedicated to continuous learning and staying up-to-date in their field to provide their students with the most current and relevant information.
