If the education bubble collapses, will it be the end of blue skies research?

  • #1
Simfish
Gold Member
If the education bubble collapses, will it be the end of "blue skies research"?

Okay, I know that this topic can be very sensitive to some people, and I really do not want to offend anyone. But it is something that concerns me, as someone who's interested in astrophysics, and I would like to think about potential changes before they arise.

This concern rests, however, on the assumption that professors earn a considerable portion of their income from undergraduate tuition - that is, from teaching salaries. While much of a university's income also comes from state funding, support for that funding rests largely on the premise that the university educates the state's citizens. But when the university is no longer deemed necessary for educating the state's citizens, much of this state funding might dry up too.

If this is not the case, then the collapse of the education bubble may not be so much of a concern. There are grants, of course, but grants are mostly supplementary income rather than primary income.

So what I'm thinking is this - there is a distinct possibility that the university (the education bubble, so to speak) might collapse in the future, because online learning (and knowledge testing) will become much cheaper and more readily available (U Phoenix is hugely overpriced, and there are already models coming up that are far cheaper than U Phoenix - in fact, many smart people can learn most of the required material for many classes [even if not all] by just reading the textbook and doing the exercises). This may also result in cheaper systems of certification (which are like AP tests, but applied to college subjects).

Many people say that online learning/self-studying is "not as good" as regular instruction, but the vast majority of students who go to college won't pursue academic research in their major, in which case employers might not care as much about whether or not they got a "proper education". Grades are a means of signaling a combination of innate ability, knowledge, and conscientiousness - and some economists believe that this signaling is why employers demand college degrees as of now (even though many workers don't end up using the skills they learned in college).

But when innate ability, knowledge, and conscientiousness can be signaled through means other than the university, then many people will pursue less costly alternatives and the very model of the university may collapse (the Ivies and top universities may still survive, but a lot of the state universities may collapse). It only takes a critical threshold of competent workers who didn't go to school to make a considerable number of employers stop demanding university degrees. And once this happens, many people will just end up not going to college (and pursue internships, online education, and some self-study instead).

And if this happens, I'm concerned that it may be the end of a lot of "blue skies" research. Or research that comes largely without economic application. This is especially true for astrophysics. Of course, some people in certain subfields of astrophysics can easily find another field that uses their skills. But this is easier for some subfields (astrostatistics, computational astrophysics) than it is for other subfields.
 
Last edited:
  • #2


I think this is a long way off. Someone still has to be at the other end of the computer, right? I mean, they don't let just anybody run online classes.

And besides, a lot of speculative research is covered under Dept. of Energy/National Science Foundation grants. At least, that's where my paycheck comes from :)
 
  • #3


Right. Jocelyn Bell was a professor at the Open University, which was way ahead of Phoenix.
 
  • #4


Simfish said:
Okay, I know that this topic can be very sensitive to some people, and I really do not want to offend anyone. But it is something that concerns me, as someone who's interested in astrophysics, and I would like to think about potential changes before they arise.

This concern rests, however, on the assumption that professors earn a considerable portion of their income from undergraduate tuition - that is, from teaching salaries. While much of a university's income also comes from state funding, support for that funding rests largely on the premise that the university educates the state's citizens. But when the university is no longer deemed necessary for educating the state's citizens, much of this state funding might dry up too.

If this is not the case, then the collapse of the education bubble may not be so much of a concern. There are grants, of course, but grants are mostly supplementary income rather than primary income.

So what I'm thinking is this - there is a distinct possibility that the university (the education bubble, so to speak) might collapse in the future, because online learning (and knowledge testing) will become much cheaper and more readily available (U Phoenix is hugely overpriced, and there are already models coming up that are far cheaper than U Phoenix - in fact, many smart people can learn most of the required material for many classes [even if not all] by just reading the textbook and doing the exercises). This may also result in cheaper systems of certification (which are like AP tests, but applied to college subjects).

Many people say that online learning/self-studying is "not as good" as regular instruction, but the vast majority of students who go to college won't pursue academic research in their major, in which case employers might not care as much about whether or not they got a "proper education". Grades are a means of signaling a combination of innate ability, knowledge, and conscientiousness - and some economists believe that this signaling is why employers demand college degrees as of now (even though many workers don't end up using the skills they learned in college).

But when innate ability, knowledge, and conscientiousness can be signaled through means other than the university, then many people will pursue less costly alternatives and the very model of the university may collapse (the Ivies and top universities may still survive, but a lot of the state universities may collapse). It only takes a critical threshold of competent workers who didn't go to school to make a considerable number of employers stop demanding university degrees. And once this happens, many people will just end up not going to college (and pursue internships, online education, and some self-study instead).

And if this happens, I'm concerned that it may be the end of a lot of "blue skies" research. Or research that comes largely without economic application. This is especially true for astrophysics. Of course, some people in certain subfields of astrophysics can easily find another field that uses their skills. But this is easier for some subfields (astrostatistics, computational astrophysics) than it is for other subfields.

This is an incredibly perceptive post.

To your initial assumption: You are basically correct. However, there is a subtle aspect. American universities have evolved over the past centuries toward the following model: undergraduate education follows the OxBridge model, while graduate education follows the German model.

This topic is well explored in "Academic Charisma and the Origins of the Research University"

https://www.amazon.com/dp/0226109224/?tag=pfamazon01-20

This has led to two different *financial* models as well. In the main, tenure-track and tenured faculty receive most or all of their salary from the school (however the school gets its money - that's a *whole* 'nuther issue), while non-tenure-track (i.e. 'research') faculty are generally paid 100% off of grant money - what is known as 'soft money'. Graduate students (ones that receive a stipend and/or tuition) are paid off of 'soft money'.

The essence of your thesis (possibility that the university [...] might collapse in the future, because online learning [...is] much cheaper and more readily available) is correct as well- in order for the Business of Universities (and it is very much a *business*) to remain competitive with online content, the business model will have to change.

One strategy is "if you can't beat 'em, join 'em"- the University attempts to cut costs by replacing classes with online versions.

Another is vague arguments about 'quality of educational experience' (for example, on-campus social organizations) as a way to claim a 'value added' component.

The point is, all Universities are trying to figure out ways to compete with organizations like U Phoenix (which is for-profit, BTW - Universities are non-profits), and nobody has a good solution yet. Certainly some Universities and Departments are more 'vulnerable' than others.

That said, 'blue sky' research is 100% (+/- 5%) funded by the US government, and generally forms a small fraction of the total financial picture of any particular University- certainly more is better (and people living off 'soft money' are constantly vulnerable to termination). But as far as the University is concerned, grant dollars are about 'prestige', not a means of income.

So your concern about (say) astrophysics research should be directed to the US Congress - the people who write the checks. It sounds silly, but writing your elected representatives and telling them about *your project* - how many jobs are supported off it, how awesome it is, etc. - is highly effective. The rep (most likely) will not read your letter, but they have an employee whose job it is to keep them educated about science issues, and that person *will* read the letter - you may even get a response back.

Universities are not going away (although individual Universities do close), but Universities in the future are going to look different than they do today. Just as they look different now compared to 50 years ago.
 
  • #5


Simfish said:
So what I'm thinking is this - there is a distinct possibility that the university (the education bubble so to speak) might collapse in the future, because online learning (and knowledge testing) will become much cheaper and more readily
available.

The big fallacy here is that online education is cheaper. It's not. Online education is different, but to do it right, it's quite a bit more expensive. Universities that have gone into online education with the expectation that it will save money have gotten their heads handed back to them.

(U Phoenix is hugely overpriced, and there are already models coming up that are far cheaper than UPhoenix - in fact - many smart people can learn most of the required material for many classes [even if not all] by just reading the textbook and doing the exercises).

And I'm pretty sure that most of those cheaper models won't work.

Yes, you *can* learn a lot of material by reading a textbook, but in order to learn deep knowledge you need a huge amount of social interaction. Also, once you read the textbook, how are you going to get that knowledge into a form that you can put on a resume so that you can make money off of it?

I'm pretty sure that all of the raw material for a physics degree is online, but getting all of that raw material into a finished product is hard and expensive.

This may also result in cheaper systems of certification (which are like AP tests, but applied to college subjects).

Certification is quite expensive. ETS is as big a cash cow as University of Phoenix is. The thing about ETS is that because they use standardized tests, they can spread the costs of certification across huge numbers of people, so the cost per student ends up being low. However, there are limits to how far this will work. There are some things that just can't be certified using standardized tests, and once you have a standardized test, the bureaucracy involved can be quite high.

Many people say that online learning/self-studying is "not as good" as regular instruction, but the vast majority of students who go to college won't pursue academic research in their major, in which case employers might not care as much about whether or not they got a "proper education".

Personally, I think that a properly done system of online learning/self-studying can be *better* than the traditional educational model, but doing it right is expensive, and part of the difficulty in getting a viable system is to figure out how to get the financing to work.

What employers do care about is branding. They want a word on the resume that they can use to quickly sort potential employees. Also universities do a lot more than teach. Part of the reason that people pay large amounts of money to Harvard for an MBA is that Harvard will market you.

But when innate ability, knowledge, and conscientiousness can be signaled through means other than the university, then many people will pursue less costly alternatives and the very model of the university may collapse (the Ivies and top universities may still survive, but a lot of the state universities may collapse).

If those less costly alternatives exist. I'm not convinced that they do. Also I'm not worried too much about state universities, because they usually have a huge amount of legislative backing, and in some situations, they have been able to convince people that they are essential drivers of economic growth. UTexas Austin is the core of the research environment of Austin and everyone knows it.

It only takes a critical threshold of competent workers who didn't go to school to make a considerable number of employers stop demanding university degrees.

I don't think this is going to happen soon.

And if this happens, I'm concerned that it may be the end of a lot of "blue skies" research. Or research that comes largely without economic application. This is especially true for astrophysics. Of course, some people in certain subfields of astrophysics can easily find another field that uses their skills. But this is easier for some subfields (astrostatistics, computational astrophysics) than it is for other subfields.

There is no such thing as highly paid research without economic or military applications.

People pump huge amounts of money into astrophysics for a reason, and it's not because of pure love of learning. The fact that the ability of the US to dominate global politics depends crucially on understanding how to make bombs based on nuclear fusion, and on preventing other people from making such bombs, means that you are going to get a lot of funding in astrophysics for a long time.
 
  • #6


Andy Resnick said:
American universities have evolved over the past centuries toward the following model: undergraduate education follows the OxBridge model, while graduate education follows the German model.

I think most of it happened post-WWII.

The essence of your thesis (possibility that the university [...] might collapse in the future, because online learning [...is] much cheaper and more readily available).

Except that online learning *isn't* cheaper. The main thing about online education is that it allows people to communicate and it makes static information less important. In order to get someone to pay money, you have to provide *dynamic* information and that usually involves hiring a human being.

Now it may be possible to incorporate the internet into different business models, but that's something quite different.

The point is, all Universities are trying to figure out ways to compete with organizations like U Phoenix (which is for-profit, BTW - Universities are non-profits), and nobody has a good solution yet.

I don't see MIT, Harvard, or UT Austin as being any less non-profit than University of Phoenix.

But as far as the University is concerned, grant dollars are about 'prestige', not a means of income.

It really depends on the university. Most of MIT's income comes from sponsored research and industrial programs. Undergraduate education is a side-line. Different universities have different priorities.

So your concern about (say) astrophysics research should be directed to the US Congress - the people who write the checks. It sounds silly, but writing your elected representatives and telling them about *your project* - how many jobs are supported off it, how awesome it is, etc. - is highly effective.

Personally, I don't think that writing letters is all that effective. Writing ten thousand letters could be, but that involves a large degree of organization, which fortunately exists.

In the case of astrophysics, I wouldn't be too worried. One of the jobs of a senior scientist is to go to Washington D.C. and meet with Congressmen to lobby for more funding. One important part of your graduate education is to start to understand the process. One of the functions of a Nobel prize winner is to be a "rainmaker." You hire a Nobel prize winner because if a Nobel prize winner wants to meet with a Congressman, the Congressman is probably not going to say no.

Personally, I think it would be more effective if instead of writing letters yourself, you get more actively involved in the professional societies (AAS). AAS and AIP have very well oiled lobbying machines with offices in Washington DC. Something that I think that every graduate student should do is to attend AAS and AIP conferences even if they have to pay their own plane ticket.
 
  • #7


twofish-quant said:
I don't see MIT, Harvard, or UT Austin as being any less non-profit than University of Phoenix.

We've been over this: MIT is chartered as a non-profit organization and is owned and governed by a privately appointed board of trustees known as the MIT Corporation. UT Austin is a branch of the State of Texas.

This means those organizations are exempt from paying income tax, and contributions to them are tax-deductible.

http://vpf.mit.edu/site/tax_services/policies_procedures/tax_overview
http://runningofthehorns.com/old/07-08/docs/501c3_info.pdf

http://en.wikipedia.org/wiki/University_of_Phoenix
 
Last edited by a moderator:
  • #8


Andy Resnick said:
We've been over this: MIT is chartered as a non-profit organization and is owned and governed by a privately appointed board of trustees known as the MIT Corporation. UT Austin is a branch of the State of Texas.

Which as far as I'm concerned is an organizational detail that tells very little about the internal politics and motives of the organizations. It's quite possible for a non-profit to have a for-profit subsidiary, and it's also quite possible for a for-profit to effectively control a non-profit.

To name examples of for-profit subs of non-profits: Duke Corporate Education, NYUOnline, and Fathom. There's also the Chauncey Group, which is the for-profit arm of ETS. To name examples of non-profit subs of for-profits: pretty much every Fortune 500 corporation has some associated non-profit.

Also, there is no rule, legal or otherwise, that a university can't be a for-profit corporation, which is an interesting contrast to doctors and lawyers, who can't legally form for-profit corporations.
 
Last edited:
  • #9


University research won't disappear within the time frame of your working career. So I wouldn't sweat it.

Faculty jobs could become fewer. But, as mentioned above, most research is fully funded by the government. The universities take ~25% of that funding from the researchers right off the top. A large part of what's left goes toward graduate students (it pays their tuition, stipends, and such). So universities typically want as much research money as possible coming through.

But I totally agree with the overall irony of Universities charging more and more for a product (information) that is becoming increasingly free. I think for many people college is more about the credential than the knowledge though.
 
  • #10


twofish-quant said:
Which as far as I'm concerned is an organizational detail

It's a *huge* difference- a fundamental difference in the organization and structure of a business. Someone involved in business should understand that.
 
  • #11


diggy said:
I think for many people college is more about the credential than the knowledge though.

College is all about the credentials and networking. If teachers were truly necessary for the learning process, then there would be way more money going into R&D for education and teaching methods, but this isn't the case. Another problem in countries like the USA is that as GDP per capita increases, the potential wages for good teachers drop, and as a result, talented people end up migrating to other fields.
 
  • #12


diggy said:
The universities take ~25% of that funding from the researchers right off the top. A large part of what's left goes toward graduate students (it pays their tuition, stipends, and such). So universities typically want as much research money as possible coming through.

That's not exactly correct. My proposed budgets have a line "indirect costs". "Direct costs" represent 'my' money- as a point of fact, it's *all* the institution's money, but for practical purposes, direct costs are the monies that I can spend as I wish (salary, graduate tuition and stipend, supplies, travel, etc) in carrying out the covered research.

Indirect costs are also referred to as 'overhead', and are charged as a percentage of the direct costs. Here at Cleveland State, the negotiated indirect cost rate for federal grants is 42% of the modified total direct costs. Other agencies (state, local, foundations, etc.) pay a different indirect rate, generally a much lower one - say 10%. Some federal training grants also have a much lower indirect cost rate.

What the institution does with the indirect is largely up to them. Here, 5% of the indirect is returned back to me, 15% goes to my Department, 10% goes to my College, 20% goes to the Provost's office for research, and 50% goes to the University.

As an example, I'm submitting a proposal that over 5 years asks for $367k in direct costs (personnel (16% of my time, 1 FT grad student, 2 PT undergrad students) is about 1/2, supplies the other 1/2), and thus also bills the government for $135k over 5 years as the indirect cost.

In theory, the indirect covers the cost to the university for my research- utilities, rent, support personnel (janitors, secretaries, administrative staff, etc.), but I've heard that the break-even point requires annual direct costs of around $750k *per lab*. Detail-oriented bean counters typically work with units of "$/sq. ft" in determining the 'efficiency' of a research program. Consequently, university administration typically sees research grants as a money *loser*- hence the need to justify research in terms of 'prestige'.

This can often become a source of friction between administration and faculty- and for those faculty totally dependent on grant support for their salary, a huge source of stress.
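To make the arithmetic above concrete, here is a minimal Python sketch of the accounting described in this post. The 42% rate, the 5/15/10/20/50 indirect split, and the $367k/$135k example come from the post itself; the function names and print statements are purely illustrative.

```python
# Sketch of the grant accounting described above. Rates and the
# $367k/$135k example are from the post; the helper names are mine.

def indirect_cost(overhead_base, rate=0.42):
    """Overhead billed on top of the direct costs that fall in the base."""
    return overhead_base * rate

def split_indirect(indirect):
    """Distribute collected indirect the way the post describes."""
    shares = {
        "PI": 0.05,
        "Department": 0.15,
        "College": 0.10,
        "Provost (research)": 0.20,
        "University": 0.50,
    }
    return {unit: round(indirect * frac) for unit, frac in shares.items()}

# The post's 5-year proposal: $367k direct, $135k billed as indirect.
# $135k is less than 0.42 * $367k because some direct charges (tuition
# remission, equipment, etc.) are excluded from the overhead base.
print(round(indirect_cost(367_000)))   # 42% of the full direct amount
print(split_indirect(135_000))         # who gets the $135k indirect
```

Note the gap between the two numbers: the billed indirect is computed on the *modified* total direct costs, not on every direct dollar.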
 
  • #13


Thanks for all the replies and detailed responses, everyone!

This is an incredibly perceptive post.

Thanks, I appreciated that comment.

Yes, you *can* learn a lot of material by reading a textbook, but in order to learn deep knowledge you need a huge amount of social interaction.

I think one thing is this: can you be sure that someone has gotten deep knowledge by going to a university (especially a state university)? Many people effectively end up studying by themselves, and there usually is not that much communication between student and professor (in fact, it's quite common for professors to say that office hours are frequently underutilized). Ideally, one might gain a lot of deep knowledge if one did everything perfectly. But a significant number of students don't really get much more out of a class than they'd get from simply reading the textbook (and this is often true for the top students, too, many of whom can't find many students to study with - at least in the state universities). Of course, it might just be that employers expect a university degree to predict a "higher level of deep knowledge", on average (even if many students don't obtain it). I'm quite unfamiliar with the working world, though, so I don't have much of an idea of how much employers value "deep knowledge".

Another thing is: what if online communities were set up for people who were taking similar courses? (in which case there may not even be a salary that needs to be paid, as everyone is voluntarily doing the same thing) Obviously, there are currently many barriers that make them unequal to the "real thing" (as a result, the Facebook groups for communal studying of MIT OCW and Stanford Engineering Everywhere courses are almost always pretty much empty). I think most of us would agree that even Physics Forums can't really be equal to the "real thing". Of course, it's not totally inconceivable that this may change in the future (although different people will probably disagree on the probability of this happening within X years).
 
  • #14


Simfish said:
can you be sure that someone has gotten deep knowledge by going to a university

That's a very important question to ask.

First, let's agree on a definition of "deep knowledge". When I hear or use the term, I think of understanding the underlying conceptual foundations of (say) physics, which may or may not be correlated with test/homework performance. That is, I assume it is possible that someone may test poorly on exams yet possess adequate comprehension of the material.

Here's the basic problem: If you assume that standard tools of learning assessment (tests, homework, etc.) aren't good tools for measuring "deep knowledge", then you have to develop an assessment tool that can measure the quality you seek: comprehension. David Hestenes developed something called the Force Concept Inventory [http://en.wikipedia.org/wiki/Concept_inventory], which he claims does in fact measure 'comprehension'.

That's all well and good, but it is an open question whether his claim is true. One could argue that the faculty participants were simply teaching to a different test, rather than producing a significant change in student comprehension.

So personally, my answer to your question is "I don't know if you can." Again personally, I am working to develop my own teaching method in an attempt to change my answer to "Yes!", and I'm lucky to be around some Master Teachers who are willing to help me.
 
Last edited by a moderator:
  • #15


Wow, the Force Concept Inventory looks very interesting. Well, I don't know about other classes, but in physics, at least, there isn't a lot of room for most professors to teach in a way that helps maximize "deep knowledge". Instead, most of the professors I know tend to teach things that could easily be picked up by reading the textbook instead, which is why I've often felt that for the majority of my classes, I could do just as well, if not better, simply by self-studying everything and asking questions of professors or on Physics Forums. The ideal approach might be to have students self-study what's in a book, and then come to class for the "deep knowledge" parts of learning (instead of rephrasing the book's material in class to make it easier to understand, which can just as easily be done with a Demystified/Schaum's Outline book). Of course this might take more time for the student (as it is, many students don't even read the textbook, but simply go to lectures to learn what's in it), but each individual class would be more rewarding in return.

There are definitely some universities that are changing their approach to undergrad education, though (MIT, in particular).
 
Last edited:
  • #16


Andy Resnick said:
It's a *huge* difference- a fundamental difference in the organization and structure of a business. Someone involved in business should understand that.

My personal observation, having been in both for-profits and non-profits, is that it makes much less of an organizational and structural difference than it first seems.

Part of the reason for this is that in large corporations, the shareholders have very little power, and the main power is in the senior management. Similarly in most universities, even though the board of trustees has legal power over the institution, the actual power is in the faculty, which as far as I can tell act pretty much like senior management in large corporations.

Again this is personal observation, and reasonable people can disagree with it.
 
  • #17


Simfish said:
I think one thing is this: can you be sure that someone has gotten deep knowledge by going to a university (especially a state university)?

You can't, but you can be sure that they were able to get up, go to class, figure out the bureaucratic system, take orders, submit papers, and play the game. In short, if you have someone who has earned a bachelor's, you can be reasonably sure that they can "play the game" needed to be a good little cog in the corporate machine.

This is not a small thing.

Many people effectively end up studying by themselves, and there usually is not that much communication between student and professor

True. However, there is a huge amount of communication between students, which is why it's useful to have communities of students in the same location. Also, administering a class is one of those things that requires a *LOT* of work, but a lot of that work happens behind the scenes.

The thing about well run universities and classes, is that you usually don't notice how well run they are, because things just work. However, once you step behind the scenes you find out how difficult it is.

Another thing is: what if online communities were set up for people who were taking similar courses? (in which case there may not even be a salary that needs to be paid, as everyone is voluntarily doing the same thing)

That would be quite good. However it's a lot harder than it sounds. The problem is that within a university, the online community is at best a supplement to courses that are already taught. Once you go between universities, it's very hard to get people studying the same thing to form a critical mass because the curricula are different enough to make that very difficult.

There are ways around this problem.

I think most of us would agree that even Physics Forums can't really be equal to the "real thing".

I don't think that face to face courses are necessarily more "real" than online courses.

The thing is that online education is difficult (just like face-to-face education is difficult).

The other thing is that it's not either/or. It's possible to weave online interaction and face-to-face interaction in new ways. For example, one thing that University of Phoenix has done is to set up massive numbers of satellite campuses.
 
  • #18


@Andy. I was just throwing out ballpark numbers; of course your particular ones will vary. Btw, $750k direct funding maths out to ~$500k to be paid to the school. $500k per lab per year sounds awfully high (ever consider renting? :P). But if that number is right, then research is a financial loser, but it also means the university is incredibly inefficient. If that number includes your salary and benefits, it might not be completely crazy though.
 
  • #19


diggy said:
@Andy. I was just throwing out ballpark numbers; of course your particular ones will vary. Btw, $750k direct funding maths out to ~$500k to be paid to the school. $500k per lab per year sounds awfully high (ever consider renting :P ). But if that number is right, then research is a financial loser, but it also means the university is incredibly inefficient. If that number includes your salary and benefits it might not be completely crazy though.

It's a tricky topic. But recall that the 42% applies to some federal grants only; there are lots of different grant dollars coming in, so the average indirect rate may be closer to, say, 20%. And on top of that, not all direct charges are subject to overhead:

The base (the direct dollar amount to which the F&A rate is applied) will now include all costs with the exception of:
1. Any amount over the first $25,000 of each subcontract (subrecipient agreement) issued by Cleveland State University;
2. Nonexpendable equipment with an acquisition cost of $2,500 or greater and a useful life of more than one year;
3. Capital expenditures (buildings, alterations and renovations);
4. Patient care;
5. Tuition remission;
6. Rental/maintenance of off-site facilities;
7. Scholarships; and
8. Fellowships.
Our new federal rates are:
• On Campus: 42% MTDC, 7/1/08 - 6/30/12
• Off Campus: 24% MTDC, 7/1/08 - 6/30/12
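
To make the MTDC arithmetic concrete, here is a minimal sketch of how an F&A ("indirect") charge is computed under the quoted 42% on-campus rate. The budget line items and dollar amounts are hypothetical illustrations; only the rate, the $25,000 subcontract cap, and the exclusion categories come from the policy text above.

```python
ON_CAMPUS_RATE = 0.42  # federal on-campus F&A rate quoted above

# Direct-cost line items for a hypothetical grant budget (dollars).
budget = {
    "salaries_and_benefits": 300_000,
    "supplies": 40_000,
    "equipment_over_2500": 60_000,   # excluded from the MTDC base
    "tuition_remission": 25_000,     # excluded from the MTDC base
    "subcontract": 100_000,          # only the first $25,000 is included
}

# Categories excluded entirely from the MTDC base, per the policy list above.
EXCLUDED = {"equipment_over_2500", "tuition_remission"}
SUBCONTRACT_CAP = 25_000  # only the first $25k of each subcontract counts

def mtdc_base(items):
    """Modified Total Direct Costs: direct costs minus the exclusions."""
    base = 0
    for name, amount in items.items():
        if name in EXCLUDED:
            continue
        if name == "subcontract":
            base += min(amount, SUBCONTRACT_CAP)
        else:
            base += amount
    return base

base = mtdc_base(budget)
indirect = base * ON_CAMPUS_RATE
total_award = sum(budget.values()) + indirect

print(f"MTDC base:   ${base:,.0f}")        # $365,000
print(f"Indirect:    ${indirect:,.0f}")    # $153,300
print(f"Total award: ${total_award:,.0f}") # $678,300
```

Note how the effective overhead on the *total* direct costs ($525k) works out to about 29%, not 42%, because equipment, tuition remission, and most of the subcontract are excluded from the base.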

Following the flow of money in a larger organization is very tough to do, but the system is not that inefficient. What's been happening over the past 20-30 years (the major change occurred during the doubling of NIH's budget) is that administrators have pushed more and more costs onto the PI and Departments. I don't mean having lab assistants and whatnot, I mean core functions for the school- faculty salary, tuition for *all* admitted students, Xeroxing, phone bills, recruiting and startup costs... the trend is to make each department 'self sustaining'.

While the money was rolling in, that was fine. But now, with success rates hovering around 10% for an NIH R01 and around 15-20% generally, many institutions are finding that they have grown too fast to sustain their own operations. So now there are cases like a successful non-tenure-track researcher pulling in $300k-$400k/year in grant dollars and getting no reward for it (non-tenure means they stay only as long as the money comes in to pay their own salary), while their indirect costs are used to subsidize tenured (and tenure-track junior) faculty who are not pulling in grants, for whatever reason. This is not good for morale.

The bottom line is that many institutions badly miscalculated: education is a money-loser (which justifies the need for institutions to be non-profits), and grant dollars were seen as 'free income' which they could use to grow. Over time, some institutions became dependent on this income to pay for core functions and are now feeling some pain- which trickles down to the faculty, who are pressured to write more and more applications to chase after the finite dollars.
 
  • #20


twofish-quant said:
I don't think that face to face courses are necessarily more "real" than online courses.

I suppose slightly off-topic but I can't imagine O-Chem Lab being less real than Online O-Chem Lab.

Though I suppose a more general question, how do online universities prepare people for research or for graduate schools as a whole? How would one obtain research experience if there are no labs or anything of the sort? I suppose there are still REU's, but still.
 
  • #21


Andy Resnick said:
The bottom line is that many institutions badly miscalculated: education is a money-loser (which justifies the need for institutions to be non-profits), and grant dollars were seen as 'free income' which they could use to grow.

I think that this has to be amended a bit. Education can be a tremendous money-maker if you do it right. Witness University of Phoenix, DeVry, and Sylvan Learning Systems. Harvard B-School is rolling in cash.

I do agree that research universities have badly miscalculated. What happened was that because they were focused on things that generated income from grants, they tended to neglect things like general undergraduate education. If you are research-focused, then teaching Algebra I becomes something of a distraction.

The fact that most research universities neglected this space left the field open for the for-profit universities. For all its faults, University of Phoenix knows how to teach Algebra I to 40-year-olds.

The other problem is the Malthusian problem. Professors create Ph.D.'s that create professors that create Ph.D.'s. As long as you have expanding budgets, then that works. Once budgets stop expanding, things fall apart.

Over time, some institutions became dependent on this income to pay for core functions and are now feeling some pain- which trickles down to the faculty, who are pressured to write more and more applications to chase after the finite dollars.

This also has some effects on the culture of academia. I know someone who is a high-powered dean of a major university, and if you google her name, you find out that she has an MBA and there are lots of videos of her giving internal talks on grantsmanship. On the one hand, that's her job and she is frighteningly good at writing grant proposals, but on the other hand, I do think it's a little sad that the videos are of her talking about the NIH grant process and how to organize the institution to get government money rather than about things like curing cancer or fighting tropical diseases in the third world.

The problem with this is that you end up with an arms race. If you don't become a mercenary university, then you get pushed out and end up in a death spiral. If I were faculty at "Nice-Guy University" and didn't fight "Mercenary University" for NIH funding, I'd get plastered by Dean Whatshername. So I end up having to hire lobbyists and high-powered rainmakers.

The statement that I made was that universities and corporations are more similar than they seem. I didn't say that I thought that this was a good thing.
 
  • #22


l'Hôpital said:
I suppose slightly off-topic but I can't imagine O-Chem Lab being less real than Online O-Chem Lab.

It is. If you want to learn physics or chemistry or electronics, you will at some point have to handle real chemicals, instruments, and electronic parts. There is no substitute for having a real object in your hand and trying to get it to work.

There are some reasons why:

1) five senses. A lot of learning science involves doing things that involve all of the senses, sometimes in very subtle ways. You can sometimes tell that something is going right or wrong by how something feels or smells.

2) the real world is messy. The thing about simulations is that they are clean. Things just don't go wrong, whereas the importance of laboratory work is that 90% of the time, something will go wrong. The part doesn't fit. Someone left out a crucial part of an experiment. etc. etc.

So you do very much need "real" experience. However for the purposes of this discussion a lecture is no more "real" than an online interaction.

Though I suppose a more general question, how do online universities prepare people for research or for graduate schools as a whole?

They don't. The major online universities aren't designed for this. I'm not aware of any online program in the United States that is intended to prepare students for graduate work in physics, and for various (good) reasons they aren't trying.

This isn't to say that it can't be done. It's just to say that it hasn't, and I don't see much demand for this in the United States.

I do think that you may see some very interesting things coming out of China and India which has a very different environment.

How would one obtain research experience if there are no labs or anything of the sort? I suppose there are still REU's, but still.

You need labs. The goal is to figure out how all of the administrivia works.
 
  • #23


twofish-quant said:
I do agree that research universities have badly miscalculated. What happened was that because they were focused on things that generated income from grants, they tended to neglect things like general undergraduate education. If you are researched focused, then teaching Algebra I becomes something of a distraction.

Exactly.


twofish-quant said:
The other problem is the Malthusian problem. Professors create Ph.D.'s that create professors that create Ph.D.'s. As long as you have expanding budgets, then that works. Once budgets stop expanding, things fall apart.

I sort-of agree with this, but there are a lot of caveats. For example, my Department does not grant PhDs. Remember, graduate training is based on the guild model: apprentice, journeyman, master. There are obvious employment differences between carpenters and physicists, but the underlying training model is the same.
 
  • #24


Andy Resnick said:
Remember, graduate training is based on the guild model: apprentice, journeyman, master. There are obvious employment differences between carpenters and physicists, but the underlying training model is the same.

I think the point is that most carpenters (masters) don't need to train apprentices to make a living. Their "master" status just means they are very good carpenters that will serve their customers well. Only some of them will take apprentices.
The problem with a professor is that having graduate students is part of the core job.
Professors who never had PhD students aren't "real professors." Master carpenters who never had any apprentices are nevertheless full-fledged carpenters.
 
  • #25


vanesch said:
Professors who never had PhD students aren't "real professors".

That is totally untrue, and borderline offensive.

Edit- I have more to say. You have a 'mentor' badge, comments like what you wrote are completely inappropriate.
 
Last edited:
  • #26


Andy Resnick said:
That is totally untrue, and borderline offensive.

Unfortunately questions like who is a "real" professor are determined largely by social consensus, and the belief that somehow someone that ends up teaching lots of Ph.D. students is "better" than someone that is really good at teaching Algebra I, is part of the "social truth" of academia. It may be taboo or even offensive to mention that explicitly, but when it comes time to figure out who has power and who doesn't, it really comes out what the "social truth" really is.

I suppose in an abstract sense Dean Whatshername isn't any "better" than me, even though she has landed several major NIH grants and a CV that's twenty pages long with several hundred publications in top journals, whereas my claim to academic glory is that I got good reviews from my Algebra I students at University of Phoenix. But somehow I don't think that is the "social truth," which may explain why everyone wants to get NIH grants while teaching Algebra I is left to adjuncts.

Edit- I have more to say. You have a 'mentor' badge, comments like what you wrote are completely inappropriate.

Personally, I'd rather people explicitly state "ugly social truths" of academia than to have them unstated but implicit. Even if no one states things explicitly, you just find that stuff happens.
 
  • #27


twofish-quant said:
Personally, I'd rather people explicitly state "ugly social truths" of academia than to have them unstated but implicit. Even if no one states things explicitly, you just find that stuff happens.

I'm sorry, but what vanesch wrote was not true, much less an 'ugly social truth'. What s/he wrote is just plain ugly and demeaning. Not just to me (who is in a non-Ph.D. granting department), but also to the students.
 
  • #28


This is called the "knowledge century," meaning that long-term research will end; we will not see any 20-year projects from now on.

Any research that doesn't produce immediate income on its completion will be considered a kind of luxury.

This was anticipated to start around 2000.

Example: some of IBM's long-term projects were canceled between 1999 and 2005.

Best
 
  • #29


Andy Resnick said:
That is totally untrue, and borderline offensive.

Edit- I have more to say. You have a 'mentor' badge, comments like what you wrote are completely inappropriate.

Sorry you took offense; I didn't mean to be offensive. It was my impression, and I stand to be corrected, that most university professors who want some "importance status" also want to have (several) graduate students, and that if they don't, they are in general *considered* some kind of "second-rank" professors.
Maybe this is not the case everywhere, but in fact I don't know of any full professor who didn't have, and who doesn't want to have, PhD students. Most of those I know are eager to have some. I admit I don't know every academic institution on Earth :smile: so this is anecdotal information.

I didn't mean to say that professors who don't have graduate students are *actually* bad professors, bad teachers, or bad scientists, but rather that the system is such that part of their status is measured by them having several graduate students.
 
  • #30


twofish-quant said:
Unfortunately questions like who is a "real" professor are determined largely by social consensus, and the belief that somehow someone that ends up teaching lots of Ph.D. students is "better" than someone that is really good at teaching Algebra I, is part of the "social truth" of academia. It may be taboo or even offensive to mention that explicitly, but when it comes time to figure out who has power and who doesn't, it really comes out what the "social truth" really is.

This is indeed what I wanted to say. I didn't mean to say that people who are professors, who like to teach first- or second-year courses, and who don't take PhD students, are ACTUALLY bad professors, but I did have the impression that having several PhD students is part of one's social status within academia.
What I wanted to stress was that this is the main difference from the "carpenters" in the guild way of working: the guild system of education can reach a steady state, because only a small fraction of the people attaining "master" status actually express a desire to take on apprentices (while the majority of them "just do their carpenter job" as full-fledged masters); in academia, by contrast, "taking on apprentices" is a social necessity for "masters," which can only be sustained by exponential growth of the field.
 
  • #31


As a current undergraduate student, I get the impression that the academic system places more prestige on professors who focus heavily on research and publishing than on professors who focus more on teaching students. I do not know how much undergraduate teaching research-oriented professors have to do (not to mention the graduate students who have to TA), but I certainly do not want to be taught by people whose primary interest is publishing or research.
 
  • #32


vanesch said:
This is indeed what I wanted to say.

Ok, fair enough. I mean, I have students in my lab now, and of course I want more.

There are as many different reasons to have students in the lab as there are faculty. But the 'social' reason is very simple to understand:

Q1) What is the research criterion for the award of tenure?
Q2) What objective evidence can you present to justify your answer to Q1?

A1) A growing and self-sustaining research program.
A2) the presence of students, paid for by research grants.

Every business owner can understand this; the owner that can build up a business is considered a successful businessman. Same for carpenters- which carpenter will have more 'social' prestige; the one who builds beds by himself, or the one who has a whole team of apprentices cranking out beds?

The point is, there is not much difference between any professional training program- they use similar training models, and have similar criteria for identifying success. All faculty start out the same- with no students.
 
  • #33


Mathnomalous said:
As a current undergraduate student, I get the impression that the academic system places more prestige on professors who focus heavily on research and publishing than on professors who focus more on teaching students. I do not know how much undergraduate teaching research-oriented professors have to do (not to mention graduate students that have to TA) but I certainly do not want to be taught by people whose primary interest is publishing or researching.

Yes and no- different institutions have different priorities. Which is why you, the student, should try and match your educational goals with that of the institution's. As a generalization, an institution that spends a lot of advertising time bragging about how much research the undergraduates get to participate in spends not much time at all *educating* the undergraduates.

Nominally, to be awarded tenure (in *any* Department, anywhere), the candidate must demonstrate "excellence in three areas: research, teaching, and service". Service (that is, serving on academic committees) usually counts the least. Of the other two, the emphasis is typically split 60/40, and each institution will decide which is more important. It's important to note that tenure is awarded by the Board of trustees, not the Department, College, or Provost.

Certainly there are abuses of the system (course releases), and I've seen lots of Junior faculty get off to a bad start due to bad advice/pressure from the Chair.
 
  • #34


Andy Resnick said:
Yes and no- different institutions have different priorities. Which is why you, the student, should try and match your educational goals with that of the institution's. As a generalization, an institution that spends a lot of advertising time bragging about how much research the undergraduates get to participate in spends not much time at all *educating* the undergraduates.

Nominally, to be awarded tenure (in *any* Department, anywhere), the candidate must demonstrate "excellence in three areas: research, teaching, and service". Service (that is, serving on academic committees) usually counts the least. Of the other two, the emphasis is typically split 60/40, and each institution will decide which is more important. It's important to note that tenure is awarded by the Board of trustees, not the Department, College, or Provost.

Certainly there are abuses of the system (course releases), and I've seen lots of Junior faculty get off to a bad start due to bad advice/pressure from the Chair.

Mr. Resnick, I agree with and understand your points. What bothers me is that research professors (and graduate students) may be required to teach undergraduates. Perhaps it would be more efficient if research professors and graduate students focused strictly on research, while a new incentive structure were created for professors who focus strictly on teaching.

I do not know exactly how what I mentioned ties into the education "bubble" but I feel it is a waste of money having people, whose main focus is research, teaching undergrads, whose main focus is learning.
 
  • #35


Andy Resnick said:
Ok, fair enough. I mean, I have students in my lab now, and of course I want more.

There are as many different reasons to have students in the lab as there are faculty. But the 'social' reason is very simple to understand:

Q1) What is the research criterion for the award of tenure?
Q2) What objective evidence can you present to justify your answer to Q1?

A1) A growing and self-sustaining research program.
A2) the presence of students, paid for by research grants.

exactly.

Every business owner can understand this; the owner that can build up a business is considered a successful businessman. Same for carpenters- which carpenter will have more 'social' prestige; the one who builds beds by himself, or the one who has a whole team of apprentices cranking out beds?

This is where we differ in opinion: a master carpenter can have a growing business cranking out beds without needing *apprentices*. He can employ lots of people in his business, and he doesn't need to bring them to master status for his business to run well, for him to be socially recognized as a good carpenter, or for him to make a good living.
Maybe one day he will indeed take on the task of taking on an apprentice and a journeyman, and maybe this will simply be his son or nephew or so, to prepare him to take over the business.

BTW, this is anecdotal, but I happen to know some master carpenters who still work according to the guild principle (in France they are known as "Les Compagnons"). They are pretty reluctant to take on an apprentice, because their reasoning is as follows: either the guy isn't really good, and I'm not interested in having a mediocre apprentice; or the guy IS really good, and then he won't stay, because after a while he will want to work on his own rather than for a boss, and then I've set up a competitor! So usually the only apprentices taken on come through family or friends. There is no "eagerness" for apprentices.

As such, the set of "masters" does not crank out many more masters in the next generation, and the system is self-sustaining. In academia, far more PhDs are awarded than can ever hope to find academic jobs, and the reason for this is exactly what I said: cranking out PhDs is often part of what counts as success in an academic job.

The point is, there is not much difference between any professional training program- they use similar training models, and have similar criteria for identifying success. All faculty start out the same- with no students.

?
 
Last edited:
