Forgetting? I think not. If I recall, Schrödinger, on his claim, was working through his confusion about QM re: Born, Heisenberg, Bohr, et al. What is important is that we know a lot more about everything than our QM founding fathers did -- and I'm thinking of statistics, probability, and neuroscience in particular. Goodness, any part of physics that deals with vectors, finite or infinite in dimension, involves superposition -- so we started our physics life with superposition in high school or freshman physics: rowing boats across rivers with currents, circularly polarized light, ...
In regard to probability and statistics, prior to the advent of today's powerful computers, we were severely constrained in our ability to do anything but the most modest-sized problems. Until 10-15 years ago, statistics and probability were (necessarily) quite theoretical, and quite hard to understand. And, of course, that situation was all the more pronounced in the 1920s and 1930s -- so to speak, they didn't know from any of this.
During the 1970s and 1980s, folks like me, Ph.D.s in quantitative fields, dominated the high end of business consulting -- we used old mainframes and, my favorite, VAXes, and so on. Now MBAs and BAs can do much of what my colleagues and I used to do, simply as a result of amazing computing power and remarkable software, often now called Data Mining software, put out by outfits like COGNOS (now IBM), Microsoft, ... (My colleagues and I have been, to use a phrase I don't like much, outmoded.) Schwinger complained that Feynman had brought field theory to the masses. Us 65-and-older Ph.D.s complain that computers and software have brought statistical analysis to the masses. (I'm getting there, and please excuse my sloppy grammar: "Us ...")
I find that many younger analysts, some trained in physics, have virtually no issues with probability or statistical interpretation -- that's just stuff you do. Think, for example, about Newton's day: calculus was new, controversial to some, and set off a long examination of the idea of infinitesimals -- some still worry about these matters. But most of us just go ahead and do it. As for interpretive history, many notions have met strong resistance from some quarter: calculus, transfinite numbers, the standard working-physicist's interpretation of QM. The historical trend is very clear: resistance to successful new ideas diminishes slowly, but it does indeed diminish.
Those of us who have worked with probability and statistics on practical problems do not have any interpretive problems -- a factor analysis is a factor analysis, even, God forbid, if we don't have a normal distribution; regression is regression; Type I and Type II errors are just that. When you use this stuff (I'll find a better word someday) you get familiar with it, and, to put it bluntly, you get past the beginner's stage fairly quickly. And, with all due respect, many physicists are beginners -- they may know theory, measures, convergence in the mean, ... -- but they typically have no experience with the great realm of statistical techniques.
As we know as teachers and students, homework is essential.
When you have done a hundred regression models, hundreds of survey analyses, a hundred sales forecasts, ... you pick up a very practical approach to what you are doing -- which includes writing and delivering reports that must be intelligible to managers who know little or nothing about statistics but know a lot about their business. My claim, simply, is this: if theoreticians spent a couple of years doing basic statistical work for businesses or other organizations, there would be virtually no controversy over the interpretation of QM. Again, with all due respect, the physics community is largely amateurish in its use of, and thinking about, probability and statistics -- I'm far from the only one who has said this.
You have to do it to understand it.
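In that spirit, here is the sort of bread-and-butter calculation I mean -- an ordinary least-squares fit of a straight line, sketched in Python with the textbook closed-form formulas. This is purely my illustration (the function name and data are made up), but it shows how routine "regression is regression" really is:

```python
# Minimal sketch: fit y = intercept + slope * x by ordinary least squares,
# using the closed-form formulas from any introductory statistics text.

def ols_fit(xs, ys):
    """Return (intercept, slope) minimizing the sum of squared residuals."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Sxy / Sxx gives the slope; the intercept follows from the means.
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sxx = sum((x - mean_x) ** 2 for x in xs)
    slope = sxy / sxx
    intercept = mean_y - slope * mean_x
    return intercept, slope

# A noiseless line y = 2 + 3x recovers its own coefficients.
a, b = ols_fit([0, 1, 2, 3], [2, 5, 8, 11])
```

No interpretive agonizing required: you fit the line, check the residuals, and write the report.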
There's something us old guys call the "JCL trick." JCL, IBM's Job Control Language, was the way you got your programs running on mainframes. JCL was job protection for systems programmers; JCL was tricky, went on forever, and was hard to learn. Thus the first thing to do at a new job was to ask, "Who's the JCL guy?" IBM and the programmers made a Faustian bargain to keep things purposefully complicated.
Sometimes I think the JCL trick is, in effect, a major reason why the QM interpretive controversy continues. That is, job protection. If the controversy is resolved, that's one less area in which to work. Plus, for some, the controversy is fun, great entertainment, and goes on forever. Otherwise, I cannot understand why the physics community, for the most part, makes the interpretation of QM so complicated. During my grad-student days, my professors just hammered on simplicity: don't crack a hard-boiled egg with a pile driver (Goldstein, Classical Mechanics). In my view, the simple and correct way to interpret QM is as a theory about probability, and to approach its interpretation as if you were, say, handicapping a horse race, choosing stocks, or wondering when your teenager will get home on Friday night.
A bet I'll not live to win: in 20 or 30 years, the debate about the interpretation of QM will be a thing of the past, and my side will have won. History bears me out: anybody know anyone who's into prime movers, or the divine right of kings, ...?
Throughout history, pragmatism ultimately prevails. (My reference here is virtually any history book, and particularly Daniel Boorstin's magnificent The Discoverers, a history of science, technology, and exploration, covering prehistory to the present, beautifully written, nicely illustrated, and, in a subtle way, very profound. You can see some of why I think the way I do in this book.)
My response went out of control again -- but then, nobody really needs to read this.
One of my favorite examples of a changing intellectual framework is just starting to happen. That is, physicists like the brilliant Roger Penrose just love to make the workings of human consciousness exceedingly complex, invoking quantum gravity, and ... (Job protection?)
Then there are physicists like Francis Crick of DNA fame who think quite the opposite. (See The Astonishing Hypothesis.) He takes the view that consciousness is simply an aggregate of brain function. That is, as neuroscientists work out the operation of the brain, we will understand consciousness as a consequence of the neural processing of perceptual signals from both outside and inside. (I like the idea of virtual homunculi as an aid, a metaphor for how we see or hear or think. I believe this is similar to work that Marvin Minsky published some years ago in his magnum opus on the mind, The Society of Mind.)