russ_watters said:
How can there be "progress" if there is nothing left to learn?
You are jumping to completely new questions every time. Let's look at the post history:
I made a post demonstrating that there is more progress than ever before
"But that's progress, not what is left to learn"
I made another post pointing out that, indeed, this previous post was about progress of science.
And how exactly does that lead to your new question, which starts with a wrong premise anyway? The fact that there is progress clearly shows that there is something to learn.
russ_watters said:
Yes, really. It's absurd to suggest all funding agencies and people setting their budgets would be completely stupid. Especially as we clearly see the results of their funding.
russ_watters said:
Technological progress may be unlocked by certain discoveries, but it can take years, decades or centuries for such technologies to mature and fully develop
[...]
If the pace of advancement is accelerating, there should be identifiable technologies invented decades ago that should profoundly change the world in the next few years. What are they?
There is no minimum waiting period. A new revolutionary technology next year that uses an effect discovered last year wouldn't be any evidence of science slowing down. It's the opposite, in fact. Sudden profound changes of the world are very rare and hard to predict. If you are fine with slower changes, they are all around us, especially in the technology sector.
russ_watters said:
The Laws of the Universe exist. E.G., eventually we will learn how the universe actually works, and after that, there will be nothing else for Science to do but add decimal places to measurements.
Knowing the fundamental laws of the universe does not mean knowing everything. We know all the fundamental physics that's relevant in, say, a brain, but that doesn't mean we fully understand the brain; we have barely started to understand how it works. We also know all the fundamental physics that's relevant in high-temperature superconductors, but we don't know why they actually superconduct. We can't even predict the mass spectrum of all the hadrons we know, relatively simple composite particles following laws we know very well. And so on.
Even if someone magically handed us the fundamental laws of the universe tomorrow, we wouldn't be at the end of science. Most scientists wouldn't even see an impact on their work. Some particle physicists would need new jobs after verifying the laws (assuming we can, with our hardware); others would continue studying hadrons just like before.
russ_watters said:
Are you literally saying you believe that The Laws of the Universe are infinite in extent and as a result we'll never learn them all, no matter how much we learn?
I never said anything like that, neither literally nor figuratively. I don't know where you get that from.
I said we are far away from any limits of the total amount of knowledge we can collect.
russ_watters said:
If there is an end, then predictions of when it will occur won't necessarily be correct, but predictions that we are getting closer to it are always true.
Trivially, yes, but what is the point? That doesn't make a daily prediction of "we are close to it, this time for real!" useful.
We'll notice when we are getting anywhere close to such limits: more and more research would stop producing new open questions that can be addressed. Ultimately the limit might come not from what can be studied in principle, but from what can be studied within the capacity of a human brain, unless computers expand that capacity. Anyway, that's beside the point: we don't see any indication of such a limit even for humans.
russ_watters said:
You're being disingenuous here. It means, exactly, that THIS aspect of science was declared to be complete in 1983. Prior to 1983 it was improving asymptotically toward zero error and at 1983 it was declared to be complete and perfect/exact.
Nothing was discovered about the speed of light in 1983; we didn't learn anything new about it. We just chose to define the meter via the distance light travels in a fixed fraction of a second, instead of via a transition wavelength of a krypton isotope. Since the second is itself defined by a hyperfine transition of a cesium isotope, this choice gives the speed of light an exact value by definition.
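The logic of the 1983 redefinition can be sketched as a toy calculation (the two constants below are the actual SI defining values; the function name is just for illustration):

```python
# SI defining constants: exact by definition, not measured quantities.
CS133_HYPERFINE_HZ = 9_192_631_770   # cesium-133 hyperfine frequency: defines the second
C_M_PER_S = 299_792_458              # speed of light in m/s: defines the meter (since 1983)

def light_travel_distance_m(seconds: float) -> float:
    """Distance (in meters) light travels in vacuum in the given time.

    Exact as far as c is concerned, because c is a defined constant,
    not a measured one."""
    return C_M_PER_S * seconds

# Light covers exactly 599,584,916 m in two seconds; no uncertainty enters from c:
print(light_travel_distance_m(2.0))  # 599584916.0
```

Measuring c "more precisely" is meaningless under this definition; any improved measurement instead refines how well we can realize the meter.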
russ_watters said:
It wasn't an arbitrary decision to define it to be exact
It was exactly that: an arbitrary decision. It had good motivations, of course, but ultimately the choice was arbitrary. We could also have switched the definition of the second to the krypton isotope, leading to a different value for the speed of light, or just kept the 1960 definition. It's all arbitrary. In the future we might switch the definition to optical clocks; we'll almost certainly keep the numerical value the same for convenience, but we wouldn't have to.
russ_watters said:
Again, you seem to be putting your emphasis behind the effort, not the result.
Overcoming that effort is the proof of progress. You need to learn more, and at a faster rate, to keep up with the trend, and we do exactly that.
russ_watters said:
But I can't help but wonder if you have that existential question poking the back of your brain that the importance of the next order of magnitude of precision isn't defined by its difficulty but its utility.
GPS?
NMR?
MEG?
MFI?
Time-of-flight PET? Gravimeters to search for underground resources? All the result of improving the precision of something by many orders of magnitude until it became useful. And that list is far from complete.
Don't worry about me: I see the results of research everywhere.