ftr said:
I think I have commented on such a question before. Because we believe in science, and science tells us that there is a reason for everything, randomness with no reason seems utterly illogical. Now, if it is indeed that way, people want to know why, that's all.
Well, the strength of science is that it tells us first to be open to learning how nature behaves, and that in investigating this, using an interplay of quantitative, reproducible observations and experiments with analytical and mathematical reasoning, we may find that we have to give up prejudices about what we think we know. Nothing in this process is safe from the need for revision in light of new discoveries.
This happened indeed twice in the first half of the 20th century. One revision became necessary with the discovery that electromagnetic phenomena are accurately described by Maxwell's theory of the electromagnetic field, but that this theory was inconsistent with the "very sure knowledge" about the mathematical model describing spacetime (or, at that time, the strictly separated time and space) in terms of the Galilei-Newton model. This was such a "sacrosanct" piece of knowledge that it took about 50 years to finally resolve the issue, and it involved the work of some of the most brilliant mathematicians and physicists of their time (Poincaré, Lorentz, Michelson, and finally Einstein). What had to be revised was not the special principle of relativity, i.e., the invariance of the physical laws under transformations from one inertial reference frame to another, but the very law of how the transformation had to be done: instead of the Galilei transformations, the Lorentz transformations (discovered in some predecessor form already by Voigt in the late 1880s), implying a new spacetime model in which space and time are no longer strictly separated but amalgamated into a pseudo-Euclidean affine spacetime manifold. This implied that the laws of mechanics also had to undergo a revision, and the corresponding revisions by Einstein (with corrections by Planck, von Laue, and Einstein himself) were, after some experimental confusion, confirmed by experiments with the then newly discovered electrons in gas-discharge tubes.
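To make the contrast concrete (standard textbook formulas, added here for illustration): for a boost with velocity ##v## along the ##x## axis, the two transformation laws read
$$\text{Galilei:} \quad t' = t, \quad x' = x - v t; \qquad \text{Lorentz:} \quad t' = \gamma \left( t - \frac{v x}{c^2} \right), \quad x' = \gamma (x - v t), \quad \gamma = \frac{1}{\sqrt{1 - v^2/c^2}}.$$
For ##v \ll c## one has ##\gamma \to 1##, and the Lorentz transformation reduces to the Galilei transformation, which is why the discrepancy only shows up for phenomena involving speeds comparable to that of light.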
This turned out, however, to be a pretty harmless revision. None of the ideas considered really fundamental, which you still insist on more than 100 years later, had to be revised. The physical laws were still strictly deterministic, i.e., by tacit assumption any possible observable of any system always has a determined value, and the principle of causality holds on the fundamental level in a very strong (time-local) form: Given the initial values of a complete set of observables (which is simply a set of observables of which all other observables are functions) at some initial time ##t_0##, their values (and thus those of all possible observables, which are functions thereof) are in principle determined at any later time ##t##.
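In classical mechanics this strong form of causality can be stated concretely with Hamilton's equations (a standard illustration, not part of the historical argument): for a Hamiltonian ##H(q,p)##,
$$\dot{q} = \frac{\partial H}{\partial p}, \qquad \dot{p} = -\frac{\partial H}{\partial q},$$
so that the initial data ##(q(t_0), p(t_0))## uniquely determine ##(q(t), p(t))## for all later times (assuming the usual smoothness conditions for existence and uniqueness of the solution).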
Now this apparently safe knowledge had to be revised with the advent of problems concerning phenomena that became observable with the ever faster progress of technology. At first these obstacles were considered minor issues. When Planck asked for advice on what to study, a famous physics professor at the university told him that, with his brilliant grades from high school, he shouldn't waste his time with physics, because it was all settled, and the "little clouds" on the horizon would simply be resolved by measuring some fundamental constants to ever better precision and by small revisions of the laws of classical physics (at the time consisting of classical (still Newtonian) mechanics, Maxwell electrodynamics, and (phenomenological) thermodynamics).
One of the clouds was not so new to begin with: It was the question of the absolute value of entropy and a theoretical understanding of what's now known as the Gibbs paradox in statistical physics, which, however, was already met with quite some skepticism by the more conservative physicists of the time, since they didn't even believe in the atomistic structure of matter to begin with. With the advent of low-temperature physics (one milestone being Kamerlingh Onnes's achievement of liquefying helium in the early 1900s) the issue became very evident: The specific heat of most substances did not behave as expected at low temperatures. And when it became clear that metals have conduction electrons, but that these don't contribute to the specific heat even at room temperature, another "cloud" arose on the horizon.
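To quantify the expectation that failed (standard classical results, added for illustration): classical equipartition predicts for a solid of ##N## atoms and ##N_e## conduction electrons the temperature-independent contributions
$$C_V^{\text{lattice}} = 3 N k_B \quad \text{(Dulong-Petit)}, \qquad C_V^{\text{electrons}} = \frac{3}{2} N_e k_B \quad \text{(classical electron gas)}.$$
Experimentally, however, ##C_V \to 0## as ##T \to 0##, and the electronic contribution is absent even at room temperature.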
Then there was the problem of black-body radiation, which was not describable with classical electrodynamics and thermodynamics/statistical physics. Famously, this is how the quantum revolution started: Rubens and Kurlbaum accurately measured the black-body radiation spectrum as a function of temperature (originally an attempt to provide a better standard for measuring the efficiency of lighting sources, gas lamps and the new electric light bulbs), which led Planck to guess the right law and subsequently to a brilliant (but at first not really understood) statistical analysis, which only worked when assuming that each frequency mode of the electromagnetic field could exchange energy only in lumps of ##h \nu=\hbar \omega##. Already this was too much for Planck, but he couldn't find any other way to derive the accurate black-body law, now named after him.
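The law Planck guessed (quoted here in its standard modern form) gives the spectral energy density of the radiation in thermal equilibrium at temperature ##T##:
$$u(\nu, T) = \frac{8 \pi h \nu^3}{c^3} \, \frac{1}{e^{h \nu / k_B T} - 1}.$$
For ##h\nu \ll k_B T## this reduces to the classical Rayleigh-Jeans law ##u \simeq 8\pi \nu^2 k_B T / c^3##, which diverges when integrated over all frequencies; the quantum of energy ##h\nu## is what tames this "ultraviolet catastrophe".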
The rest of the story is well known. Einstein came up with his light-quanta idea in 1905; then Bohr (completed by Sommerfeld) with his ad hoc idea to explain the hydrogen spectrum from the atomic model forced on the physics community by Rutherford's findings. By then the very stability of matter and the precise indistinguishability of atoms of each kind had become an enigma for classical physics. The ad hoc solution was then found to be flawed, because it only worked for hydrogen. By another happy accident it also works for the harmonic oscillator and thus for lattice vibrations of solids, which resolved the problem with the specific heat, but not yet the question why the conduction electrons in metals contribute nothing to it, while the model describing the conduction electrons as a quasi-free gas moving in a positively charged background worked very well in explaining the Wiedemann-Franz law of proportionality between electric and heat conductivity.
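Two of the results mentioned, in their standard textbook form (added for concreteness): Bohr's energy levels for hydrogen and the Wiedemann-Franz law,
$$E_n = -\frac{E_{\text{Ry}}}{n^2}, \quad E_{\text{Ry}} \approx 13.6 \,\text{eV}, \qquad \frac{\kappa}{\sigma} = L T, \quad L = \frac{\pi^2}{3} \left( \frac{k_B}{e} \right)^2,$$
with the spectral lines following from ##h\nu = E_n - E_m##. (The quoted Lorenz number ##L## is the later Sommerfeld value; the classical Drude model reproduced the proportionality ##\kappa/\sigma \propto T## with a numerically similar coefficient, which is why it looked so convincing.)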
The to-date "final" resolution was modern quantum mechanics, discovered more or less independently, in parallel, no fewer than three times: by (a) Heisenberg, Born, and Jordan (with some important help from Pauli) in terms of "matrix mechanics", (b) Schrödinger in terms of "wave mechanics", and (c) Dirac in terms of "transformation theory". Very early on it was clear that these are just different mathematical expressions of the (so far one and only) full modern quantum theory. Even Jordan's idea not only to "quantize" the mechanics of particles but also the electromagnetic field (an idea that had to be rediscovered a few years later by Dirac, since it was at first abandoned as "overdoing the quantum revolution somewhat") turned out to be necessary to get the correct kinetic explanation of the Planck law à la Einstein (1917), with the necessity to have not only absorption and induced emission but also spontaneous emission of "light quanta" within the new theory, and up to today nobody has come up with anything better.
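Einstein's 1917 kinetic argument, sketched in its standard form (not spelled out above): for two levels with ##E_2 - E_1 = h\nu##, balancing absorption against induced and spontaneous emission in thermal equilibrium,
$$N_1 B_{12} \, u(\nu) = N_2 \left[ A_{21} + B_{21} \, u(\nu) \right], \qquad \frac{N_2}{N_1} = e^{-h\nu/k_B T},$$
yields the Planck law precisely when ##B_{12} = B_{21}## and ##A_{21}/B_{21} = 8\pi h \nu^3 / c^3##. Without the spontaneous term ##A_{21}## the balance condition cannot hold at all temperatures, since it would force the temperature-independent ratio ##B_{12}/B_{21}## to equal the Boltzmann factor.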
Then, indeed, there was a genuinely new issue, namely that of "interpretation", and this was solved (at least in my opinion and that of most physicists) also very early, by Born, in a footnote of his paper on the important wave-mechanical treatment of particle scattering: Schrödinger's wave function had to be interpreted probabilistically, i.e., not as a classical field describing a single electron, but as a "probability amplitude" for finding the electron at a given position.
The theory thus turned out to be perfectly causal, i.e., the quantum states, described by wave functions (and more generally by statistical operators), evolve according to a causal law (e.g., as given by the Schrödinger equation for the wave function), but the meaning of this state description is completely probabilistic: observables like position and momentum (and others like energy and angular momentum) are in general not determined; only the probabilities for which value will be measured can be predicted, given a state in terms of some preparation procedure that determines the wave function at some initial time (which implies how it has to look at any later time, by solving the Schrödinger equation).
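In formulas (the standard non-relativistic statement, added for concreteness): the state evolves causally according to
$$i \hbar \, \partial_t \psi(\vec{x}, t) = \left[ -\frac{\hbar^2}{2m} \Delta + V(\vec{x}) \right] \psi(\vec{x}, t),$$
while its physical meaning is probabilistic via the Born rule: ##|\psi(\vec{x}, t)|^2 \, \mathrm{d}^3 x## is the probability to find the particle in the volume element ##\mathrm{d}^3 x## around ##\vec{x}## at time ##t##.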
In my opinion, after all the decades of hard tests of this conjecture of "irreducible randomness" in the behavior of nature, including some of the most "weird-looking" implications (entanglement), it looks as if nature is indeed "random/indeterministic" on a fundamental level.