One way to ground everything in reality is to think purely about the records of experiments that are stored in computer memory. Very often, that's a list of times at which events happened. If you think about APDs (avalanche photodiodes), for such devices we might run a wire (or a fiber optic cable, or wi-fi, ...) from the APD to the computer that records the data. On that wire there will be a voltage that is near zero most of the time, but occasionally an "avalanche" happens and the voltage jumps to something non-zero (1 volt, 20 volts, whatever); the hardware then checks a clock, records the time in memory and later to hard disk for analysis, and resets the APD as soon as possible so another avalanche can happen. From a computing and signal analysis point of view, what has just happened is a compression: we could have recorded the voltage picosecond by picosecond to 14-bit accuracy, but instead we just recorded the times at which the signal made a transition from zero to not-zero.
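Here's a minimal sketch of that compression step, just to make it concrete; the numbers (the 0.5 volt threshold, the sampling rate, the toy trace) are made up for illustration, not taken from any real detector:

```python
import numpy as np

def extract_event_times(voltage, sample_period, threshold=0.5):
    """Compress a sampled voltage trace into a list of event times.

    voltage       : 1-D array of voltage samples (volts)
    sample_period : time between samples (seconds)
    threshold     : voltage above which we call it an "avalanche";
                    the 0.5 V default is just a placeholder
    """
    above = voltage > threshold
    # An "event" is a sample where we cross from below to above threshold.
    crossings = np.flatnonzero(above[1:] & ~above[:-1]) + 1
    return crossings * sample_period

# Toy example: 1 ns sampling, mostly near-zero noise with two avalanches.
rng = np.random.default_rng(0)
trace = 0.01 * rng.standard_normal(1000)
trace[200:210] += 5.0   # pretend avalanche
trace[700:710] += 5.0   # pretend avalanche
print(extract_event_times(trace, 1e-9))   # roughly [2.0e-7, 7.0e-7]
```

The point of the sketch is only that a thousand voltage samples have been replaced by two numbers: the event times are all we keep.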
There are certainly experiments that record continuous signals (at finite accuracy and resolution, on a fixed sampling schedule, because it's all going into digital memory), but the analyses that you'll find detailed in physics papers are often on the hunt for discrete structure of some kind, and very often a discrete structure is there to be found.
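One very common first pass at that hunt, once you have a list of event times, is to histogram the intervals between successive events. Here's a rough sketch of that; the bin choices are arbitrary and only for illustration:

```python
import numpy as np

def interarrival_histogram(event_times, bin_width, n_bins):
    """Histogram the gaps between successive event times.

    For a memoryless (Poisson) source the histogram falls off
    exponentially; dips or bumps hint at discrete structure, such as
    detector dead time or antibunching.
    """
    gaps = np.diff(np.sort(event_times))
    edges = np.arange(n_bins + 1) * bin_width
    counts, _ = np.histogram(gaps, bins=edges)
    return counts, edges
```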
Everything so far has been classical electronics about events and signals (except the last sentence, which presaged what comes next). There is no mention of particles or of particle properties whatsoever. Now comes the analysis, where we introduce the idea that particles (or, more generally, "systems": a field, or a thing or things that are kinda classical) explain why we see the events and signals that we see. The Correspondence Principle gives us a way, called quantization, to convert a classical dynamics for some kind of classical system (mechanics or electromagnetism) into a differential equation, the Schrödinger equation, that describes the evolution over time of a "statevector". The statevector models/predicts the statistics of many different kinds of measurement results (anything that can come out of a mathematical analysis of the raw data of the previous paragraph), and, crucially, how those statistics change over time. Some of those measurement results are "incompatible" with each other, so that properly speaking we can't talk about correlations between incompatible measurements.
The Correspondence Principle is quite tricky because it cannot be a perfect map from a classical dynamics to a quantum dynamics, but it's been a fairly decent guide for the last 90 years, so we're not going to give it up until we have something better. If we find that the quantization of a classical mechanics works well as a model for the signal analysis we do on the raw data, and keeps working as the statistics change over time, we pretty much say that the quantized classical system explains the raw data, except of course that we don't understand what we're doing when we quantize as well as we'd like to.
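For concreteness, the most familiar instance of that quantization recipe (a single particle in one dimension, nothing to do with APDs in particular) looks like this:

```latex
% Canonical quantization: replace x and p by operators
% satisfying [\hat x, \hat p] = i\hbar
H(x,p) = \frac{p^2}{2m} + V(x)
\;\longrightarrow\;
\hat H = -\frac{\hbar^2}{2m}\frac{\partial^2}{\partial x^2} + V(\hat x)
```

The trickiness mentioned above shows up as soon as the classical expression mixes x's and p's, because the order of the corresponding operators then matters.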
So, @Carpe Physicum, it looks as if you might have left this conversation. If you're still here, I hope you find this a little useful, even though it's definitely my idiosyncratic way of thinking about the question. I've tuned the above a little to the computing world because that seems to be your sort of thing; that wasn't hard to do, because it's also pretty close to my sort of thing. If you reply, I can point you to the first video on my YouTube channel. I'm trying to figure out whether you really mean Carpe, or perhaps there's a little Carping in your question?