Perhaps I am missing something obvious, but here it is anyway. Observing an astronomically distant object is essentially observing a past state of that object, because of the finite speed of light. And IIRC, the Hubble constant says that all sufficiently distant objects recede from the observer (any observer, supposedly anywhere in the universe) at roughly 70 km/s per megaparsec of distance, i.e., about 70 km/s per ~3.26 million light-years. So if we look at a galaxy 3.26 million light-years distant, we see it receding at about 70 km/s. But what we are really seeing is that it *was* receding at 70 km/s some 3.26 million years ago. If we see a galaxy many times that distance receding many times faster, we are (of course) really seeing it much earlier in time.

Here is my problem: although Hubble's constant is a dv/ds value, acceleration is not defined as dv/ds; it is defined as dv/dt. If the observed recession velocities of galaxies are plotted as a function of time rather than distance, the galaxies were decreasing their recession velocities as time progressed. What does that say about the expansion of the universe "accelerating"?
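To make the point concrete, here is a minimal numeric sketch of the reasoning above, assuming the naive Hubble law v = H0 · d and taking the light-travel time as simply t = d / c (both are simplifications, and the round value H0 ≈ 70 km/s/Mpc is an assumption, not a measured figure):

```python
H0 = 70.0            # assumed round value, km/s per megaparsec
MPC_IN_LY = 3.26e6   # light-years per megaparsec (approximate)

for d_mpc in (1, 10, 100, 1000):
    v = H0 * d_mpc                    # apparent recession velocity, km/s
    lookback_yr = d_mpc * MPC_IN_LY   # naive light-travel time: years ago we see the galaxy
    print(f"d = {d_mpc:4d} Mpc   v = {v:8.0f} km/s   seen {lookback_yr:.2e} yr ago")
```

Reading the table from bottom to top (i.e., from the earliest epoch toward the present), the apparent recession velocity falls, which is exactly the dv/dt-looks-negative puzzle posed above.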