Welcome to PF, Loki180,
There are quite a few things wrong with what you're saying. First of all, Edwin Hubble did not determine that the age of the universe was 14 billion years. It's true that he discovered that it is expanding, and that he got an estimate for the value of the constant that we now call the Hubble constant. However, the age of the universe depends on this constant and on other cosmological parameters in our mathematical models of the universe, and for a long time (most of the 20th century) there was simply too much *uncertainty* in the values of these parameters for us to say definitively what the age of the universe was. We certainly didn't know it to 10% precision or better like we do now, and I don't think it would be an exaggeration to say that people really couldn't tell you whether it was 5 billion years or 20 billion years. In fact, for a long time, it was a problem with the prevailing/favoured cosmological model that it predicted an age of the universe younger than the ages we estimated for the oldest stars we could observe (e.g. stars in globular clusters).

It wasn't until the late 90s and early 2000s, thanks to a number of pioneering ground-based and balloon-borne experiments measuring fluctuations in the Cosmic Microwave Background (CMB) radiation (e.g. BOOMERANG), that we were able to determine the values of these parameters with any precision. These early results ushered in the era of precision cosmology, and were later confirmed to exquisite precision by the WMAP satellite (launched 2001), and now the Planck satellite (launched 2009), along with non-CMB data. Thanks to all of this, we now have the presently-accepted value of 13.7 billion years. Just a decade before WMAP, we did not think we would ever be able to get such precise answers to questions like "what is the age, geometry, and mass-energy content of the universe?"
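To get a feel for why the age depends on the Hubble constant at all: 1/H0 (the "Hubble time") sets the overall timescale, being the age the universe would have if it had always expanded at its present rate. Here's a quick back-of-the-envelope sketch in Python (the constants are just standard unit conversions; the real age also depends on the matter and dark-energy content, which is why those other parameters matter):

[CODE]
# Hubble time 1/H0: the age of a universe that always expanded at its present rate.
KM_PER_MPC = 3.086e19       # kilometres in one megaparsec
SECONDS_PER_YEAR = 3.156e7  # seconds in one year

H0 = 70.0                        # Hubble constant in (km/s)/Mpc
H0_per_second = H0 / KM_PER_MPC  # H0 converted to units of 1/s

hubble_time_years = 1.0 / H0_per_second / SECONDS_PER_YEAR
print(f"Hubble time: {hubble_time_years:.2e} years")  # ~1.4e10, i.e. ~14 billion years
[/CODE]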
Another problem is your conception of distances. 3 megaparsecs (Mpc) is 3 million parsecs, or approximately 10 million light years, which is a tremendously large distance. It is much larger than the size of our Milky Way galaxy, which is only about 100,000 light years, or around 30 kiloparsecs (kpc), across. So it makes absolutely no sense to talk about a star in our galaxy that is 3 Mpc away, because that distance would put the star well outside of our galaxy. Once you get into the Mpc range, you're talking about the scales of galaxy clusters, rather than individual galaxies. Note that stars within our own galaxy are not expanding away from us, because our galaxy is a gravitationally-bound object, so the local effects of gravitation dominate over the expansion. This is true of individual clusters of galaxies as well: the individual galaxies within a galaxy cluster are gravitationally bound to each other and hence do not expand away from each other.
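In case the unit conversions are unfamiliar, here's a quick sanity check in Python (using the approximation 1 parsec ≈ 3.26 light years):

[CODE]
# Comparing the distance scales mentioned above.
LY_PER_PARSEC = 3.26  # light years per parsec (approximate)

milky_way_diameter_ly = 100_000  # ~100,000 light years across
milky_way_diameter_kpc = milky_way_diameter_ly / LY_PER_PARSEC / 1e3

distance_ly = 3 * 1e6 * LY_PER_PARSEC  # 3 Mpc expressed in light years

print(f"Milky Way diameter: ~{milky_way_diameter_kpc:.0f} kpc")  # ~31 kpc
print(f"3 Mpc: ~{distance_ly / 1e6:.1f} million light years")    # ~9.8 million ly
[/CODE]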
However, once you get to distances on the scale of the separations between galaxy clusters (like, maybe 10 Mpc or larger), then you find that these individual clusters are moving away from each other in a manner described by Hubble's law. I'm not going to deal with Imperial units (that would just be silly), but using a fairly standard value for the Hubble constant of 70 (km/s)/Mpc, you'd find that the nearest galaxy clusters, which are of order 10 Mpc away from us, would be receding from us at a speed of [70 (km/s)/Mpc]*[10 Mpc] = 700 km/s.
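In code form, Hubble's law is just a one-liner; here's a minimal sketch using the same round H0 value as above:

[CODE]
# Hubble's law: recession velocity v = H0 * d
H0 = 70.0  # Hubble constant, (km/s)/Mpc

def recession_velocity_km_s(distance_mpc):
    """Recession speed in km/s for a cluster at the given distance in Mpc."""
    return H0 * distance_mpc

print(recession_velocity_km_s(10))   # nearest clusters, ~10 Mpc -> 700.0 km/s
print(recession_velocity_km_s(100))  # ten times farther -> 7000.0 km/s
[/CODE]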