Right you are!!!! The representations become infinite; I'm glad I said AT LEAST 9, since I didn't even consider fractions with common factors. These other representations certainly make reducing fractions even more of a nightmare.
Depends on which people you look at and what they were working on. One could say the most famous problem in math history is getting better decimal approximations to pi, so decimal expansions are ancient. The decimal expansion of a rational is easy, so once you have that method you can concentrate on other things, like getting decimal approximations to irrational numbers by extracting roots, which the Babylonians tried. Archimedes used two regular 96-sided polygons, one inscribed and one circumscribed, and got pi accurate to 3.14. There's also the famous Chinese approximation using the well-known fraction to get pi to 6 digits. How did they know one fraction is a better approximation than another, if they did not get the decimal expansion of both fractions and compare against the KNOWN value of the decimal expansion of pi in their time?
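And that long-division method really is easy to mechanize. Here's a minimal Python sketch (the function name is mine) that expands a rational by repeated division and compares two famous rational approximations of pi, 22/7 and 355/113 (the latter presumably being the Chinese fraction in question, Zu Chongzhi's milü), against a modern value:

```python
import math

def decimal_expansion(num, den, digits):
    """Long division: first `digits` decimal digits of num/den as a string."""
    whole, rem = divmod(num, den)
    out = [str(whole), "."]
    for _ in range(digits):
        rem *= 10
        d, rem = divmod(rem, den)
        out.append(str(d))
    return "".join(out)

# Two historical rational approximations to pi
for num, den in [(22, 7), (355, 113)]:
    approx = decimal_expansion(num, den, 8)
    err = abs(num / den - math.pi)
    print(f"{num}/{den} = {approx}...  |error| ~ {err:.1e}")
```

Running it shows 355/113 = 3.14159292..., agreeing with pi = 3.14159265... through six decimal digits, while 22/7 = 3.14285714... already diverges in the third, which is exactly the comparison-of-expansions argument above.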
I am only giving a few examples, but I am aware of hundreds more cases where decimal expansions and decimal approximations have been very important throughout history, so I don't understand your claim that decimal representations are a more recent phenomenon.
Personally I see it as suspect, but I don't have a problem if you are not suspicious of an infinite number of representations for the same UNIQUE value. I can see it your way... all those representations are proved equal. However...