Here is what I was able to glean from various internet sources:
Our numeral notation is, of course, Arabic in origin, and many of its conventions can be traced back to the usage of Arabic mathematicians.
The rational number $99\frac{9}{10}$ would often be notated like so:
99 9
or:
99|9.
Over time, the vertical stroke became shortened, and it was typical for the fledgling typesetting industry to substitute a comma or period in its place, there being no comparable typographical character in most European alphabets.
Apparently, France and Italy were already using the "full-stop" (period) to delimit Roman numerals in such things as pagination, chapter headings and the like, and so adopted the comma as the numerical decimal delimiter. This eventually became standard across most of Europe, except in the British Empire, which was already using commas to group large integers in "places of 3", such as 123,500.
To avoid confusion with actual periods in written (and typeset) work, the interpunct ("mid-dot") became common in the British Empire; it was especially useful in handwritten ledger work, where a decimal point written as a period could be obscured by the rulings of the ledger.
Unfortunately for us, the interpunct (for multiplication) seems to have originated largely with Leibniz, who wrote to Johann Bernoulli in 1698: "I do not like $\times$ as a symbol for multiplication, as it is easily confounded with $x$...". The notation was later championed by his friend Christian Wolff, who was regarded by many as the leading philosopher of his day (and is now regarded, somewhat uncharitably, as a Leibniz "follower"), and it became widespread as an "abstract" representation of multiplication (now seen in vestigial form in the "dot product" on vector spaces, and in place of the asterisk in some older abstract algebra books).
To see examples of the interpunct being used in American texts, one has to go back to at least the 1950s... by the time I first encountered "decimals" (around 1969 or so), that usage was already considered "archaic" (it was not used in any of the textbooks I ever had, although I did see it in some engineering books my grandfather owned).
So, anyway, "proper" usage and custom nowadays is not entirely "convergent": many countries use periods where Americans would use commas, and vice versa. Thus the rational number used as an example above, would be in official SI terms:
99,9
(Apparently France and England, in particular, have made it a point of honor to "do things differently", especially in mathematics... the contention over "the founder of Calculus" no doubt being one of the many reasons for this centuries-old feud. This can be seen most strikingly in Canada, where the English-speaking provinces use the period as a decimal point and Quebec uses the comma.)
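For the programmatically inclined: this locale dependence is baked into most software stacks, so the "same" number prints differently depending on where you ask. Here is a minimal sketch using Java's standard `NumberFormat` (the class name `DecimalSeparators` is mine, for illustration; the exact output assumes the JDK's bundled locale data):

```java
import java.text.NumberFormat;
import java.util.Locale;

// Illustrative sketch: one value, four locales, two decimal separators.
public class DecimalSeparators {
    public static void main(String[] args) {
        double x = 99.9;
        Locale[] locales = {
            Locale.US,            // United States: period as decimal point
            Locale.FRANCE,        // France: comma, per continental usage
            Locale.CANADA,        // English-speaking Canada: period
            Locale.CANADA_FRENCH  // Quebec: comma
        };
        for (Locale loc : locales) {
            NumberFormat nf = NumberFormat.getNumberInstance(loc);
            System.out.println(loc + " -> " + nf.format(x));
        }
    }
}
```

Running it prints `99.9` for en_US and en_CA but `99,9` for fr_FR and fr_CA, mirroring the Canadian split described above.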
So... the entire situation is a bit of a muddle, and no doubt confusing to Americans who study abroad or read foreign mathematical books, and likewise to Europeans who encounter the same difficulty with American or British oeuvres. What is "correct", therefore, depends on who and where you are.
Americans, of course, having invented the fastest computers and the best rocket ships, like to think their word is final on the matter, and would no doubt be surprised to learn how much of the world disagrees.
Interestingly enough, much of the Arab world, having kicked off this glorious mess, now uses entirely different typography to denote numerical values; in particular, some countries write "0" as a (stylized) dot. The Tower of Babel, the sequel... coming soon to a global theater near you!