When I was at school,
mnyah mnyah, hand calculators had not been invented, and we were taught how to multiply and divide numbers more conveniently using logarithms to base 10, so that instead of multiplications and divisions you did notably easier additions/subtractions of the logs - all explained here
https://en.wikipedia.org/wiki/Common_logarithm#Mantissa_and_characteristic
as maybe nowadays this is not taught? To do this you needed to get the logs, which you found in tables; an example is illustrated in the above link. (Then, after adding the logs, you needed to convert the resulting log back into the number it was the log of. It comes back to me that although there were 'antilog' tables, for some reason we did not use them but looked the results up in the same log tables, using a trick called 'interpolation'.) How anyone had been able to calculate the tables was not on the syllabus, and we did not ask.
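Out of curiosity, here is a minimal sketch of the whole procedure in Python. Everything in it is my own invention for illustration - the function names, and the tiny 4-figure 'table', which I fake with math.log10 (a real printed table was of course computed quite differently). The point is just the workflow: look up the two logs, add them, then turn the sum back into a number by searching the same log table with linear interpolation, the trick we used instead of antilog tables.

```python
import math

# Fake 4-figure log table: the mantissas of 1.00, 1.01, ..., 9.99.
# (A real printed table was computed very differently; this is just a prop.)
TABLE = [(n / 100, round(math.log10(n / 100), 4)) for n in range(100, 1000)]

def table_log(x):
    """Look up the 4-figure log of x, for 1.00 <= x < 10.00."""
    return TABLE[int(round(x * 100)) - 100][1]

def antilog_by_interpolation(m):
    """Invert a mantissa using the log table itself: find the two
    entries bracketing m and interpolate linearly between them -
    the trick we were taught instead of separate antilog tables."""
    for (x0, l0), (x1, l1) in zip(TABLE, TABLE[1:]):
        if l0 <= m <= l1:
            if l1 == l0:                 # flat step at 4-figure precision
                return x0
            return x0 + (x1 - x0) * (m - l0) / (l1 - l0)
    return TABLE[-1][0]

# Multiply 2.34 by 3.45 the old way: add the logs, invert the sum.
s = table_log(2.34) + table_log(3.45)    # 0.3692 + 0.5378 = 0.9070
print(antilog_by_interpolation(s))       # ~8.072; the true product is 8.073
```

The slight error in the last digit is genuine to the method: that is what 4-figure tables gave you, and why serious work used 7-figure ones.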
Now, to tabulate a function finitely for any possible input it needs to have some kind of repetitive character. For example, tables of trigonometric functions only need to go up to 90 degrees, and tables of square roots, which we also had, need only go from 1 to 100. So, for instance, log₁₀ 2 = 0.3010 to 4 decimal places. And that's all the table tells you. However if you need e.g. log₁₀(2×10⁴), knowing what logs are you know that is 4.3010. If the number is less than 1 the log is negative, e.g. log₁₀(2×10⁻⁴) is -4 + 0.3010. But you didn't combine these into a single negative number; it was unnecessary. In a multiplication, for instance, you just added the various logs algebraically: you added up the positive parts after the decimal point, carrying over as in normal addition, and then added algebraically the integers before the decimal point.
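For anyone who never met this bookkeeping, here is a small Python sketch of it - my own reconstruction, with invented helper names, not anything historical. The mantissa is always kept positive and the characteristic carries the sign, which is why a single table covering 1 to 10 sufficed for every number.

```python
import math

def char_and_mantissa(x):
    """Split log10(x) into the integer characteristic and a
    mantissa that is always positive (in [0, 1))."""
    lg = math.log10(x)
    c = math.floor(lg)           # characteristic: may be negative
    return c, lg - c             # mantissa: always positive

print(char_and_mantissa(2e4))    # (4, 0.3010...)   i.e.  4.3010
print(char_and_mantissa(2e-4))   # (-4, 0.3010...)  i.e. -4 + 0.3010

# A multiplication the schoolroom way: add the mantissas, carrying any
# whole unit into the characteristic, then add the characteristics
# algebraically.
c1, m1 = char_and_mantissa(2e4)
c2, m2 = char_and_mantissa(2e-4)
carry, m = divmod(m1 + m2, 1.0)  # carry a 1 if the mantissas sum past 1
c = c1 + c2 + int(carry)
print(10 ** (c + m))             # ~4.0, and indeed 2e4 * 2e-4 = 4
```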
I just went into this because I guess it is not taught nowadays, no longer really being useful, so maybe readers don't know it. But there was a question about it in homework help yesterday, "Logarithm calculation by hand", and this stuff, which I have not needed for decades, came back to mind. Well, I learned it not exactly today but a very long time ago, about age 11 or 12 – even a bit depressing to be still talking about it now; doesn't feel like progress. So what's new?
Well, the part after the decimal point, like .3010 above, was called the 'mantissa'. Funny word; I have never heard it used in any other context and I don't suppose any of you have either. It is not obviously connected with anything else – I mean I can't think of anything less connected with all this than a mantis.
So now, TIL that
"decimal part of a logarithm," 1865, from Latin mantisa "a worthless addition, makeweight," perhaps a Gaulish word introduced into Latin via Etruscan (compare Old Irish meit, Welsh maint "size"). So called as being "additional" to the characteristic or integral part. The Latin word was used in 17c. English in the sense of "an addition of small importance to a literary work, etc."
Before seeing that dating I had thought the word might have been coined by Briggs himself (who published the first tables of base-10 logarithms, around 1617). But the OED too gives 1865 for first use! It sounds, then, as if it was invented by some Victorian pedant, in which case its present oblivion is fully deserved.
Surprisingly, the word 'mantis' also seems to be of 17th-century scholarly origin.
Modern Latin, from Greek mantis, used of some sort of elongated insect with long forelimbs (Theocritus), literally "one who divines, a seer, prophet," from mainesthai "be inspired," related to menos "passion, spirit," from PIE *mnyo-, suffixed form of root *men- (1) "to think," with derivatives referring to qualities and states of mind or thought (compare mania and -mancy).
I find it surprising that so remarkable an insect has no older popular name.
That's what I think I learned today – but I'm not really confident it's all 100% correct.