# Problem switching back and forth between the different arithmetics

1. May 23, 2013

### Tyrion101

I do really well if I'm working with one operation, like addition or subtraction, but as we all know, math doesn't always work like that. I will do silly things like 10/2 = 8 or 10*5 = 3, if you can understand what I'm saying. I mix up my tables when I'm dealing with a long problem, and I was wondering if there is any trick to avoiding this kind of mistake when you're doing a long, complicated problem?

2. May 23, 2013

### Staff: Mentor

The best strategy is to identify exactly when you make these mistakes: under which operations and with which numbers.

Are you confused about operator precedence, i.e., that multiplication and division are done before addition and subtraction, and that parentheses take precedence over all the arithmetic operations?

I used to have difficulty with 9 x 6 vs. 8 x 7, but by reinforcing the 9 x 6 answer and remembering that the digits of the product would also add up to 9, I was able to master it.
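The digit-sum check mentioned above can be sketched in a few lines of Python (the `digit_sum` helper is my own name, not anything standard): for 9 times any number from 1 through 10, the digits of the product sum to 9.

```python
# Digit-sum check for the nines table: for n = 1..10, the digits
# of 9 * n always sum to 9, which gives a quick sanity check on
# a remembered answer like 9 x 6 = 54.
def digit_sum(n: int) -> int:
    """Sum the decimal digits of n."""
    return sum(int(d) for d in str(n))

for n in range(1, 11):
    product = 9 * n
    print(f"9 x {n} = {product}, digit sum = {digit_sum(product)}")
```

(Beyond 9 x 10 you have to sum the digits repeatedly, e.g. 9 x 12 = 108, 1 + 0 + 8 = 9.)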

You need to watch yourself doing the problem and develop a style. As an example: do the algebra first, then plug the numbers into the simplified equation and solve.

You need to drill yourself by redoing the problems that gave you trouble several more times to be sure you don't make the same mistake.

3. May 24, 2013

### Tyrion101

I've been working on linear inequalities, but no matter how hard I work, I just hit a wall. Are there any common mistakes that I can watch out for? Also, how do I know my answer is right? With equations there is just one answer.

4. May 24, 2013

### Staff: Mentor

Do you know the times table up to, say, 10 * 10 or 12 * 12? The mistakes you're making suggest to me that maybe you don't. When I was a teacher, there were a few teachers who derided the idea that students should have a good grasp of basic arithmetic, saying that the students could just use calculators.

IMO, those teachers should be sued for malpractice.

What do you mean "mix up my tables"?

5. May 24, 2013

### Tyrion101

So I don't know my tables because I mix them up when I have a long problem to do? I can do short problems in my head. If you read my original post, I gave examples.

6. May 24, 2013

### Staff: Mentor

There's a Nintendo DS game that can help strengthen your recall of these tables:

https://www.amazon.com/Personal-Trainer-Math-Nintendo-DS/dp/B001LNYM90

The basic idea is to fill in a multiplication table whose row and column headings are scrambled, completing all the squares. It's known as the Kageyama hundred-cell method.

I wasn't able to find a web version of it, but I did find a Wikipedia reference:

http://en.wikipedia.org/wiki/Profes...Training:_The_Hundred_Cell_Calculation_Method
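A minimal Python sketch of how such a drill sheet might be laid out, under the assumption (stated above) that rows and columns each hold the digits 1 through 10 in scrambled order and the student fills every cell with the row-times-column product; the function name `hundred_cell_grid` is my own:

```python
import random

# Generate a hundred-cell drill grid: scrambled 1-10 row and column
# headings, with each cell holding the row * column product (the
# answers a student would be asked to fill in).
def hundred_cell_grid(seed=None):
    rng = random.Random(seed)
    rows = rng.sample(range(1, 11), 10)
    cols = rng.sample(range(1, 11), 10)
    grid = [[r * c for c in cols] for r in rows]
    return rows, cols, grid

rows, cols, grid = hundred_cell_grid(seed=1)
print("    " + " ".join(f"{c:3d}" for c in cols))
for r, line in zip(rows, grid):
    print(f"{r:3d} " + " ".join(f"{v:3d}" for v in line))
```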

Last edited by a moderator: May 6, 2017

7. May 25, 2013

### Tyrion101

Thank you to those who tried to help. When I am fully caught up with all of my classwork, I will try out the cell-method thing. Until then, I've decided on a calculator, and on being very careful to check the numbers I write down against the numbers the calculator and the on-screen problem give me.

8. May 26, 2013

### symbolipoint

Memorize the basic multiplication facts up through 10 or 12. I too still forget a few of them, but I just think carefully and fill in any forgotten fact, since I know how multiplication works. 8 x 6? Oh! I forget... but 8 x 3 is 24, and 8 x 6 is just twice that, so double 24 to get 48. 9 x 8? Ohhh! Forgot. Well, I know 9 x 4 is 36, and this should be 9 x 4 twice, so add 36 to 36 to get 72.
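symbolipoint's doubling trick can be written out as a tiny Python sketch (the `double_up` name and signature are mine, purely for illustration): rebuild a forgotten fact like 8 x 6 from an easier one, 8 x 3, by doubling.

```python
# Rebuild a forgotten times-table fact by doubling a known one:
# 8 x 6 from 8 x 3, or 9 x 8 from 9 x 4.
def double_up(a, known_b, target_b):
    """Compute a * target_b from a * known_b by repeated doubling,
    assuming target_b is known_b times a power of two."""
    product, b = a * known_b, known_b
    while b < target_b:
        product, b = product * 2, b * 2
    assert b == target_b, "target_b must be known_b * 2^k"
    return product

print(double_up(8, 3, 6))   # 8 x 3 = 24, doubled -> 48
print(double_up(9, 4, 8))   # 9 x 4 = 36, doubled -> 72
```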

9. May 29, 2013

### Staff: Mentor

I would strongly advise memorizing the multiplication facts (as symbolipoint recommends, above) as soon as possible, rather than relying on a calculator. It's very easy to enter a number incorrectly, which is a guarantee of a wrong answer. Also, not knowing how to do simple multiplication makes it impossible to get a rough estimate of what an answer should be.

Last edited: Jun 1, 2013
10. Jun 1, 2013

### lurflurf

Whether students should have a good grasp of basic arithmetic depends on what is meant by good, grasp, basic, and arithmetic. What is clear is that doing hand calculations is an error-prone waste of time. Anyone who thinks otherwise should do a thousand divisions of ten-digit numbers by hand and then score their time and accuracy. It is also amusing that hand calculation fans always assume that hand calculations never err, when they err often. They are also oddly particular in their examples: things like 45*19 and such. Where are the hand calculation fans when serious calculating is to be done? Some day someone will factor RSA-1024 = 135066410865995223349603216278805969938881475605667027524485143851526510604859533833940287150571909441798207282164471551373680419703964191743046496589274256239341020864383202110372958725762358509643110564073501508187510676594629205563685529475213500852879416377328533906109750544334999811150056977236890927563

I believe that person(s) will be machine assisted, but go ahead hand calculation fans, prove me wrong.

11. Jun 1, 2013

### Staff: Mentor

There's no need to parse the words individually. "Grasp" is pretty much self-explanatory. "Basic arithmetic" is commonly understood to consist of addition, subtraction, multiplication, and division. By "good grasp of basic arithmetic" what I had in mind is being able to add any two single-digit numbers and to know the times table up to 10 * 10 or 12 * 12.
They might be error prone for some, but I disagree that they are a waste of time. People who are unable to do simple arithmetic as described above are likewise incapable of recognizing when they have entered a number into a calculator incorrectly, since they cannot do a quick order-of-magnitude check on their work.

This is well beyond what anyone would consider basic arithmetic.
Who assumes this? People make mistakes in everything they do. This is the reason for double-checking (or even triple-checking) your work.
And back in the early 1990s, when the Pentium processor came out, how was it that someone discovered that some division operations were incorrect in the fifth decimal place and beyond? If your device is giving incorrect results, how can you tell if you don't know how to do the computation by hand?
And if they are totally dependent on that machine, and it breaks, the battery goes dead, or they lose it, what then? That person will be unable to do anything. A person who knows how to add, subtract, multiply, and divide, is still able to perform a wide range of computations.

12. Jun 1, 2013

### SteamKing

Staff Emeritus
It's a sad fact of modern mathematical pedagogy that insufficient attention is paid to ensuring that students learn and become familiar with basic arithmetic operations and numerical facts like addition and multiplication. I think this is why the OP seems confused that there are two arithmetics: one for addition, and another for multiplication. In my schooling, back when giants roamed the earth, we were introduced to the multiplication table by the process of counting by twos, threes, etc. After this was drilled into us sufficiently, it was only natural to proceed to constructing and understanding the multiplication table.

Sadly, in only a couple of generations, you now see heated and bitter debates within the educational community about the need for students to practice in becoming competent at arithmetic and whether the basic algorithms (long division, manipulation of fractions, etc.) should be taught at all. Just throw the kids a calculator and don't bother the teacher with a lot of questions about math.

13. Jun 2, 2013

### lurflurf

I do not think your "good grasp of basic arithmetic" is commonly understood. One-digit multiplies and two-digit adds should be within most people's ability, but that's a pretty limited toolbox. If it is fine to reach for a calculator for 45*19, I don't see why it is so bad to reach for one for 9*7, or to think 9*7 = (8+1)(8-1) = 8^2 - 1^2 = 64 - 1 = 63. It might take a few tenths of a second longer, but it demonstrates understanding in a way that rote memorization does not. And again, order-of-magnitude and other checks can be used with a calculator; in fact, it is easier to check for errors. For example, a Pentium chip user could catch the error in the well-known example 4195835 / 3145727 by computing (4195835 / 3145727) * 3145727 - 4195835.
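Both checks described above can be sketched in Python (the `diff_of_squares` helper is my own name for the trick). The difference-of-squares identity rewrites 9*7 as (8+1)(8-1) = 8^2 - 1, and the residual (a/b)*b - a exposes a flawed divider: on a correct FPU it is essentially zero, while the buggy Pentium FDIV unit returned a quotient for 4195835 / 3145727 that was off in the fifth decimal place.

```python
# Trick 1: compute 9 * 7 as (8+1)(8-1) = 8^2 - 1.
def diff_of_squares(m):
    """Compute (m+1) * (m-1) as m*m - 1."""
    return m * m - 1

assert diff_of_squares(8) == 9 * 7 == 63

# Trick 2: the residual (a/b)*b - a as a division self-check.
a, b = 4195835, 3145727
residual = (a / b) * b - a
print(residual)  # essentially 0 on a correct FPU
```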

If a hand calculation is impossible, it cannot be used to compute or check a result. Machine calculations are used to check machines. A person equipped with a twenty-year-old computer and ignorant of its well-known flaws is still better off than William Shanks. You have admitted that ten-digit division is impractical by hand, so even with some lost accuracy we are better off with a Pentium, and even more so if we know about the error and avoid it.
Pilots, doctors, carpenters, and others are totally dependent on having appropriate tools. They are wise to maintain them and have spares available. Most important is the fact that their tools allow them to do things they could never do without them. Some amount of practice for equipment failure might be wise, but only a small amount.

14. Jun 2, 2013

### Staff: Mentor

They should be within the abilities of most, but in the case of the OP, I don't think they are. As SteamKing reported, there is a big controversy in the math education establishment about whether these simple concepts of basic arithmetic are important. One side in this controversy uses bumper sticker phrases such as "drill and kill" to downplay the importance of having a basic set of arithmetic skills on hand (i.e., memorized).

It seems to me that only in education do we find this silly argument. If you look at other endeavors, such as sports or music, it is well understood that you have to put in many hours of practice before you can get to a level of competence, let alone a mastery of the sport or instrument. The hours spent at practice are analogous to the hours spent mastering arithmetic facts.
I don't have any problem with someone using a calculator to multiply 45 and 19, provided that this person would be able to do this problem using only paper and pencil. As far as writing 9*7 as (8+1)*(8-1), that's a creative way to do things, but a person who doesn't know the multiplication table would likely be as stumped by 8*8 as by 9*7, so I'm not sure that's a valid argument.
Agreed, it is much easier, but that's not my point. As I already mentioned in an earlier post, a person who is completely dependent on a calculator for all arithmetic problems will be completely unable to function if the calculator is lost, stolen, broken, or otherwise unavailable. A person with some basic skills in arithmetic will take more time, but has a chance of completing the problem.
Who checks the machines that are checking the machines? The quote attributed to the poet Juvenal comes to mind: "Quis custodiet ipsos custodes?" Who watches the watchmen? What if your calculator happens to be using the same flawed chip?

As far as the well-known example you cited (of which I am very well aware, having written an article that was published in a popular computer magazine at the time), any person who is competent at long division could carry out that division. It would take some time, but it's doable.
But I didn't say it was impossible.
But at the time the error was discovered, most people didn't know about the problem, and therefore couldn't avoid it. By the time the problem was discovered, Intel had shipped a large number of Pentium processors. The recall cost Intel somewhere in the neighborhood of $1,000,000,000.

We're not talking about pilots, doctors, or carpenters here. It's safe to assume that pilots and doctors have many years of academic training, from which we can be reasonably sure that they are competent at arithmetic. We can also assume that carpenters and others in the trades have received training in their areas. Again, I'm not concerned about these people - I'm concerned about kids in the primary grades who manage to get through the first six years of school without becoming competent at ordinary arithmetic. In my view, this is an indictment of our (US) education system. The so-called "educators" who push the nonsense that knowledge of arithmetic is unimportant are deeply misguided, IMO.

15. Jun 2, 2013

### lurflurf

^The so-called "educators" who push the nonsense that knowledge of arithmetic is so important that it should be practiced to the exclusion of other things are deeply misguided, IMO. The main thing is a cost-benefit analysis. Being rather moderate on the issue, I believe that most children should practice arithmetic for a few hundred hours between the ages of seven and eleven or so. This is more than enough for practical purposes, and more time spent would be wasted. If this practice is missed for some reason, it may be done later. In this time a few will (for reasons such as disability) not gain much skill, but they should move on anyway.

Many like myself who do not "know the times tables" do not need to. It is quite rare that I do a thousand one-digit multiplies in a day, so bringing my time down from, say, five minutes to two is a waste of effort. I am not "stumped" by any multiplication facts I do not know, as I can figure them out as needed.
In fact, if I did "know" them, my speed would not improve much, as I would still check mentally for correctness. Those who "know them" do things like 5*7=55 all the time. Even so, most errors occur in multiplies with more digits anyway. I would say practicing hand calculations is more like practicing air guitar than guitar: it is pointless even if you become quite good.

When checking calculations, they should be checked in a different way. This is true of both hand and machine calculation. My example of (4195835 / 3145727) * 3145727 - 4195835 reveals the Pentium chip error; a different chip is not needed. I computed 4195835 / 3145727 by hand today to ten digits. It took me half an hour, and my answer was off by more than the Pentium's. My error checking found and corrected the error in ten minutes, but still the Pentium is better, as flawed and old as it is.

Hand calculation fans love to add and multiply (maybe because they are easy). They do not seem to like solving equations or computing special functions. If it is all right to do those things with a machine, why not add and multiply too?

16. Jun 2, 2013

### lurflurf

I just remembered that Asimov on Numbers by Isaac Asimov has a humorous take on useless arithmetic. In particular, you may be interested in the now public-domain A new and complete system of arithmetic, composed for the use of the citizens of the United States by Nicolas Pike, which is chock full of useless arithmetic. I just think that, given limited time, it is better to learn a little arithmetic and many other things rather than lots of arithmetic.

http://archive.org/details/newcompletesyste00pikerich

17. Jun 3, 2013

### Tobias Funke

The difference between 45*19 and 9*7 is that the latter is made up of single-digit factors. It's not an arbitrary distinction between "big" and "small" numbers; it's the crux of the matter. Memorizing the single-digit multiplication table lets you, in principle, do any multiplication without any further memorization.
That's the whole beauty of the multiplication algorithm, and it's a shame that "knowing arithmetic" is interpreted as simply memorizing tables rather than understanding the idea behind the algorithm, or that arithmetic itself is not thought of as "real math". What a perfect, age-appropriate example of mathematical thinking we have in the arithmetic algorithms (whatever variant you choose), and it just goes to waste.

By the way, I don't have every single entry in the table memorized, especially the sevens for some reason. But I can do something like symbolipoint described and still have my times tables written almost as fast as my hand will go. It doesn't have to be instant recall, but not being able to do 12*4 within 5 seconds without a calculator is pretty bad. Despite mathematicians wildly exaggerating by claiming that they can't multiply or figure out a tip, I think they're almost all pretty good at arithmetic and severely underestimate just how bad a lot of other people are, especially when it comes to fractions.

Also, your choice of nine times seven is quite a coincidence, since Asimov also wrote "The Feeling of Power", about a future where nobody can do any arithmetic without the help of machines. The rediscovery of this ability isn't exactly beneficial, though.

18. Jun 3, 2013

### Mark44

### Staff: Mentor

Of course you are entitled to your opinion. Whether mastery of arithmetic is less important than "other things" depends on what those other things are.

This is yet another straw-man argument. I never once advocated doing a thousand single-digit multiplies.

Kind of belies their claim to knowing them, doesn't it?

Not surprising, in light of your admission that you don't know the times table.

As I mentioned before, Intel issued a recall of those chips, so it is very unlikely that there are many of the flawed chips still out there.

??? Based on what evidence?
In any case, the point of this thread was a minimum level of competence at arithmetic, which is limited to the operations of addition, subtraction, multiplication, and division of numbers. In my 21 years of teaching experience, people who had trouble with arithmetic had even more difficulty in algebra and trig, not to mention other areas of mathematics.

19. Jun 4, 2013

### lurflurf

^It is not opinion: hand arithmetic is less important than algebra, calculus, geometry, logic, and statistics. People use those things for many productive purposes. There is a use for an expert in nonassociative ring theory, say; there are no opportunities for an expert in hand arithmetic. I cannot name even one.

If you do not advocate doing thousands of single-digit multiplies, it weakens your case. Repetition is needed to develop the skill you value, and the skill is useless if it is not used. If one does not multiply much, a second is fast enough; why waste time getting faster? You do not need to "know" the times tables. Your "drill and kill" movement devalues understanding, which I object to philosophically, but as a practical matter understanding is a safety net; rote learning leads to strange mistakes.

Don't back off your Pentium chip now; it was your best point, despite the fact that the mistake is easily avoided either by using appropriate error checking or by using a chip less than twenty years old. The reason it was even a story is that it was such a surprising error; hand calculation errors do not make the newspaper. As to my own error, it had nothing to do with times tables: at the 10501080 - 9437181 = 1063899 step I got 1073619. You did not even credit me for catching it. Too bad I was overconfident; if I had checked as I went instead of at the end, I would not have propagated the error. You have still not told me how to avoid becoming a modern-day William Shanks, should I take up hand calculation.
The evidence is that all the hand-calculation-fan rants are about long division and addition and such. Please link to some rants that encourage more complex hand calculation if you know of any. I see no reason that dividing by hand is noble while taking square roots is not. Just use a calculator for both.

You have not commented on my above-linked A new and complete system of arithmetic, composed for the use of the citizens of the United States by Nicolas Pike. Back then, basic arithmetic was useful (and more inclusive).

Your experiences may have biased you. I have known many people, including scientists, tenured mathematics professors, and disabled people, who had trouble with arithmetic and few difficulties in algebra and trig, not to mention other areas of mathematics. Much older research on the topic had flawed methodologies: subjects with arithmetic difficulties were either not instructed or not tested in other areas of mathematics.

The OP cites 10/2 = 8 and 10*5 = 3 as typical mistakes. I do not know what causes those mistakes, but they could well be logic errors rather than arithmetic errors.

Last edited: Jun 4, 2013

20. Jun 4, 2013

### Mark44

### Staff: Mentor

I never claimed that arithmetic was more important than algebra, etc. You said it was less important than "other things", and I said it depended on what those other things are. You seem to throw up a great many straw-man arguments, with objections to points I didn't make. I did not advocate doing thousands of single-digit multiplications.

What I am saying, since you seem to have difficulty understanding my point, is that it is important, in my opinion and that of many others, for students in the primary grades to know how to do arithmetic. This entails, in part, being able to add, subtract, multiply, and divide numbers of a reasonable size - without the use of a calculator or other computing device. I didn't define "reasonable size", but it doesn't include 1024-bit numbers.
It probably would include numbers with 10 or fewer digits.

Another straw man. I am not advocating memorization of, say, the arithmetic facts and the times table at the expense of understanding. What I'm saying is that mastery of the basic operations of arithmetic is the foundation on which much of the more advanced areas of mathematics depends. Just as when a house is built, if the foundation is weak, the house won't last as long.

As I mentioned earlier, I was a teacher for 21 years. Toward the end of that time I started hearing the same arguments you are making, including a movement in Portland, OR, to issue calculators to kindergarteners. It amazes me that some of these "educators" are unable to draw parallels with other life endeavors such as sports and music, to name just a couple. To excel in these areas requires a lot of practice of basic operations, even for those who have a natural talent for them. Someone who has to reason through how to catch or hit a ball (in baseball), or grab a B7 chord (guitar), is not likely to become accomplished in that endeavor. These skills need to be ingrained in "muscle memory", just as 6*8 needs to be in memory, so that the brain can take on more complicated tasks.

I'm not backing off what I said about the Pentium chip, and I don't know why you think I was. As far as the problem being easily avoided, I don't think so. The Pentium chips that had the problem didn't consistently give incorrect answers; for the vast majority of possible divisions, they produced the correct answer. If a device gives consistently wrong answers, it doesn't take long for someone to realize they are erroneous. However, if the incorrect answers come infrequently, it's much more difficult to notice them. Does your appropriate error checking include checking every single floating-point arithmetic operation performed by the chip? That's costly in terms of performance.

What's your point?
I have no problem with people using a calculator to perform arithmetic operations, provided that they are able to do them by hand for those times when a calculator is not available. On the other hand, all of those students who are the beneficiaries of the "enlightened" pedagogy that you favor, and who can't do arithmetic, will be dead in the water.

As far as taking square roots, many of us learned how to do this with paper and pencil, and I still remember it, even though it's been a good long while. If it comes down to it, I can calculate, using only paper and pencil, the square root of a number to any desired precision, something the vast majority of calculators can't do. Of course, if I need to do a square root, I reach for a calculator.

It's still useful for lots of people, such as carpenters, machinists, surveyors, and many others.

So what? The fact that these people had trouble with arithmetic, algebra, trig, and so on does not seem to me to be a good argument for dispensing with arithmetic in the lower grades.

Examples?

Which makes them no less errors. When I saw them, I inquired into his arithmetic capabilities, thinking he might be one of those unfortunates who made it all the way through the US education system without being able to instantly recognize that 10/2 is NOT 8, or that 10*5 is NOT 3.

21. Jun 5, 2013

### lurflurf

I may misunderstand your position, but if I do, it is a sincere misunderstanding. I will clarify my own. Hand arithmetic is about the least useful skill in the world. I have tried to frame my opinion in factors of ten; a disagreement of less than a factor of ten is not significant. Still, my position is quite moderate. My experience and typical educational practices suggest reasonable guidelines: a student should practice arithmetic for somewhere between 25.7 and 257 hours in their formative years.
A reasonable amount of practice for a day is between 100 and 1000 one-digit multiplies (on some days, skills other than multiplication would be practiced). Single-digit multiplies (e.g., 7*8=56) are a fair metric: they are practiced alone at first, then used to carry out multiplications with more digits (e.g., 97243*345=33548835 requires 15 single-digit multiplies), fractions, algebra, and so forth. I think 1000 one-digit multiplies (by hand) in a day is silly, but I have seen it assigned; still, it is hard to object strongly, as it takes less than ten minutes without "knowing your times tables". Improving on this seems not worthwhile.

I thought speed was the point of all this "drill and kill" talk, but you make an interesting point that speed is not the goal, but rather making multiplication automatic so that the brain can take on more complicated tasks. I would be interested in any studies on this point. I think the analogy is flawed: sports and music involve timing, speed, and automatic response in a way mathematics does not. A few seconds' stumble can ruin a musical or sporting performance; taking 74 seconds to solve a 70-second math problem is inconsequential. Also, mathematics is a thinking pursuit; thinking for a few seconds, even about a trivial detail, is very much in keeping with its spirit. Arithmetic is not of key importance in solving problems.

I think we can agree that students should practice arithmetic a bit, and if they meet your standards, whatever they are, all the better. If a student practiced arithmetic a bit and was still bad at it, I would give them a calculator and tell them to move on to more important subjects without giving it a second thought. As I understand it, you would tell them to keep working on it and wait until they get it before moving on. That seems crazy to me. You claim such practice is not at the expense of understanding, but time spent in rote practice is not spent gaining understanding. I cannot see how the two are not in opposition.
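The single-digit-multiply metric above can be made concrete with a toy Python count (the `single_digit_multiplies` name is mine, for illustration): schoolbook long multiplication costs one single-digit multiply per pair of digits, so a 5-digit by 3-digit product like 97243*345 costs 15 of them.

```python
# Count the single-digit multiplies in schoolbook long
# multiplication: one per (digit of a, digit of b) pair.
def single_digit_multiplies(a: int, b: int) -> int:
    return len(str(abs(a))) * len(str(abs(b)))

print(single_digit_multiplies(97243, 345))  # 5 digits x 3 digits -> 15
```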
In the past, many results could only be reached by a select few who calculated fast, accurately, and for a long time. Machines allow the average person to reach those heights, as well as higher heights that no human could reach by hand calculation. Arithmetic facts are not the foundation of mathematics; they are trivial. No deep results follow from 4*7=28.

22. Jun 5, 2013

### lurflurf

Proper practice with the Pentium chip (after first considering whether division could be avoided altogether, or whether ignoring the error was safe) would be to catch the divides that trigger the error and shift them into an error-free zone. Sure, there is a performance penalty, but sometimes using a known-defective twenty-year-old chip has consequences. In general, calculations should be performed at least twice, as independently as possible (preferably on different computer lines). Sometimes results can be checked cheaply. Machine arithmetic is a skill worth learning, unlike hand arithmetic. In any case, hand calculations need to be checked as well, and machine calculations are very cheap. In terms of performance, a defective Pentium beats any human.

You have no problem with people using a calculator to perform arithmetic operations, provided that they are able to do them by hand? Why would one learn a skill in order not to use it? There is a cost to learning a skill: you buy a book, take a class, spend time, give up learning some other skill instead. Calculating square roots by hand is not a very practical skill. "Those times when a calculator is not available" is a straw man. Get a spare, get a spare for the spare, carry extra batteries, work in an office with extra computers, have a backup generator, and so on. Very few calculations can be done by hand in reasonable time. Even an expert in hand calculation will be dead in the water when serious work is to be done.
As for examples of flawed methodologies, this article summarized the problem well: "Is it possible that their main difficulties are restricted to numeracy and in mental computation and do not involve the entire domain of mathematics? Little can be said about mathematical skills that require few computations, but more logic because almost nobody tries to teach these kinds of skills to students with Down syndrome. On the contrary, teachers insist on building up 'the basics', which fall down as soon as the pupil gives an answer that makes no sense and then the teachers start all over again." The obvious thing to do when a student has problems in arithmetic is to teach them other things, yet this is not done. Also: A new and complete system of arithmetic, composed for the use of the citizens of the United States by Nicolas Pike, and William Shanks.

23. Jun 5, 2013

### pwsnafu

Question for lurflurf: in your opinion, should we teach integration techniques despite the fact that a computer is superior to a student?

24. Jun 5, 2013

### lurflurf

I would say integration techniques should be taught, in that each illustrates an important idea, and they facilitate easy examples like sin(3x). I think practicing them at length (especially on complicated examples) is a waste of time. For example, calculating

$$\int (\tan x)^{37/59} \, dx$$

is not very instructive.

25. Jun 5, 2013

### Mark44

### Staff: Mentor

Why would anyone use a 20-year-old CPU that doesn't work right? You have completely misunderstood my point. The flaw in the first Pentium processors was discovered in 1994 or thereabouts, and was widely publicized at the time. The manufacturer, Intel, spent ~$1,000,000,000 recalling these chips. I would be very surprised to hear that anyone is still using one of these problematic chips, so advice on how to work around the problem is not useful, IMO.

If we bring things up to the current time, and find that we have a CPU that is unable to perform some kinds of arithmetic, your advice still doesn't seem very good. If an application user requires some division to be performed, how can it be avoided? Certainly a/b is the same as a * (1/b), but we're still going to need to divide 1 by b.

Since the error in the old Pentium chips produced some results that were incorrect in the 5th decimal place and beyond, if your precision requirements were loose enough, then yes, you could ignore the error.

As far as shifting errors to an "error-free zone" I have no idea what you mean by this.

This is silly. Your assertion that 1,000,000 incorrect answers are somehow better than 1 correct answer obtained in the same time is nonsensical. You don't seriously believe this, do you?
Correct.
The point is that they would use it. For example, when I put gas in my motorcycle this morning, I did a mental division of the miles I had gone since I filled up the last time, and the number of gallons I put in. For those people who are so ill-educated that they don't know single-digit products, this would be an impossibility. In fact, most order-of-magnitude estimates would be over their heads.
It isn't now, but it used to be when I was taught it.
Speaking for myself, there are many times when I don't have a calculator, so that is NOT a straw-man argument. (I suspect that you don't understand what this term means.) Despite our best intentions, there are times for most people when no device is available, charged, or working.
The quote above is talking about teaching people with Down syndrome, so how is it germane to the discussion? I am talking about standards that would apply to most students, not a small minority of folks with lower mental capacity.
You seem to find this funny, as I think you have mentioned it three times. The actual arithmetic in it seems OK to me. The parts that are less useful, IMO, are the parts that involve conversion from pints to pecks and the like.

Last edited: Jun 5, 2013