
Is it common for scientists to forget basic math?

  1. May 24, 2010 #1

    Simfish

    User Avatar
    Gold Member

    Basic math, such as...

    Deriving the quadratic equation, taking the derivative of a logarithm in a base other than e, taking integrals of trigonometric substitutions, remembering what a subspace actually means, not remembering how to derive the general solution for a first order differential equation, etc..?
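    Take one of those as a concrete case (just writing out the standard one-liner): the derivative of a log in another base falls straight out of the change-of-base formula,

    ```latex
    \log_b x = \frac{\ln x}{\ln b}
    \quad\Longrightarrow\quad
    \frac{d}{dx}\log_b x = \frac{1}{\ln b}\cdot\frac{1}{x} = \frac{1}{x\ln b},
    ```

    so it's really one remembered fact plus one known derivative.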

    Of course, it's much quicker to do those things when you look back at them again (and you don't have to struggle to understand anything anymore). But still, sometimes I worry about my memory problems or something.

    Is it even common for them to forget the same thing multiple times? Like, forgetting it once, relearning it, then forgetting it again four years later?
     
  3. May 24, 2010 #2
    I don't know about the average scientist, since I have no data on this, and I haven't done most of those things recently. But I feel pretty confident that I could do any of the things you listed above either immediately or after reviewing for ten minutes or so. Actually, on my field theory final last year, I got a differential scattering cross section which had a bunch of trigonometric terms, and I decided to integrate it by hand just for fun (it was a take-home, so time wasn't an issue). Didn't seem to have any problems there. But then, I was also a math major, so I spent a bit more time doing this stuff than your average physicist.
     
  4. May 24, 2010 #3
    Yes, lol, just the other day I forgot whether it is okay to square both sides of an equation when solving it. lol.

    Derivatives and integrals are probably re-derived off the top of people's heads. I do not think everyone will remember all of them.
     
  5. May 24, 2010 #4
    I think it's natural to forget things you don't use very often or to have the occasional slip-up. I forget negative signs all the time. I'm not sure how often you're using what you mentioned, but if it's not very often then I wouldn't worry about it. It's pretty human to have to dust off the cobwebs every once in a while, and we all make mistakes, sometimes pretty silly ones.
     
  6. May 24, 2010 #5
    My dad says that every math grad student forgets a big chunk of calculus sometime while working towards his PhD.

    You could have forgotten some stuff due to stress. If that's the case, review a little math using an old textbook or the internet, and take a short vacation.
     
  7. May 24, 2010 #6

    Landau

    User Avatar
    Science Advisor

    Well, yes: if you only remember the facts, and don't understand where they come from or why they are true, then you keep forgetting them. If you don't understand WHY you can square both sides of an equation, and are comfortable with some authority telling you "yes it is valid" without question, then in a year or so you'll have no idea again. If you don't understand why a subspace is defined as it is (or why it has the properties it has), you'll forget it again. Etc.
    That's why you should always ask "why", and derive every result yourself to the last detail. It takes a lot of time, but once you fully understand something, you will probably never forget it again.

    Of course, details like the sum formulae for trigonometric functions are things you will forget if you don't use them regularly. That's ok, as long as you know how to derive them, e.g. using complex exponentials.
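    To spell out the derivation I have in mind (the standard Euler's-formula argument):

    ```latex
    e^{i(a+b)} = e^{ia}e^{ib}
    = (\cos a + i\sin a)(\cos b + i\sin b)
    = (\cos a\cos b - \sin a\sin b) + i(\sin a\cos b + \cos a\sin b),
    ```

    and comparing real and imaginary parts with e^{i(a+b)} = \cos(a+b) + i\sin(a+b) gives both sum formulae at once.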

    But as a math student, I may be biased.

    I'm pretty sure the average physics student spends more time integrating trigonometric functions by hand than the average math student.
     
  8. May 24, 2010 #7
    I never mentioned anything about rote learning. I understand the concepts, but drawing them out takes a bit of effort. Besides, in school you just have to accept what they tell you, unfortunately; if you spend too much time on the theory, you are going to lag behind, get poor grades, and you won't get into any university. You don't really have a choice (at least in my school).
     
  9. May 24, 2010 #8

    Landau

    User Avatar
    Science Advisor

    You mentioned 'scientists', so I assumed you were talking about university. But anyway, my reply stays the same: if you understand the concepts (well), then you should have no trouble deriving the results, although it may take some time to fill in the details.

    If you do have trouble deriving the results, repeat: after a month, test yourself again. Also, focus on the main idea. For example, in deriving the solution to the quadratic equation, there's only one phrase you have to remember: completing the square. The algebraic details are not important; you only need to know this 'trick' works.
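    For completeness, the algebra that the phrase 'completing the square' compresses:

    ```latex
    ax^2 + bx + c = 0
    \;\Longrightarrow\;
    \left(x + \frac{b}{2a}\right)^2 = \frac{b^2 - 4ac}{4a^2}
    \;\Longrightarrow\;
    x = \frac{-b \pm \sqrt{b^2 - 4ac}}{2a}.
    ```

    Once you recall the trick, every intermediate step reconstructs itself.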

    Edit: I now realize bignum is not the same person as the OP :)
     
    Last edited: May 24, 2010
  10. May 24, 2010 #9
    That's true in physics and probably even more so in engineering. We just have to accept things mathematicians discover and apply them, for the most part. However, if you work on a math minor, as I am, you get to see something of the world of mathematicians, which is largely not a computational world. My vector analysis professor liked to just start deriving formulas from first principles in class when he couldn't recall something off the top of his head.
     
  11. May 24, 2010 #10
    I usually forget.
    That's why it's always more interesting to remember how to derive things instead of the equations themselves.
     
  12. May 24, 2010 #11
    Actually I am going for a double major in math and physics, so I know where you are coming from when you say people just accept what they learn. It happens all the time, but the build-up from high school forces people to do so; it is not really our choice. Look at me, for instance: my grades suck because I spend all my time building intuition (going beyond the curriculum) instead of following what the curriculum sets for us, and I lag behind (I know this is very difficult to understand). For quadratics, I am a very lazy person, I just plug them into my graphing calculator and solve, lol.
     
  13. May 24, 2010 #12

    Hurkyl

    User Avatar
    Staff Emeritus
    Science Advisor
    Gold Member

    Just to emphasize what everyone has been saying about learning the "why"... did you remember that squaring both sides can introduce new solutions that aren't solutions to the original equation?
     
  14. May 24, 2010 #13
    Not sure what you meant, but when I thought about it, I went back to the definition of squaring: multiplying each side of the equation by itself. But obviously I am introducing two new factors which aren't necessarily equal on both sides. I ended up multiplying it by its conjugate.

    Is that what you meant?
     
  15. May 24, 2010 #14

    Hurkyl

    User Avatar
    Staff Emeritus
    Science Advisor
    Gold Member

    A lot of algebraic manipulations to equations are "reversible" -- for example, adding equations:
    Theorem: if "a=b", then the two equations "c=d" and "c+a=d+b" have the same solution set.

    Other operations aren't reversible -- squaring is one of them. We have the following theorem:
    Theorem: Any solution to "a=b" is a solution to "a²=b²".
    but not the following:
    NotATheorem: Any solution to "a²=b²" is a solution to "a=b".

    For example, x=1 has only one solution for x, but x²=1 has two solutions for x.


    Most operations you can do to an equation are not reversible, but the most useful ones are -- e.g. adding a constant, or multiplying by a non-zero constant. The operations of multiplying by a variable or squaring are notorious, because they aren't reversible, and beginners tend to forget that. E.g. many "1=0" proofs involve trying to reverse one of those operations.
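    A quick way to see this in action (my own illustration, using the standard radical equation sqrt(x) = x - 2): squaring gives x² - 5x + 4 = 0, whose roots are 4 and 1, but only one of them solves the original equation.

    ```python
    import math

    # Squaring sqrt(x) = x - 2 yields x^2 - 5x + 4 = 0, with roots 4 and 1.
    candidates = [4.0, 1.0]

    # Keep only the roots that satisfy the ORIGINAL (pre-squaring) equation.
    solutions = [x for x in candidates if math.isclose(math.sqrt(x), x - 2)]

    print(solutions)  # x = 1 is extraneous: sqrt(1) = 1, but 1 - 2 = -1
    ```

    The squaring step silently adjoined x = 1 to the solution set, which is exactly the non-reversibility described above.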
     
  16. May 24, 2010 #15
    In the second theorem couldn't you view the additional a and b on their respective sides as an "instantaneous constant"? And so that's why it's reversible? I hope that makes sense.
     
  17. May 25, 2010 #16
    When do you ever fully understand something? Russell spent a hundred pages of his Principia proving that 1+1=2, and people very much question his proofs. So even if you derived everything as far down as Russell did for every part of mathematics, you would still never fully understand the mathematics you use. Remember Socrates: we know *nothing* (not exactly, not fully). Physics students need only go through "Maths for Physicists", not read a bookcase full of equivalent pure maths books. They don't have to prove everything; what are mathematicians for?! You forget things you fully understand anyway. You fully understand simple French sentences, like "The sun is setting over the library". At one time you could perhaps translate this, but you can't now (I bet!). Or history dates. Knowing the date *is* understanding it, but I bet you forgot a whole pile of them.
     
  18. May 25, 2010 #17
    Every baby understands 1+1=2; not being able to prove it from a given set of axioms does not mean that you do not understand why it is true.

    Btw, you don't understand what understanding means...
    That is not what you mean by understanding in maths. Understanding a language means that you memorized the words; understanding maths is very different from memorizing formulas. You understand parts of maths once everything becomes so clear that you intuitively feel it can't possibly be any other way.

    And when you learn a language well enough so that you think in it rather than translating to yourself you will never forget it.
     
    Last edited: May 25, 2010
  19. May 26, 2010 #18

    MathematicalPhysicist

    User Avatar
    Gold Member

    It's true by definition, so what is there to understand in 1+1=2?
    For me I+I=II is more intuitive than 1+1=2.
     
  20. May 26, 2010 #19
    II=2 by definition...

    And the reason mathematicians try to prove it in elaborate ways is because they want as slim an axiom system as possible.
     
  21. May 26, 2010 #20

    MathematicalPhysicist

    User Avatar
    Gold Member

    Didn't say otherwise, though I don't see any way that proves it; even in set theory you don't prove it. I mean, you define the natural numbers by starting from the empty set and the set that contains the empty set, and so on. That's just another way to look at it; it doesn't prove it, though.
    How can you prove a definition?
     