
William Lowell Putnam Competition

  1. Oct 16, 2004 #1
    Has anyone competed in this competition, or been part of an institution that participated in it? My school doesn't appear to have been involved past 2001, which is a shame, because I would really like to pit myself against people from other schools, or, heck, just challenge myself. Anyway, I came across a problems archive and was doing a few practice problems, so really this post is less about the competition and more about whether I am ready. This was question A-1 on the 1995 test, and I think I have a solution, but I'm not sure it is "rigorous". (The official solution to problem A-1 is located at http://www.unl.edu/amc/a-activities/a7-problems/putnam/-pdf/1995.pdf )
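
    For reference, here is the problem as I understand it (my paraphrase; check the link above for the exact wording): let [tex]S[/tex] be a set of real numbers closed under multiplication, and let [tex]T[/tex] and [tex]U[/tex] be disjoint subsets of [tex]S[/tex] with [tex]T \cup U = S[/tex], such that the product of any three (not necessarily distinct) elements of [tex]T[/tex] is in [tex]T[/tex], and likewise for [tex]U[/tex]. Show that at least one of [tex]T, U[/tex] is closed under multiplication.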

    Let [tex]a, b, c \in T[/tex]
    [tex]abc \in T[/tex]
    [tex](ab)c \in T[/tex]

    Let [tex]d, e, f \in U[/tex]
    [tex]def \in U[/tex]
    [tex]d(ef) \in U[/tex]

    Assume [tex](ab) \in U[/tex]
    then [tex](ab)ef \in U[/tex]
    [tex]ab(ef) \in U[/tex]

    For this to be true, [tex]a, b \in U[/tex]; but since U and T are disjoint, this is a contradiction, so [tex]ab \in T[/tex].
    Let [tex]g = ab \in T[/tex]

    [tex]gc \in T[/tex]

    Is it proven? Please pardon the bad LaTeX; I will edit this post if it doesn't come out right. Well, come to think of it, I don't need that g = ab part, right?
    Last edited by a moderator: Oct 17, 2004
  2. Oct 17, 2004 #2
    Well, you almost nailed it, if I'm interpreting your proof correctly...

    If you like solving Putnam problems, I would recommend:
    1> www.kalva.demon.co.uk

    Also check rec.puzzles, where many of the Putnam problems get solved.

    -- AI
  3. Oct 17, 2004 #3
    Sorry, it seems not proven to me.

    First, a little unrelated matter: you denote the sets T, U as if they were finite. That is not possible unless they contain only elements of {-1, 0, 1}: if T contains an element a with |a| > 1, then a, a^3, a^9, ... are distinct elements of T (each is a product of three elements of T), so T is infinite; likewise, if T contains a nonzero a with |a| < 1, those powers are ever-smaller distinct elements of T. The same goes for U.

    Now, if ab(ef) is in U and (ef) is in U, it does not follow that a, b are in U -- or it needs more work to show that it does.
    Finally, showing that if g = ab is in T, then gc is also in T for every c, is not sufficient. You must show this for every g in T, or otherwise show that every element of T can be written as a product of two other elements of T.
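
    By the way, to see why the conclusion is "at least one of T, U" rather than "both", here is a concrete instance (my own example, not from the exam): take [tex]S = \mathbb{R} \setminus \{0\}[/tex], [tex]T = \{x \in S : x > 0\}[/tex], [tex]U = \{x \in S : x < 0\}[/tex]. A product of three positives is positive and a product of three negatives is negative, so the hypothesis holds; T is closed under multiplication, but U is not, since e.g. [tex](-1)(-2) = 2 \in T[/tex].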

    Here's a simpler proof:
    Suppose, toward a contradiction, that neither T nor U is closed under multiplication. Then there exist a, b in T s.t. ab is not in T; since ab is in S but not in T, ab is in U.
    Likewise, since U is not closed under multiplication, there exist c, d in U s.t. cd is not in U, so cd is in T (as before).
    Now a, b and (cd) are in T, hence abcd is in T. But c, d and (ab) are in U, hence abcd is in U. This contradicts the fact that T, U are disjoint, so at least one of T, U must be closed under multiplication.
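
    If you want a quick sanity check of the statement itself (not a proof!), note that the argument above uses nothing about the real numbers beyond associativity of multiplication, so the claim can be tested exhaustively on a small finite analogue such as multiplication mod n. A rough script along those lines (Python; the function names and the choice of [tex]\mathbb{Z}_n[/tex] are mine):

    [code]
from itertools import product

def check_statement(n):
    """Exhaustively test the A-1 statement on the semigroup (Z_n, * mod n):
    for every split of S = {0, ..., n-1} into disjoint T and U, if the
    product of any three elements of T lies in T (and likewise for U),
    then at least one of T, U should be closed under pairwise products."""
    S = range(n)

    def triple_closed(A):
        return all((x * y * z) % n in A for x, y, z in product(A, repeat=3))

    def pair_closed(A):
        return all((x * y) % n in A for x, y in product(A, repeat=2))

    for mask in range(2 ** n):  # each bitmask picks T; the rest is U
        T = {x for x in S if mask >> x & 1}
        U = {x for x in S if not mask >> x & 1}
        if triple_closed(T) and triple_closed(U):  # hypothesis holds...
            assert pair_closed(T) or pair_closed(U), (T, U)  # ...so must the conclusion

for n in range(1, 8):
    check_statement(n)
print("no counterexample found for n = 1..7")
    [/code]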

    Heh. I can't believe I got this... someone prove me wrong, or I might regret not writing the Putnam!
  4. Oct 17, 2004 #4
    Yes, I realized later that my definitions of U and T were ultimately incorrect (and unnecessary for the proof). I'm still learning (first-year college student) and haven't been able to take any set theory or logic classes, so I appreciate your input on this matter. I'm rethinking my post right now so I can resubmit it for criticism :). Edit: Bah, I really do see the flaw now: I assumed for some reason that [tex]ef \in U[/tex], which is dumb because it assumes what I'm trying to prove!
    Last edited by a moderator: Oct 17, 2004
  5. Oct 17, 2004 #5
    OK, I am ready to try my luck again! I am really tired now, though, so I am not sure whether I am getting closer to the solution or further away from it.

    Take any [tex]a, b, c \in T[/tex] and [tex]d, e, f \in U[/tex],
    so [tex]abc \in T[/tex] and [tex]def \in U[/tex]
    (the given).

    Suppose [tex]ab \notin T[/tex] for some [tex]a, b \in T[/tex]; then [tex]ab \in U[/tex] because [tex]T \cup U = S[/tex].
    Suppose [tex]ef \notin U[/tex] for some [tex]e, f \in U[/tex]; then [tex]ef \in T[/tex] because [tex]T \cup U = S[/tex].

    Then [tex](ab)ef \in U[/tex] and [tex]ab(ef) \in T[/tex], from the first and second suppositions, respectively.

    This implies that [tex](ab)(ef) \in U[/tex] AND [tex](ab)(ef) \in T[/tex],
    which is a contradiction since U and T are disjoint. So the two suppositions cannot both hold: either [tex]ab \in T[/tex] for all [tex]a, b \in T[/tex], or [tex]ef \in U[/tex] for all [tex]e, f \in U[/tex]; that is, at least one of T, U is closed under multiplication.

    Am I closer? :)
    Last edited by a moderator: Oct 17, 2004
  6. Oct 17, 2004 #6
    If you highlight the seemingly empty space in my previous post, you'll see :smile:
  7. Oct 17, 2004 #7
    Very nice! We have nearly the same proof, I see. Thank you so much (also, thanks for the link, tenaliraman).