William Lowell Putnam Competition

vsage
Has anyone competed in, or attended an institution that participated in, this competition? My school doesn't appear to have been involved since 2001, which is a shame, because I'd really like to pit myself against people from other schools, or heck, just challenge myself. Anyway, I came across a problems archive and was doing a few practice problems, so really this post is less about the competition and more about whether I'm ready. This was question A-1 on the 1995 test (which asks: given a set $S$ of real numbers that is closed under multiplication and written as a disjoint union $S = T \cup U$, where the product of any three elements of $T$ is in $T$ and the product of any three elements of $U$ is in $U$, show that at least one of $T$, $U$ is closed under multiplication). I think I have a solution, but I'm not sure it is "rigorous". (The official solution to problem A-1 is located at http://www.unl.edu/amc/a-activities/a7-problems/putnam/-pdf/1995.pdf .)

Let $a, b, c \in T$. Then $abc \in T$, i.e. $(ab)c \in T$.

Let $d, e, f \in U$. Then $def \in U$, i.e. $d(ef) \in U$.

Assume $ab \in U$. Then $(ab)ef \in U$, i.e. $ab(ef) \in U$.

For this to be true, $a, b \in U$; but since $U$ and $T$ are disjoint, this is a contradiction, so $ab \in T$.

Let $g = ab \in T$. Then $gc \in T$.

Is it proven? Please pardon the bad LaTeX; I will edit this post if it doesn't come out right. Well, come to think of it, I don't need that $g = ab$ part, right?
 
Well, you almost nailed it, if I'm interpreting your proof correctly...

If you like solving Putnam problems, I'd recommend you see:
www.kalva.demon.co.uk

Also check rec.puzzles, where many of the Putnam problems get solved.

-- AI
 
Sorry, it seems not proven to me.

First, a little unrelated matter: you denote the sets $T$, $U$ as if they were finite. This is not possible unless they only contain elements among $-1$, $0$, and $1$. Otherwise, if there are two elements greater than $1$ (or less than $-1$), multiply the two largest in absolute value to get something larger than anything in the set; if there are two nonzero elements strictly between $-1$ and $1$, multiply the two of smallest absolute value to get something smaller still.

Now, if $ab(ef) \in U$ and $ef \in U$, it does not follow that $a, b \in U$; or at least, it needs more work to show that.
Finally, showing that if $g = ab \in T$, then $gc \in T$ for all $c$, is not sufficient. You must show this for all $g$, or otherwise show that every element of $T$ can be written as a product of two other elements of $T$.

Here's a simpler proof:
WLOG suppose $T$ is not closed under multiplication (otherwise we're done); then there exist $a, b \in T$ such that $ab \notin T$. Since $ab \in S$ but $ab \notin T$, we have $ab \in U$.
Assume that $U$ is also not closed under multiplication. Then there exist $c, d \in U$ such that $cd \in T$ (as before).
Now $a$, $b$, and $cd$ are in $T$, hence $abcd \in T$. But $c$, $d$, and $ab$ are in $U$, hence $abcd \in U$. This contradicts the fact that $T$ and $U$ are disjoint.
Heh. I can't believe I got this... someone prove me wrong, or I might regret not writing the Putnam!
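Since the argument above uses only associativity and commutativity of multiplication, it can be sanity-checked by brute force over a small finite commutative monoid such as $\mathbb{Z}/n$ under multiplication mod $n$. This is my own sketch, not part of the original posts; the function names are made up for illustration:

```python
from itertools import product

def closed_under_mult(A, n):
    # A is product-closed in Z/n if x*y mod n stays in A for all x, y in A.
    return all((x * y) % n in A for x in A for y in A)

def triple_closed(A, n):
    # The Putnam hypothesis: a*b*c mod n stays in A for all a, b, c in A.
    return all((a * b * c) % n in A for a in A for b in A for c in A)

def check(n):
    """Check the 1995 A-1 statement over (Z/n, *): for every partition
    S = T u U with both parts triple-closed, at least one part must be
    closed under multiplication. Return a counterexample or None."""
    S = range(n)
    for mask in product([0, 1], repeat=n):
        T = {x for x in S if mask[x]}
        U = {x for x in S if not mask[x]}
        if triple_closed(T, n) and triple_closed(U, n):
            if not (closed_under_mult(T, n) or closed_under_mult(U, n)):
                return (T, U)  # should never happen, by the theorem
    return None

# No counterexample exists for any small modulus.
for n in range(1, 9):
    assert check(n) is None
```

Exhausting all $2^n$ partitions is cheap for small $n$, and every case passes, consistent with the contradiction argument.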
 
zefram_c said:
Sorry, it seems not proven to me.

First, a little unrelated matter: you denote the sets $T$, $U$ as if they were finite. This is not possible unless they only contain elements among $-1$, $0$, and $1$. Otherwise, if there are two elements greater than $1$ (or less than $-1$), multiply the two largest in absolute value to get something larger than anything in the set; if there are two nonzero elements strictly between $-1$ and $1$, multiply the two of smallest absolute value to get something smaller still.

Now, if $ab(ef) \in U$ and $ef \in U$, it does not follow that $a, b \in U$; or at least, it needs more work to show that.
Finally, showing that if $g = ab \in T$, then $gc \in T$ for all $c$, is not sufficient. You must show this for all $g$, or otherwise show that every element of $T$ can be written as a product of two other elements of $T$.

Yes, I realized later that my definitions of $U$ and $T$ were ultimately incorrect (and unnecessary for the proof). I'm still learning (1st-year college student) and haven't been able to take any set theory or logic classes, so I appreciate your input on this matter. I'm rethinking my post right now so I can resubmit it for criticism :). Edit: Bah, but I really do see the flaw: I assumed for some reason that $ef \in U$, which is dumb because it's assuming what I'm trying to prove!
 
OK, I am ready to try my luck again! I am really tired now, though, so I am not sure if I am getting further away from the solution or not.

Suppose there are $a, b, c \in T$ and $d, e, f \in U$,
so $abc \in T$ and $def \in U$
(the given).

Suppose $ab \notin T$; then $ab \in U$ for some $a, b \in T$, because $T \cup U = S$.
Suppose $ef \notin U$; then $ef \in T$ for some $e, f \in U$, because $T \cup U = S$.

Then $(ab)ef \in U$ and $ab(ef) \in T$, from the first and second suppositions, respectively.

This implies that $(ab)(ef) \in U$ AND $(ab)(ef) \in T$,
which is a contradiction since $U$ and $T$ are disjoint.
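For the record, the two memberships in the step above can be justified explicitly from the three-element hypothesis (my own spelling-out, not part of the original posts):

```latex
\begin{align*}
(ab)\,e\,f &\in U && \text{(three factors } ab,\, e,\, f \in U\text{)}\\
a\,b\,(ef) &\in T && \text{(three factors } a,\, b,\, ef \in T\text{)}
\end{align*}
```

Both products equal $(ab)(ef)$ by associativity, which is what forces the contradiction with $T \cap U = \varnothing$.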

Am I closer? :)
 
vsage said:
Am I closer?
If you highlight the seemingly empty space in my previous post, you'll see :smile:
 
zefram_c said:
If you highlight the seemingly empty space in my previous post, you'll see :smile:

Very nice! We have nearly the same proof, I see. Thank you so much (and thanks to tenaliraman for the link).
 
