
Homework Help: Hall's Marriage Theorem

  1. Apr 2, 2009 #1

    I need to show the following for a bipartite graph G(X, Y, E). Define

    R(A) = the set of vertices in Y adjacent to at least one vertex of A, for a subset A of X;
    |A| = the cardinality of A.

    If |R(A)| >= |A| - |X| + t for every subset A of X, then G has a matching of size t.

    I thought of the following, but it seems fishy:

    I know that a matching saturating X exists if and only if |R(A)| >= |A| for every subset A of X (Hall's condition).
    So suppose I add |X| - t new vertices to Y and connect each of them to every vertex of X. In the augmented graph, every nonempty subset A of X gains these |X| - t new neighbors, so

    |R'(A)| = |R(A)| + (|X| - t) >= (|A| - |X| + t) + (|X| - t) = |A|,

    and Hall's condition holds, which gives a matching saturating X, i.e. of size |X|. Now suppose I remove the vertices I just added. Each removed vertex destroys at most one edge of the matching, so at least |X| - (|X| - t) = t matched edges survive in the original graph.

    But I'm not sure this is right. Does removing the |X| - t added vertices really leave me with a matching of size |X| - (|X| - t) = t in the original graph?
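    As a sanity check, the augmentation idea can be sketched on a small made-up example (the graph, the value of t, and all names below are my own choices, not part of the problem; the matching routine is standard Kuhn's augmenting-path algorithm):

    ```python
    from itertools import chain, combinations

    # Hypothetical bipartite graph: X = {0, 1, 2}, Y = {'a', 'b'}.
    # Y is too small for an X-saturating matching, but a matching of
    # size t = 2 exists.
    adj = {0: {'a'}, 1: {'a', 'b'}, 2: {'b'}}
    X = list(adj)
    t = 2

    def neighborhood(A):
        """R(A): all Y-vertices adjacent to at least one vertex of A."""
        return set().union(*(adj[x] for x in A)) if A else set()

    def max_matching(adj, X):
        """Maximum bipartite matching via augmenting paths (Kuhn)."""
        match = {}  # maps a Y-vertex to the X-vertex matched with it

        def try_augment(x, seen):
            for y in adj[x]:
                if y in seen:
                    continue
                seen.add(y)
                if y not in match or try_augment(match[y], seen):
                    match[y] = x
                    return True
            return False

        for x in X:
            try_augment(x, set())
        return match

    # 1. The hypothesis |R(A)| >= |A| - |X| + t holds for every subset A of X.
    subsets = chain.from_iterable(combinations(X, r) for r in range(len(X) + 1))
    assert all(len(neighborhood(A)) >= len(A) - len(X) + t for A in subsets)

    # 2. Augment: add |X| - t dummy vertices to Y, each joined to every x in X.
    dummies = {f'dummy{i}' for i in range(len(X) - t)}
    aug = {x: adj[x] | dummies for x in X}

    # 3. The augmented graph satisfies Hall's condition, so it has an
    #    X-saturating matching of size |X|; deleting the dummies removes at
    #    most |X| - t matched edges, leaving a matching of size >= t.
    full = max_matching(aug, X)
    assert len(full) == len(X)
    restored = {y: x for y, x in full.items() if y not in dummies}
    print(len(restored))  # → 2, a matching of size t in the original graph
    ```

    This only checks one example, of course; the general argument still needs the counting step above.
    
    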

