# Hall's Marriage Theorem

1. Apr 2, 2009

### bodensee9

Hello,

I need to show the following for a bipartite graph G = (X, Y, E). Write

|R(A)| = the size of the neighborhood of a subset A of X, i.e., the number of vertices of Y adjacent to some vertex in A,
|A| = the cardinality of A.

Then I must show: if |R(A)| >= |A| - |X| + t for every subset A of X, then G has a matching of size t.

I thought of the following argument, but it seems fishy:

I know that if I want a matching saturating X, then |R(A)| has to be at least |A| for every subset A of X (Hall's condition).

So suppose I add |X| - t new vertices to Y and join each of them to every vertex of X. In this augmented graph every neighborhood grows by |X| - t, so the hypothesis |R(A)| >= |A| - |X| + t gives |R(A)| + (|X| - t) >= |A| for every A. Hall's condition holds, so the augmented graph has a matching saturating X, i.e., of size |X|. But suppose I then remove the vertices I just added. Each removed vertex destroys at most one edge of the matching, so at most |X| - t matched edges are lost, leaving at least

|X| - (|X| - t) = t

matched edges lying entirely inside the original graph. Is that the right way to see it? Would this give me the matching of size t, because originally I had a matching of size |X|, but since removing the added vertices subtracts at most |X| - t edges, I keep at least |X| - |X| + t = t of them?
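To convince myself, here is a small sketch (not from any textbook, names are my own) that checks the dummy-vertex construction on a toy example, using a standard augmenting-path maximum-matching routine (Kuhn's algorithm):

```python
def max_matching(adj, n_left, n_right):
    """Maximum bipartite matching via augmenting paths (Kuhn's algorithm).
    adj[u] lists the right-side neighbours of left vertex u."""
    match_right = [-1] * n_right  # match_right[v] = left vertex matched to v

    def try_augment(u, seen):
        for v in adj[u]:
            if v in seen:
                continue
            seen.add(v)
            # v is free, or its current partner can be rematched elsewhere
            if match_right[v] == -1 or try_augment(match_right[v], seen):
                match_right[v] = u
                return True
        return False

    size = 0
    for u in range(n_left):
        if try_augment(u, set()):
            size += 1
    return size, match_right

# Toy graph: X = {0,1,2}, Y = {0,1}; it satisfies |R(A)| >= |A| - |X| + t
# with t = 2, but has no matching saturating X (|Y| < |X|).
adj = [[0], [0, 1], [1]]
n_x, n_y, t = 3, 2, 2

# Add |X| - t dummy vertices to Y, each joined to every vertex of X.
dummies = n_x - t
adj_aug = [nbrs + [n_y + d for d in range(dummies)] for nbrs in adj]
size_aug, match_right = max_matching(adj_aug, n_x, n_y + dummies)
assert size_aug == n_x  # Hall's condition holds in the augmented graph

# Deleting the dummies kills at most |X| - t matched edges, so at least
# t matched edges survive inside the original graph.
surviving = sum(1 for v in range(n_y) if match_right[v] != -1)
print(surviving >= t)  # True
```

Here the dummy vertices play exactly the role of the |X| - t added vertices in the argument above: they soak up the deficiency so Hall's condition holds, and discarding them costs at most one matched edge each.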

Thanks.