## proving combinatorics identities

Is it always possible to prove combinatorial identities in a brute force way, as opposed to the preferred way of seeing that the RHS and LHS are two different ways of counting the same thing? For example, the identity

$$\binom{n-1}{k-1} + \binom{n-1}{k} = \binom{n}{k}$$

can be seen either from the counting interpretation of 'n choose k' or by writing the binomial coefficients explicitly in terms of factorials and adding fractions, which is the brute force way. In the slightly more complicated identity
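The brute-force route is easy to check mechanically. A minimal Python sketch (my own illustration, not from the thread) that verifies Pascal's rule straight from the factorial formula:

```python
from math import factorial

def binom(n, k):
    # "Brute force": binomial coefficient straight from the factorial formula
    if k < 0 or k > n:
        return 0
    return factorial(n) // (factorial(k) * factorial(n - k))

# Pascal's rule: C(n-1, k-1) + C(n-1, k) = C(n, k)
for n in range(1, 10):
    for k in range(n + 1):
        assert binom(n - 1, k - 1) + binom(n - 1, k) == binom(n, k)
print("Pascal's rule holds for all n < 10")
```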

$$\sum_{k=0}^{m}\binom{a}{k} \binom{b}{m-k} = \binom{a+b}{m}$$

can you prove that without using the fact that the RHS and the LHS are two ways of counting the number of ways of choosing m objects from a+b objects?
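Numerically, this identity (Vandermonde's) is easy to spot-check over a range of small parameters; a minimal Python sketch using `math.comb`:

```python
from math import comb

# Vandermonde's identity: sum_k C(a, k) C(b, m-k) = C(a+b, m)
# math.comb(n, k) returns 0 when k > n, which handles the edge terms.
for a in range(6):
    for b in range(6):
        for m in range(a + b + 1):
            lhs = sum(comb(a, k) * comb(b, m - k) for k in range(m + 1))
            assert lhs == comb(a + b, m)
print("Vandermonde's identity verified for all a, b < 6")
```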
 The proof isn't too hard if you set it up right. Consider two sets $A$ and $B$ [edit: two disjoint sets] where $|A| = a$ and $|B| = b$. We want to choose $m$ elements from the set $A \cup B$. If we want $k$ of the elements to be from $A$, how many options do we have?
 Yes, that is the combinatorial proof, which I am aware of (sorry for not being clear about that). Is there a more brute force, algebraic sort of way to prove it? Preferably an approach that works generally for combinatorial identities. I know that combinatorial proofs are preferable because they give the most insight, but what about when you can't think of a combinatorial proof? Is there a general way to proceed? For example, this identity $$\sum_{k=1}^{n}\frac{\binom{2n-2k}{n-k}\binom{2k}{k}}{2k-1}=\binom{2n}{n}$$ seems to be true, but I don't immediately see a combinatorial argument for it. I wonder how the symbolic math solvers do it. Maybe it's just all pre-programmed lookup tables.
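Whether or not a combinatorial argument is at hand, the identity can at least be tested numerically. A minimal Python sketch (my own check, using exact `Fraction` arithmetic to avoid floating-point issues with the division by $2k-1$):

```python
from fractions import Fraction
from math import comb

# Check: sum_{k=1}^{n} C(2n-2k, n-k) * C(2k, k) / (2k-1) = C(2n, n)
for n in range(1, 12):
    lhs = sum(Fraction(comb(2*n - 2*k, n - k) * comb(2*k, k), 2*k - 1)
              for k in range(1, n + 1))
    assert lhs == comb(2*n, n)
print("identity verified for n up to 11")
```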

You could use induction.

 Quote by daveyinaz You could use induction.
I've tried to prove that last identity by induction but couldn't figure out how. Is it generally possible to prove these identities by induction? Thanks.

 Quote by Adeimantus Is it always possible to prove combinatorial identities in a brute force way, as opposed to the preferred way of seeing that the RHS and LHS are two different ways of counting the same thing? For example, the identity $$\binom{n-1}{k-1} + \binom{n-1}{k} = \binom{n}{k}$$ can be seen either from the counting interpretation of 'n choose k' or by writing the binomial coefficients explicitly in terms of factorials and adding fractions, which is the brute force way. In the slightly more complicated identity $$\sum_{k=0}^{m}\binom{a}{k} \binom{b}{m-k} = \binom{a+b}{m}$$ can you prove that without using the fact that the RHS and the LHS are two ways of counting the number of ways of choosing m objects from a+b objects?
The second identity can be proven by first recognizing that

$$(x+y)^n=\sum_{k=0}^{n} \binom{n}{k} x^{k} y^{n-k}$$

then use the fact that

$$(x+y)^{a+b}=(x+y)^{a} (x+y)^{b}$$

Expand the terms using the summation form, then compare like terms to obtain your identity.
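The "compare like terms" step is a coefficient convolution, which can be carried out explicitly for small exponents. A minimal Python sketch of that expansion (the function name `poly_mul` is my own):

```python
from math import comb

def poly_mul(p, q):
    # Convolve coefficient lists: (p*q)[m] = sum_k p[k] * q[m-k]
    r = [0] * (len(p) + len(q) - 1)
    for i, pi in enumerate(p):
        for j, qj in enumerate(q):
            r[i + j] += pi * qj
    return r

a, b = 4, 5
pa = [comb(a, k) for k in range(a + 1)]  # coefficients of (1+x)^a
pb = [comb(b, k) for k in range(b + 1)]  # coefficients of (1+x)^b
# Comparing like terms: [x^m] (1+x)^a (1+x)^b = C(a+b, m)
assert poly_mul(pa, pb) == [comb(a + b, m) for m in range(a + b + 1)]
print("coefficient comparison checks out for a=4, b=5")
```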
 What you are looking for is called "generating functions", a topic from the larger field of Algebraic Combinatorics, whose aim is to develop systematic (even algorithmic) methods for proving these identities. This paper is a good introduction, if you're interested: http://www.math.rutgers.edu/~zeilber...PDF/enuPCM.pdf Another one is Wilf and Zeilberger's book A = B, available here: http://www.math.upenn.edu/~wilf/AeqB.html

 Quote by gburdell1 The second identity can be proven by first recognizing that $$(x+y)^n=\sum_{k=0}^{n} \binom{n}{k} x^{k} y^{n-k}$$ then use the fact that $$(x+y)^{a+b}=(x+y)^{a} (x+y)^{b}$$ Expand the terms using the summation form, then compare like terms to obtain your identity.

Okay, that makes a lot of sense. Thank you! I think this is an example of the generating function approach recommended by JSuarez. And thanks for digging up this post of mine. I had recently come back to counting, this time in connection with random walks. So it is perfect timing for me to think about these things again.

 Quote by JSuarez What you are looking for is called "generating functions", a topic from the larger field of Algebraic Combinatorics, whose aim is to develop systematic (even algorithmic) methods for proving these identities. This paper is a good introduction, if you're interested: http://www.math.rutgers.edu/~zeilber...PDF/enuPCM.pdf Another one is Wilf and Zeilberger's book A = B, available here: http://www.math.upenn.edu/~wilf/AeqB.html
Awesome. Thank you for those excellent (and free) resources!

 I think this is an example of the generating function
It is. The binomial theorem, in the form $\left(1+x\right)^n$, is the generating function for the binomial numbers with integer upper index $n \geq 0$ (and this also applies to more general binomial numbers).

One more thing: if the problem you are working on involves counting paths with specific properties, then you will need generating functions, and I can recommend another source (not free): Stanley's "Enumerative Combinatorics", particularly Vol. I.

 Quote by JSuarez One more thing: if the problem you are working on involves counting paths with specific properties, then you will need generating functions, and I can recommend another source (not free): Stanley's "Enumerative Combinatorics", particularly Vol. I.
Cool. I'll see if I can get it through interlibrary loan. I am trying to think of other ways to prove the result that for paths of length 2n, the number of paths with 2k edges above the time axis is the same as the number of paths whose last return to zero is at time 2k. They both have the discrete arcsine distribution. Feller, Vol I. has a neat inductive proof, but I would really like a bijective proof or even one using generating functions.
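For small path lengths, the claimed equidistribution can be checked by brute-force enumeration. A minimal Python sketch (my own; it assumes Feller's convention that a step from $s$ to $s'$ lies above the axis iff $\max(s, s') > 0$, and that both statistics equal $\binom{2k}{k}\binom{2n-2k}{n-k}$):

```python
from collections import Counter
from itertools import product
from math import comb

def above_steps(steps):
    # Count steps (edges) above the time axis: a step from s to s'
    # is on the positive side iff max(s, s') > 0.
    s, count = 0, 0
    for step in steps:
        t = s + step
        if max(s, t) > 0:
            count += 1
        s = t
    return count

def last_return(steps):
    # Time of the last visit to 0 (0 if the path never returns).
    s, last = 0, 0
    for i, step in enumerate(steps, start=1):
        s += step
        if s == 0:
            last = i
    return last

n = 4  # paths of length 2n = 8; 2^8 = 256 paths, fine to enumerate
paths = list(product([1, -1], repeat=2 * n))
dist_above = Counter(above_steps(p) for p in paths)
dist_return = Counter(last_return(p) for p in paths)
for k in range(n + 1):
    expected = comb(2 * k, k) * comb(2 * n - 2 * k, n - k)
    assert dist_above[2 * k] == dist_return[2 * k] == expected
print("both statistics follow the discrete arcsine distribution for 2n = 8")
```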

This is identity 3.92 on p. 37 of

Henry W. Gould, Combinatorial Identities, 1972

HF

 Quote by Adeimantus Yes, that is the combinatorial proof, which I am aware of (sorry for not being clear about that). Is there a more brute force, algebraic sort of way to prove it? Preferably an approach that works generally for combinatorial identities. I know that combinatorial proofs are preferable because they give the most insight, but what about when you can't think of a combinatorial proof? Is there a general way to proceed? For example, this identity $$\sum_{k=1}^{n}\frac{\binom{2n-2k}{n-k}\binom{2k}{k}}{2k-1}=\binom{2n}{n}$$ seems to be true, but I don't immediately see a combinatorial argument for it. I wonder how the symbolic math solvers do it. Maybe it's just all pre-programmed lookup tables.
 i don't understand what the problem is? the identities presented in the OP are proven by writing down what the choose coefficients are in terms of factorials and manipulating a little, aren't they?
 Another resource is Wilf's book "generatingfunctionology", which can be downloaded for free: http://www.math.upenn.edu/~wilf/DownldGF.html It doesn't cover as much material as Stanley's book, but it's more fun. Wilf's enthusiasm is contagious.

 Quote by Adeimantus In the slightly more complicated identity $$\sum_{k=0}^{m}\binom{a}{k} \binom{b}{m-k} = \binom{a+b}{m}$$ can you prove that without using the fact that the RHS and the LHS are two ways of counting the number of ways of choosing m objects from a+b objects?
gburdell1 already gave a generating function proof of this one. Here is an example of using generating functions to prove a similar but slightly harder identity:

$$\sum_{k=0}^{n}\binom{k}{r}\binom{n-k}{m-r} = \binom{n+1}{m+1}$$

First define the sequence of generating functions

$$f_r(x) = \sum_{k \geq 0} \binom{k}{r}x^k$$

and note that the sum in question is just the coefficient of $x^n$ in $f_r(x) f_{m-r}(x)$:

$$\sum_{k=0}^{n}\binom{k}{r}\binom{n-k}{m-r} = [x^n]f_r(x)f_{m-r}(x)$$

To find the form of the f_r, find a recurrence relation that they satisfy.

$$f_r(x) = \sum_{k \geq 0} \binom{k}{r}x^k = \sum_{k \geq 1} \left[ \binom{k-1}{r-1}+\binom{k-1}{r} \right]x^k$$ for r>0

$$f_r(x) = x(f_{r-1}(x)+f_r(x))$$ or

$$f_r(x) = \frac{x}{1-x}f_{r-1}(x)$$ for r>0

Note that $f_0(x) = 1 + x + x^2 + \cdots = 1/(1-x)$. Therefore the solution to the recurrence is

$$f_r(x) = \frac{x^r}{(1-x)^{r+1}}$$
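This closed form can be confirmed numerically by building the truncated power series of $x^r/(1-x)^{r+1}$ and comparing coefficients against $\binom{k}{r}$. A minimal Python sketch (the helper `mul_trunc` and truncation order `N` are my own):

```python
from math import comb

N = 15  # truncation order

def mul_trunc(p, q, n=N):
    # Truncated power-series product, keeping coefficients of x^0 .. x^{n-1}
    r = [0] * n
    for i, pi in enumerate(p[:n]):
        for j, qj in enumerate(q[: n - i]):
            r[i + j] += pi * qj
    return r

geom = [1] * N  # 1/(1-x) = 1 + x + x^2 + ...
for r in range(5):
    series = [1] + [0] * (N - 1)
    for _ in range(r + 1):
        series = mul_trunc(series, geom)   # now 1/(1-x)^{r+1}, truncated
    shifted = [0] * r + series[: N - r]    # multiply by x^r
    assert shifted == [comb(k, r) for k in range(N)]
print("f_r(x) = x^r/(1-x)^(r+1) reproduces the coefficients C(k, r)")
```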

Going back to the combinatoric identity, we are looking for

$$[x^n]f_r(x)f_{m-r}(x) = [x^n]\frac{x^m}{(1-x)^{m+2}}$$, which is independent of r.

$$[x^n]f_r(x)f_{m-r}(x) = [x^{n-m}]\frac{1}{(1-x)^{m+2}} = \binom{(n-m)+(m+2)-1}{(m+2)-1} = \binom{n+1}{m+1},$$

using the negative binomial expansion $[x^j](1-x)^{-s} = \binom{j+s-1}{s-1}$.
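As with the earlier identities, a quick numerical spot-check over small parameters is reassuring; a minimal Python sketch:

```python
from math import comb

# sum_{k=0}^{n} C(k, r) C(n-k, m-r) = C(n+1, m+1), independent of r
for n in range(10):
    for m in range(n + 1):
        for r in range(m + 1):
            lhs = sum(comb(k, r) * comb(n - k, m - r) for k in range(n + 1))
            assert lhs == comb(n + 1, m + 1)
print("identity verified for n < 10")
```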

 Quote by ice109 i don't understand what the problem is? the identities presented in the OP are proven by writing down what the choose coefficients are in terms of factorials and manipulating a little, aren't they?
If you read the original post, you will see that is exactly what the OP was asking.

EDIT:

For an interpretation of this identity, which I recently encountered, see this thread: