1. The problem statement, all variables and given/known data

Hi everyone! My colleague and I are working our way through Harold J. Larson's "Introduction to Probability Theory and Statistical Inference: Third Edition", and we found something interesting. We both have the same edition of the text, but mine is slightly newer, and one problem differs between our two otherwise identical copies: problem 7 of chapter 3.4.

His reads: The factorial moment generating function for a discrete random variable Y is ψ_Y(t) = e^(t−1). Find the probability function (the cumulative probability function, I presume) for Y.

Mine is the same type of question, except that ψ_Y(t) = (e^t − 1)/(e − 1).

I suppose my first question is: how could these both possibly have the same answer / be the same problem? The solution given in the back is the same in both our books.

2. Relevant equations

The book gives a number of relevant equations:

ψ_X(t) = E[t^X] = E[e^(X ln t)] = m_X(ln t)

This I think I follow, since E[·] is just the expected value (mean), and m_X(t) is the moment generating function. It also says

(d^k/dt^k) ψ_X(t) |_{t=0} = k! p_X(k),

that is, the kth derivative of the factorial moment generating function evaluated at t = 0 is k! times the probability function for X evaluated at k. This leads me to suspect this is how we'd go about finding an answer to the question.

3. The attempt at a solution

The answer in the back of the book is p_X(k) = 1/k!, k = 1, 2, 3, . . . The extent of our attempt to get there has been along these lines:

(Using his version of the problem.) The derivative of e^(t−1) is just e^(t−1). Evaluated at t = 0, you get e^(−1). The same goes for all subsequent derivatives, so every derivative evaluates to the same constant. Nowhere is a k! showing up.

(Using my version of the problem.) The derivative of (e^t − 1)/(e − 1) is e^t/(e − 1). Evaluated at t = 0, you get 1/(e − 1). Again, every subsequent derivative gives the same value. Where is the k! supposed to come in?
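In case it helps anyone check our arithmetic, here is a quick sanity check of the derivative computations above using sympy (variable names are mine; this only verifies the values of ψ^(k)(0), it doesn't resolve our confusion):

```python
import sympy as sp

t = sp.symbols('t')

# The two versions of the factorial mgf from the two printings of the problem
psi_his = sp.exp(t - 1)                  # psi_Y(t) = e^(t-1)
psi_mine = (sp.exp(t) - 1) / (sp.E - 1)  # psi_Y(t) = (e^t - 1)/(e - 1)

# k-th derivative evaluated at t = 0, for the first few k:
# every one comes out to the same constant, e^(-1) and 1/(e-1) respectively
for k in range(1, 5):
    d_his = sp.simplify(sp.diff(psi_his, t, k).subs(t, 0))
    d_mine = sp.simplify(sp.diff(psi_mine, t, k).subs(t, 0))
    print(k, d_his, d_mine)
```

So the computation does confirm what we found by hand: the kth derivative at t = 0 is constant in k for both versions.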
An explanation of either of our versions of the problem would be much appreciated, and/or an explanation of how they're really the same (I'm hoping that one of our books isn't just plain wrong).