I wrote the following Python script to see how $$\sum_{n=1}^\infty f(n) = \sum_{n=1}^\infty \frac{\Gamma(n+1/2)}{n^2 \Gamma(n)}$$ behaves. However, it appears to have some problem with my function f(n) for n = 172 and above, throwing this error:

I thought there was a problem with the function (such as a division by zero), so I tried WolframAlpha, which returns a nonzero value for [itex]\Gamma(172)[/itex] and a finite value for f(172).
I even tried a user-defined gamma function I had built in the past, and I got the same result for n = 172... (it's highly unlikely that my method is the same as SciPy's).

Code (Python):

from scipy.special import gamma

def f(n): return gamma(n+0.5)/(n**2 * gamma(n))

def sume(f, N):
    result = 0.0
    for i in range(1, N+1):
        result += f(i)
    return result

print f(172)
'''
for i in range(100, 200):
    print i, " : ", sume(f, i)
'''

gamma(172) = 171! is a huge number. It is probably just too large for the computer's double-precision floating-point format. The limit is about [itex]1.8\times10^{308}[/itex], while gamma(172) is about [itex]1.2\times10^{309}[/itex]. So that is where the number gets too large for the computer to handle.
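One common way around this (a sketch, not the only option) is to work with the logarithm of the gamma function instead: the ratio [itex]\Gamma(n+1/2)/\Gamma(n)[/itex] is only about [itex]\sqrt{n}[/itex] for large n, so it fits comfortably in a double even though each gamma factor alone overflows. SciPy provides `scipy.special.gammaln` for exactly this purpose:

```python
import math
from scipy.special import gammaln  # log of |Gamma(x)|, never overflows for moderate x

def f(n):
    # Gamma(n + 1/2) / Gamma(n) computed as exp of a difference of
    # log-gammas, so no intermediate value overflows a double.
    return math.exp(gammaln(n + 0.5) - gammaln(n)) / n**2

def sume(f, N):
    # Partial sum of the series up to N.
    return sum(f(i) for i in range(1, N + 1))

print(f(172))        # finite now, roughly sqrt(172)/172**2
print(sume(f, 200))
```

Since [itex]\Gamma(n+1/2)/\Gamma(n)\sim\sqrt{n}[/itex], the terms behave like [itex]n^{-3/2}[/itex], so the partial sums should settle down as N grows rather than blowing up.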