What exactly is your question? I suspect it is to find \lim_{n\to \infty} a_n where a_n = (27+ \pi) \left(\frac{1}{\sqrt{1}} + \frac{1}{\sqrt{2}} + ... + \frac{1}{\sqrt{n}}\right).
If so, there are several ways of doing it, of which I will show you two: the first is elegant, but the second is more general (and proves your p-series rules as well). I will also leave out the factor of (27 + \pi), since multiplication by a constant does not affect convergence.
---------------
The first is to compare a_n to a sequence which is smaller term by term (with equality only in the last term). Let this sequence be b_n = \sqrt{n} = \frac{n}{\sqrt{n}} = \left( \frac{1}{\sqrt{n}} + \frac{1}{\sqrt{n}} + ... + \frac{1}{\sqrt{n}} \right) \leftarrow \mbox{ n terms}.
We can see that, since the denominator of each term of b_n is at least as large as the denominator of the corresponding term of a_n, we have a_n \geq b_n (with strict inequality for n > 1).
But \lim_{n\to \infty} b_n = \lim_{n\to \infty} \sqrt{n}, which diverges. Hence, a_n also diverges.
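For anyone who wants to see the comparison numerically, here is a quick sanity check in Python (the function name is mine, purely for illustration):

```python
import math

def partial_sum(n):
    """a_n without the (27 + pi) factor: 1/sqrt(1) + 1/sqrt(2) + ... + 1/sqrt(n)."""
    return sum(1 / math.sqrt(k) for k in range(1, n + 1))

# Compare the partial sums with b_n = sqrt(n) at a few values of n.
for n in (10, 100, 10_000):
    print(n, round(partial_sum(n), 3), round(math.sqrt(n), 3))
```

In every row the partial sum exceeds \sqrt{n}, exactly as the term-by-term comparison predicts, and both columns grow without bound.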
-----------
The more general solution is to compare the series to its integral.
Take the more general problem of finding the convergence of the series \sum_{t=1}^{\infty} \frac{1}{t^a} = \zeta (a). It is obvious that if a \leq 0, the terms do not tend to 0, so the series diverges. So let's restrict this problem to a > 0.
If we construct a graph of y = 1/x^a, we see that it is a monotonically decreasing function, the general shape being similar to that of a rectangular hyperbola. Now, on the graph, construct vertical lines at x = n-1, x = n and x = n+1, where n is an integer greater than 1.
Now construct the rectangle with vertices (n-1,0), (n,0), (n-1, \frac{1}{n^a}) and (n, \frac{1}{n^a}). Geometrically, the area of this rectangle is less than the area under the curve between x = n-1 and x = n, and greater than the area under the curve between x = n and x = n+1. Moreover, the area of the rectangle is \frac{1}{n^a}. Algebraically, this follows from the fact that the function is monotonically decreasing. Either way, interpreting the area under the curve as an integral, we arrive at the following inequality:
\int^{n+1}_n \frac{dx}{x^a} < \frac{1}{n^a} < \int^n_{n-1} \frac{dx}{x^a}.
Summing this inequality for n = 2, 3, ..., m, then adding 1 (the n = 1 term of the series) to each expression, we obtain*:
1+ \int^{m+1}_2 \frac{dx}{x^a} < \sum_{k=1}^m \frac{1}{k^a} < 1 + \int^m_1 \frac{dx}{x^a}.
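As a sanity check, this sandwich inequality can be verified numerically. A minimal sketch (function names are mine, and it assumes a \neq 1 so the power-rule antiderivative applies):

```python
def integral(lo, hi, a):
    """Exact value of the integral of x^(-a) from lo to hi, assuming a != 1."""
    return (hi ** (1 - a) - lo ** (1 - a)) / (1 - a)

def sandwiched(a, m):
    """Check 1 + int_2^{m+1} < sum_{k=1}^m 1/k^a < 1 + int_1^m."""
    s = sum(1 / k ** a for k in range(1, m + 1))
    return 1 + integral(2, m + 1, a) < s < 1 + integral(1, m, a)

print(sandwiched(0.5, 1000), sandwiched(2, 1000))  # both True
```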
If we let m increase beyond all bounds, the summation term approaches \zeta (a), the original series in question. We must consider three separate cases when evaluating the integrals directly with the Fundamental Theorem of Calculus: 0 < a < 1, a = 1 and a > 1.
Considering each case separately;
If 0 < a < 1, then the exponent 1 - a in the anti-derivative \frac{x^{1-a}}{1-a} is positive, so the limits of integration appear in the numerator. Since m increases beyond all bounds, so do both integrals.
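To spell this step out (for any a \neq 1),

\int^m_1 \frac{dx}{x^a} = \left[ \frac{x^{1-a}}{1-a} \right]^m_1 = \frac{m^{1-a} - 1}{1-a},

and for 0 < a < 1 the exponent 1 - a is positive, so this increases beyond all bounds with m (the integral from 2 to m+1 behaves the same way).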
So \zeta (a) does not converge to any finite limit if 0 < a < 1.
If a = 1, then the integral on the left evaluates to \log_e \left( \frac{m+1}{2} \right), whilst the integral on the right evaluates** to \log_e m; both increase beyond any finite bound as m increases.
If a > 1, then the exponent 1 - a is negative, so the limits of integration appear in the denominator of the anti-derivative, and as the upper limit increases, both integrals tend to finite limits, which we substitute into the inequality:
1 + \frac{1}{2^{a-1}\cdot (a-1)} < \zeta (a) < \frac{a}{a-1},
valid for a > 1.***
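These bounds are easy to check against partial sums of the series; a small sketch (function names are mine, and the partial sum is only an approximation of \zeta (a)):

```python
def bounds(a):
    """Lower and upper bounds on zeta(a) from the inequality above; assumes a > 1."""
    lower = 1 + 1 / (2 ** (a - 1) * (a - 1))
    upper = a / (a - 1)
    return lower, upper

def zeta_partial(a, m=100_000):
    """Partial sum approximating zeta(a)."""
    return sum(1 / k ** a for k in range(1, m + 1))

for a in (2, 3):
    lo, hi = bounds(a)
    print(a, lo, round(zeta_partial(a), 5), hi)
```

For a = 2 this prints the bounds 1.5 and 2.0 with the partial sum comfortably between them.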
We still need to rule out the possibility that the sum does not tend to a limit: though it cannot diverge to positive or negative infinity, it might conceivably oscillate between the two sides of the inequality. However, since each term is positive, the partial sums are monotonically increasing, and a monotonically increasing sequence that is bounded above must converge.
We may finally conclude that \zeta (a) tends to a limit
if and only if a > 1.
-----------------
*To prove this result rigorously, rather than relying on geometric intuition, we can evaluate those integrals directly and prove the inequality that way; but with that method there is no obvious motivation for the step, so it seems like one is only doing it because one already knows the result beforehand. Geometrically, we can see a motivation, hence it is best to discover the step geometrically, then prove it rigorously afterwards.
---------------------
** I specifically use the word "evaluates" rather than my usual preference of "is defined to be". Normally I define the natural logarithm by \log_e x= \int^x_1 \frac{dt}{t}, however in this proof we needed to use the property that as x goes to infinity, so does its logarithm. Using my usual definition would then make this argument circular. So instead, in this proof I define the natural logarithm to be the inverse function of the exponential function, which I define as e^x = \sum_{n=0}^{\infty} \frac{x^n}{n!} from which we can easily see that it is a monotonically increasing function with no upper bound, and hence so is its inverse.
-------------
***From this, we do not merely obtain the conditions of convergence for such series; we can also estimate the value of the series, with bounds on the error. A crude estimate would be the arithmetic mean of the left and right sides of the inequality. We get a rather better estimate from the weighted geometric mean \sqrt[3]{p^2 q}, where p is the lower bound and q is the upper, but justifying this estimate requires some more in-depth numerical analysis.
An easy application of this is the case a = 2. Then we can see that \frac{3}{2} < \zeta (2) < 2. Indeed, if we were to take the Fourier series of the second Bernoulli polynomial B_2(t) and substitute t = 0, we would arrive at the result \zeta (2) = \frac{\pi^2}{6}, which one can verify satisfies the inequality. The error when using the arithmetic mean of the left and right sides in this case is approximately 0.105, which I personally think is quite good considering all the more precise calculations we could have done, though omitted for the sake of the already lost brevity of this post. The error also decreases quite rapidly as a increases, thanks to the exponential in the denominator of the lower bound.
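Both estimates are easy to reproduce for a = 2 (here I take the weighted geometric mean to be \sqrt[3]{p^2 q}, a cube root; that reading is mine):

```python
import math

p, q = 1.5, 2.0                    # bounds on zeta(2) from the inequality above
true_value = math.pi ** 2 / 6      # zeta(2) = pi^2 / 6

arith = (p + q) / 2                # crude arithmetic-mean estimate
geo = (p ** 2 * q) ** (1 / 3)      # weighted geometric mean

print(round(abs(arith - true_value), 3))  # ~0.105
print(round(abs(geo - true_value), 3))    # ~0.006
```

The geometric mean does land noticeably closer to \frac{\pi^2}{6} here, consistent with the claim above.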
------------------------
OK! Out of textbook writers mode, though it has been fun =]
Maybe I should write a textbook

Or we could put this in the tutorial section as a p-series tutorial!

Anyway, I just hope someone learns something from this post, or else I just wasted a whole heap of time! To everyone else, ratings of rigor are welcome =]