Ad 2a) It's just that the singularity of the integrand at ##y=x_1## is integrable. If ##V'(x_1) \neq 0##, then you have
$$V(y)=V(x_1) + (y-x_1) V'(x_1) + \mathcal{O}[(y-x_1)^2],$$
and thus around the singularity the integral behaves like
$$\Delta t_{\epsilon}=\int_{x_1-\epsilon}^{x_1} \mathrm{d} y \sqrt{\frac{m}{-2V'(x_1)(y-x_1)}}.$$
Since the particle approaches ##x_1## from below, ##E_0-V(y)>0## for ##y<x_1## requires ##V'(x_1)>0##, and you get
$$\Delta t_{\epsilon}=-\sqrt{\frac{2m(x_1-y)}{V'(x_1)}}\bigg|_{y=x_1-\epsilon}^{x_1}= \sqrt{\frac{2 m \epsilon}{V'(x_1)}}.$$
The total time is
$$t=\int_{x_0}^{x_1-\epsilon} \mathrm{d} y \sqrt{\frac{m}{2 [E_0-V(y)]}} + \Delta t_{\epsilon}.$$
Since the remaining integral over ##[x_0,x_1-\epsilon]## is an ordinary finite integral for every ##\epsilon>0## and ##\Delta t_{\epsilon} \rightarrow 0## for ##\epsilon \rightarrow 0##, the total time ##t## is finite.
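To see the convergence concretely, here is a small numerical sketch with a toy potential of my own choosing (not from the problem): ##m=1##, ##V(x)=x##, ##x_0=0##, ##x_1=1##, ##E_0=1##, so ##V'(x_1)=1##. The cut-off integral plus the estimate ##\sqrt{2m\epsilon/V'(x_1)}## stays at the exact time ##\sqrt{2}## as ##\epsilon \rightarrow 0##:

```python
import numpy as np
from scipy.integrate import quad

# Toy example (my choice, not from the thread): m = 1, V(x) = x,
# start at x_0 = 0 with E_0 = V(x_1) = 1, turning point x_1 = 1, V'(x_1) = 1.
m, E0, x0, x1, dV1 = 1.0, 1.0, 0.0, 1.0, 1.0
V = lambda x: x

def integrand(y):
    # dt/dy = sqrt(m / (2 [E_0 - V(y)]))
    return np.sqrt(m / (2.0 * (E0 - V(y))))

for eps in (1e-2, 1e-4, 1e-6):
    t_reg, _ = quad(integrand, x0, x1 - eps, limit=200)  # regular part of the time integral
    dt_eps = np.sqrt(2.0 * m * eps / dV1)                # estimate sqrt(2 m eps / V'(x_1)) from above
    print(f"eps={eps:.0e}  t_reg={t_reg:.6f}  dt_eps={dt_eps:.6f}  total={t_reg + dt_eps:.6f}")

# The total stays at the exact value sqrt(2) ≈ 1.414214 for every eps,
# and dt_eps -> 0, so the time to reach the turning point is finite.
```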
If ##V'(x_1)=0##, the above Taylor expansion starts at the earliest with the quadratic term, i.e.,
$$V(y)=V(x_1) + \frac{1}{2} (y-x_1)^2 V''(x_1) + \mathcal{O}[(y-x_1)^3],$$
and the same analysis now involves the integrand
$$\sqrt{\frac{m}{-V''(x_1)(y-x_1)^2}}=\sqrt{\frac{m}{-V''(x_1)}} \, \frac{1}{x_1-y}$$
(note that ##V''(x_1)<0## in this case, since ##E_0-V(y)>0## for ##y<x_1##). Its antiderivative is a logarithm, so ##\Delta t_{\epsilon}## diverges logarithmically, i.e., in this case the time ##t \rightarrow \infty##. If also ##V''(x_1)=0##, the divergence only gets worse. So for ##V'(x_1)=0## you always have ##t \rightarrow \infty##: the particle never reaches ##x_1## in finite time.
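The same numerical check for a toy potential with ##V'(x_1)=0## (again my own example: ##m=1##, ##V(x)=-x^2/2##, ##x_1=0##, ##E_0=0##, ##x_0=-1##) shows the cut-off time growing like ##\ln(1/\epsilon)## instead of converging:

```python
import numpy as np
from scipy.integrate import quad

# Toy example with V'(x_1) = 0 (again my choice): m = 1, V(x) = -x^2/2,
# turning point x_1 = 0 with E_0 = V(0) = 0 and V''(0) = -1, start at x_0 = -1.
m, E0, x0, x1 = 1.0, 0.0, -1.0, 0.0
V = lambda x: -0.5 * x**2

def integrand(y):
    # here sqrt(m / (2 [E_0 - V(y)])) = 1/|y|
    return np.sqrt(m / (2.0 * (E0 - V(y))))

for eps in (1e-2, 1e-4, 1e-6):
    t_eps, _ = quad(integrand, x0, x1 - eps, limit=200)  # time to reach x_1 - eps
    print(f"eps={eps:.0e}  t_eps={t_eps:.4f}  ln(1/eps)={np.log(1.0 / eps):.4f}")

# t_eps grows like ln(1/eps) without bound: the turning point is never reached in finite time.
```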