$\mu :=\frac{a+b+c}{3}$
the inequality is equivalent to proving
$\frac{1}{2\mu^2} = \frac{1}{\mu(\mu + \mu)} \leq \frac{1}{3}\Big(\frac{1}{b(a+b)} + \frac{1}{c(b+c)} + \frac{1}{a(c+a)}\Big)$

Now consider the function $f: \mathbb R^3 \to \mathbb R$ given by
$f\big(\mathbf x\big) = \frac{1}{x_2(x_1 + x_2)}$
we can examine its Hessian
$\mathbf H =\left[\begin{matrix}\frac{2}{x_{2} \left(x_{1} + x_{2}\right)^{3}} & \frac{2}{x_{2} \left(x_{1} + x_{2}\right)^{3}} + \frac{1}{x_{2}^{2} \left(x_{1} + x_{2}\right)^{2}} & 0\\\frac{2}{x_{2} \left(x_{1} + x_{2}\right)^{3}} + \frac{1}{x_{2}^{2} \left(x_{1} + x_{2}\right)^{2}} & \frac{2}{x_{2} \left(x_{1} + x_{2}\right)^{3}} + \frac{2}{x_{2}^{2} \left(x_{1} + x_{2}\right)^{2}} + \frac{2}{x_{2}^{3} \left(x_{1} + x_{2}\right)} & 0\\0 & 0 & 0\end{matrix}\right]$
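As a sanity check (not part of the argument), the Hessian can be reproduced symbolically; a minimal sketch assuming sympy, with variable names of my own choosing:

```python
# Minimal sketch verifying the Hessian of f(x) = 1/(x2*(x1 + x2)) with sympy.
import sympy as sp

x1, x2, x3 = sp.symbols('x1 x2 x3', positive=True)
f = 1 / (x2 * (x1 + x2))

H = sp.hessian(f, (x1, x2, x3))

# (1,1) entry: 2 / (x2*(x1 + x2)^3)
assert sp.simplify(H[0, 0] - 2 / (x2 * (x1 + x2)**3)) == 0
# (1,2) entry: 2/(x2*(x1 + x2)^3) + 1/(x2^2*(x1 + x2)^2)
assert sp.simplify(H[0, 1] - (2 / (x2 * (x1 + x2)**3)
                              + 1 / (x2**2 * (x1 + x2)**2))) == 0
# f does not depend on x3, so the third row and column vanish
assert H[0, 2] == 0 and H[1, 2] == 0 and H[2, 2] == 0
```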
and apply Sylvester's criterion to confirm that $f$ is convex so long as each $x_i \gt 0$, i.e.
$\det\big(\mathbf H_{1:1}\big) = \frac{2}{x_{2} (x_{1} + x_{2})^{3}} \gt 0$ since each component is positive
$\det\big(\mathbf H_{2:2}\big) = \frac{3}{x_2^4(x_1 + x_2)^4} \gt 0$, again since each component is positive
(the Hessian may be verified at https://www.wolframalpha.com/input/?i=hessian+of+1/(x_2*(x_1+x_2))+ )
$\det\big(\mathbf H_{3:3}\big) = \det\big(\mathbf H\big) = 0$
because there is a column of all zeros
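The three leading principal minors can likewise be spot-checked symbolically; a sketch assuming sympy, not part of the proof:

```python
# Sketch checking the leading principal minors of the Hessian with sympy.
import sympy as sp

x1, x2, x3 = sp.symbols('x1 x2 x3', positive=True)
f = 1 / (x2 * (x1 + x2))
H = sp.hessian(f, (x1, x2, x3))

minor1 = H[0, 0]          # 1x1 leading minor
minor2 = H[:2, :2].det()  # 2x2 leading minor
minor3 = H.det()          # full determinant

assert sp.simplify(minor1 - 2 / (x2 * (x1 + x2)**3)) == 0
assert sp.simplify(minor2 - 3 / (x2**4 * (x1 + x2)**4)) == 0
assert minor3 == 0  # the zero third column forces det(H) = 0
```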
now, selecting:
$\mathbf x := \left[\begin{matrix} a \\ b \\ c\end{matrix}\right] $
and using the cyclic permutation matrix
$\mathbf P = \left[\begin{matrix} 0 & 0 & 1 \\ 1 & 0 & 0 \\ 0 & 1 & 0 \end{matrix}\right] $
noting that
$\mathbf P^0 + \mathbf P^1 + \mathbf P^2 = \left[\begin{matrix}1 & 1 & 1\\1 & 1 & 1\\1 & 1 & 1\end{matrix}\right]$
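That identity is easy to confirm numerically; a sketch assuming numpy:

```python
# Sketch: powers of the cyclic permutation matrix P sum to the all-ones matrix.
import numpy as np

P = np.array([[0, 0, 1],
              [1, 0, 0],
              [0, 1, 0]])

S = np.linalg.matrix_power(P, 0) + P + np.linalg.matrix_power(P, 2)
assert (S == np.ones((3, 3))).all()

# Each power cyclically rotates the entries of x = (a, b, c):
x = np.array([2.0, 3.0, 5.0])        # stand-in positive values for (a, b, c)
assert (P @ x == np.array([5.0, 2.0, 3.0])).all()   # (c, a, b)
```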
we get
$\frac{1}{\mu(\mu + \mu)}= f\Big(\frac{1}{3}\big(\mathbf P^0 + \mathbf P^1+\mathbf P^2\big)\mathbf x\Big) = f\Big(\frac{1}{3}\big(\mathbf P^0\mathbf x + \mathbf P^1\mathbf x +\mathbf P^2\mathbf x\big)\Big)$
$ \leq \frac{1}{3}\Big(f\big(\mathbf P^0\mathbf x\big) + f\big(\mathbf P^1\mathbf x\big)+ f\big(\mathbf P^2\mathbf x\big) \Big) = \frac{1}{3}\Big(\frac{1}{b(a+b)}+ \frac{1}{a(c+a)} + \frac{1}{c(b+c)}\Big) = \frac{1}{3}\Big(\frac{1}{b(a+b)} + \frac{1}{c(b+c)} + \frac{1}{a(c+a)}\Big)$
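As a numerical spot check of the whole chain (one arbitrary positive triple, assuming numpy; illustrative, not a proof):

```python
# Sketch: verify 1/(2*mu^2) <= (1/3) * sum of f over cyclic permutations of x.
import numpy as np

def f(v):
    """f(x) = 1 / (x2 * (x1 + x2)) applied to a 3-vector."""
    x1, x2, _ = v
    return 1.0 / (x2 * (x1 + x2))

a, b, c = 1.3, 2.7, 0.9              # arbitrary positive sample values
x = np.array([a, b, c])
P = np.array([[0, 0, 1], [1, 0, 0], [0, 1, 0]])

mu = (a + b + c) / 3
lhs = 1 / (2 * mu**2)
rhs = (f(x) + f(P @ x) + f(P @ P @ x)) / 3

assert np.isclose(lhs, f(np.full(3, mu)))  # f at the averaged argument
assert lhs <= rhs                          # Jensen's inequality holds here
```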
by Jensen's Inequality.

Technical items:
i.) the components of $\mathbf H$ are rational functions of $\mathbf x$, so I take for granted that they vary continuously wherever they are defined
ii.) Sylvester's criterion might seem 'wrong' here since it technically certifies positive definiteness only when all leading principal minors are positive, and our final one is zero. But since the first two leading minors are positive, the criterion applies to the $2\times 2$ leading principal submatrix and implies that it is positive definite; this in turn yields the following Cholesky factorization, whose blocked structure shows positive semi-definiteness of $\mathbf H$
$\mathbf H =\left[\begin{matrix}\frac{2}{x_{2} \left(x_{1} + x_{2}\right)^{3}} & \frac{2}{x_{2} \left(x_{1} + x_{2}\right)^{3}} + \frac{1}{x_{2}^{2} \left(x_{1} + x_{2}\right)^{2}} & 0\\\frac{2}{x_{2} \left(x_{1} + x_{2}\right)^{3}} + \frac{1}{x_{2}^{2} \left(x_{1} + x_{2}\right)^{2}} & \frac{2}{x_{2} \left(x_{1} + x_{2}\right)^{3}} + \frac{2}{x_{2}^{2} \left(x_{1} + x_{2}\right)^{2}} + \frac{2}{x_{2}^{3} \left(x_{1} + x_{2}\right)} & 0\\0 & 0 & 0\end{matrix}\right]= \left[\begin{matrix} \mathbf {LL}^T & 0 \\ 0 & 0 \end{matrix}\right] = \left[\begin{matrix} \mathbf {L} & 0 \\ 0 & 0 \end{matrix}\right]\left[\begin{matrix} \mathbf {L} & 0 \\ 0 & 0 \end{matrix}\right]^T \succeq 0$
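That factorization can be spot-checked at a sample positive point; the sketch below assumes numpy and recomputes the Cholesky factor of the $2\times 2$ block:

```python
# Sketch: at a sample positive point, the 2x2 leading block of H is PD
# (Cholesky succeeds) and the full H is PSD (all eigenvalues >= 0).
import numpy as np

def hessian(x1, x2):
    h11 = 2 / (x2 * (x1 + x2)**3)
    h12 = 2 / (x2 * (x1 + x2)**3) + 1 / (x2**2 * (x1 + x2)**2)
    h22 = (2 / (x2 * (x1 + x2)**3)
           + 2 / (x2**2 * (x1 + x2)**2)
           + 2 / (x2**3 * (x1 + x2)))
    return np.array([[h11, h12, 0.0],
                     [h12, h22, 0.0],
                     [0.0, 0.0, 0.0]])

H = hessian(1.3, 2.7)                    # arbitrary positive sample point
L = np.linalg.cholesky(H[:2, :2])        # raises LinAlgError unless PD
assert (np.linalg.eigvalsh(H) >= -1e-12).all()   # PSD up to rounding
```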