I was a math major, a class or two short of a physics minor, in college (if I'd graduated in four years instead of three, I could have finished it). All of the math courses I took in college were upper division courses, because I had completed three semesters of calculus, linear algebra, discrete math, and abstract algebra before I started college, so I understand the math perfectly well. I also read a dozen or more pre-prints in physics a week, and have for at least six or seven years. Certainly, it's been a while since I've solved a challenging differential equation, or calculated a tensor product, or written actual code to optimize traffic flow through a set of stoplights in a city. Law pays the bills; physics is a hobby that keeps my mind sharp. But I understand perfectly well what the hierarchy problem is (hell, I wrote an explanation of it that was incorporated into a post on the subject at a successor blog to the one you link, Quantum Diaries Survivor). Matt Strassler sums it up this way: "Why is it at a value that is non-zero and tiny, a value that seems, at least naively, so unnatural?"
https://profmattstrassler.com/articles-and-posts/particle-physics-basics/the-hierarchy-problem/
(I'll also note his somewhat nitpicky caveat and beg forgiveness for any sloppy wording: "
By the way, you will often read the hierarchy problem stated as a problem with the Higgs particle mass. This is incorrect. The problem is with how big the non-zero Higgs field is. (For experts — quantum mechanics corrects not the Higgs particle mass but the Higgs mass-squared parameter, changing the Higgs field potential energy and thus the field’s value, making it zero or immense. That’s a disaster because the W and Z masses are known. The Higgs mass is unknown, and therefore it could be very large — if the W and Z masses were very large too. So it is the W and Z masses — and the size of the non-zero Higgs field — that are the problem, both logically and scientifically.")
He notes: "Others have argued that there is nothing to explain, because of a selection effect: the universe is far larger and far more diverse than the part that we can see, and we live in an apparently unnatural part of the universe mainly because the rest of it is uninhabitable — much the way that although rocky planets are rare in the universe, we live on one because it’s the only place we could have evolved and survived."
I think he is far too timid in saying that. There is nothing to explain, not because of a selection effect, but because there is just one universe and that is the way that it is.
Naturalness is an academic disease, not a legitimate part of the scientific method. It rests on the idea that Platonic concepts of what the laws of nature could be are really things that are up for debate and are chosen by lot. But this simply isn't a sound way to think about these ideas; at best, naturalness is a concept with a poor track record in its only marginally legitimate role, as a hypothesis generator.
As is commonly understood, the issue isn't that quantum corrections can't produce the mass that we observe (obviously that isn't the case). It is, in essence, why the huge counterterms managed to cancel out to a value many, many orders of magnitude smaller than the individual terms. But it is simply a category error to think of the problem in terms of probabilities. There is just one outcome, and it happens 100% of the time. And, as long as each input is exactly right (and those inputs never change), you get the result that we see. It is fundamentally an analytical issue, not a probabilistic one. The hierarchy problem is a case where we have a formula (perhaps not the most elegant or illuminating of those possible) that gives us the output, and we are too thick to see why all of the inputs work out the way they do. If the 125 GeV mass were impossible to achieve given the terms that go into it, that would be another thing entirely. But you can no more say that a possible value of a physical constant is "improbable" than you can say that pi should be a rational number, because it is derived from dividing circumference by diameter, rather than the transcendental number it in fact is.
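To put rough numbers on the size of that cancellation (this is just the standard back-of-the-envelope estimate with a Planck-scale cutoff, not anything taken from the posts quoted here): the one-loop quantum correction to the Higgs mass-squared parameter is of order

\[ \delta m_H^2 \sim \frac{\Lambda^2}{16\pi^2} \sim \left(\frac{10^{19}\ \text{GeV}}{4\pi}\right)^2 \approx 10^{36}\ \text{GeV}^2, \]

while the observed value is \( m_H^2 \approx (125\ \text{GeV})^2 \approx 1.6\times 10^{4}\ \text{GeV}^2 \), so the bare parameter and the correction have to cancel to roughly one part in \(10^{32}\). That ratio is the entire content of the "problem": nothing about it is impossible, it is just a very small dimensionless number.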
Many aspects of quantum physics are inherently stochastic. Certainly the outputs it gives you when you ask the theory a question are of that character. But the physical constants are not, whether they can be derived from other quantities or are simply measured experimentally with no even hypothetical derivation. Every single charged pion in the universe has a rest mass of 139.571 MeV/c^2 (subject to some conditions related to renormalization, which are deterministic as well).
The more I've thought about the issue over the years, the more I've been convinced that thinking about it in terms of probabilities like "a snowball's chance in hell" is a misleading and inappropriate way to think about an issue involving a 100% probability event.
I'm certainly not alone in this; plenty of physicists have questioned whether problems like this are legitimate problems at all.
Sabine Hossenfelder has talked about it. Here's an excerpt from one of her most recent and thoughtful rants on the subject:
http://backreaction.blogspot.com/2016/08/the-lhc-nightmare-scenario-has-come-true.html
During my professional career, all I have seen is failure. A failure of particle physicists to uncover a more powerful mathematical framework to improve upon the theories we already have. Yes, failure is part of science – it’s frustrating, but not worrisome. What worries me much more is our failure to learn from failure. Rather than trying something new, we’ve been trying the same thing over and over again, expecting different results.
When I look at the data what I see is that our reliance on gauge-symmetry and the attempt at unification, the use of naturalness as guidance, and the trust in beauty and simplicity aren’t working. The cosmological constant isn’t natural. The Higgs mass isn’t natural. The standard model isn’t pretty, and the concordance model isn’t simple. Grand unification failed. It failed again. And yet we haven’t drawn any consequences from this: Particle physicists are still playing today by the same rules as in 1973.
For the last ten years you’ve been told that the LHC must see some new physics besides the Higgs because otherwise nature isn’t “natural” – a technical term invented to describe the degree of numerical coincidence of a theory. I’ve been laughed at when I explained that I don’t buy into naturalness because it’s a philosophical criterion, not a scientific one. But on that matter I got the last laugh: Nature, it turns out, doesn’t like to be told what’s presumably natural.
The idea of naturalness that has been preached for so long is plainly not compatible with the LHC data, regardless of what else will be found in the data yet to come. And now that naturalness is in the way of moving predictions for so-far undiscovered particles – yet again! – to higher energies, particle physicists, opportunistic as always, are suddenly more than willing to discard of naturalness to justify the next larger collider.
Woit has talked about it. For example here:
http://www.math.columbia.edu/~woit/wordpress/?cpage=1&p=8708
Jester, a.k.a. Adam Falkowski, has talked about it as well (not exactly on point, but acknowledging the concept's declining relevance without giving up on it):
http://resonaances.blogspot.com/2015/05/naturalness-last-bunker.html
Gross is credited with acknowledging the failure of the naturalness paradigm, but supports SUSY anyway.
http://www.math.columbia.edu/~woit/wordpress/?p=6737
Of course, Lubos Motl is a foursquare supporter of the ideas of naturalness and fine-tuning, and has articulated his view on this subject repeatedly.
If I racked my brain for a few hours, I could probably identify three or four more physicists who don't blog but who have looked back on the last forty years and come to the same conclusion in the last couple of years, as the "Nightmare Scenario" at the LHC has come to pass.