If your initial motivation comes from quantum gravity considerations, it is true that the standard black hole argument tells you that the minimal length uncertainty is the Planck length. Indeed, one says that if the probe (a photon, for example) has a wavelength of the order of the Planck length, then it would create a black hole and you would not be able to perform any measurement. Of course, this relies on the assumption that the black hole laws (the Schwarzschild metric) hold down to the Planck scale, which cannot be true once you have reached the conclusion that space(time) must be discrete at the Planck scale... but let's forget that tiny loophole and simply consider what kind of discrete space(time) structure one can have at the Planck scale.
So your Planck circle "paradox" is very similar to the issue of Lorentz contraction. Usually, if one wants to keep a Lorentz invariant theory (such as special relativity), one naturally gets the Lorentz contraction of lengths for boosted observers. Then the Planck length would get squeezed to something smaller... and thus a problem! Obviously, if we want to keep a discrete structure for lengths, we should also quantize the rapidities (speeds) allowed in the theory. At an even more basic level, imagine two points, each at one Planck length lp from the origin (you, the observer), but such that the two directions form a 90deg angle: the distance between them would be lp*sqrt(2). So either we must assume that the two directions cannot form a 90deg angle, and therefore angles should also be quantized, or that what we call distance is not as simple as this.
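Just to make that squeeze concrete, here is a minimal numerical sketch (plain special relativity, nothing quantum; the value of lp is the CODATA figure in metres) showing that any non-zero boost pushes a Planck-length ruler below lp:

```python
import math

l_p = 1.616e-35  # Planck length in metres (CODATA value)

def contracted_length(length, beta):
    """Length measured by an observer boosted at speed beta = v/c."""
    gamma = 1.0 / math.sqrt(1.0 - beta**2)
    return length / gamma

# Any non-zero boost squeezes a Planck-length ruler below l_p:
for beta in (0.1, 0.5, 0.9, 0.99):
    print(beta, contracted_length(l_p, beta) < l_p)  # always True
```

So with continuous rapidities there is no smallest boosted length, which is exactly why something has to give.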
A first idea would be to go to a square lattice. Then we directly realize that the length is not an integer multiple of the Planck length, but that the length squared is an integer multiple of the Planck length squared, by Pythagoras' theorem. By the way, this fits with the LQG expectation that it is the area that is quantized. Of course, the circle remains a problem. But then again, what we call a "circle" is not really circular anymore: it is a polygon, and the 2*Pi law is only approximate. More precisely, if you define the circle as the set of lattice points at a given distance from the origin, and you define the circumference as the sum of the distances between nearest neighbours, then you will see that 2*Pi is the (never-reached) supremum of the ratio circumference/radius.
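You can check this numerically. A sketch (my own, taking the Planck length as the lattice unit): define the "circle" of radius r as the lattice points with x^2 + y^2 = r^2, walk them in angular order, and sum the chord lengths. The radii 5, 65, 1105, 32045 are chosen because they are products of primes = 1 mod 4, so many lattice points land exactly on the circle:

```python
import math

def lattice_circle_ratio(r):
    # All lattice points with x^2 + y^2 == r^2
    pts = []
    for x in range(-r, r + 1):
        y2 = r * r - x * x
        y = math.isqrt(y2)
        if y * y == y2:
            pts.append((x, y))
            if y != 0:
                pts.append((x, -y))
    # Walk the points in angular order and sum the chords of the polygon
    pts.sort(key=lambda p: math.atan2(p[1], p[0]))
    perimeter = sum(math.dist(pts[i], pts[(i + 1) % len(pts)])
                    for i in range(len(pts)))
    return perimeter / r

for r in (5, 65, 1105, 32045):
    print(r, lattice_circle_ratio(r))  # creeps up towards 2*pi but stays below
```

The ratio climbs towards 2*Pi as the polygon gains vertices, but never reaches it: an inscribed polygon is always shorter than the circle.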
Okay, but it does not seem very satisfying to describe the beautiful space(time) as a "stupid" square lattice... So one way to do better is to go quantum. Indeed, if we look at the spin of a particle, we say it is quantized: what we mean is that its measured value is discrete, but its expectation value can be an arbitrary real number. In the same way, we can promote the length to a quantum observable and make it an operator. It can have a discrete spectrum while its expectation value remains arbitrary. For example, we can now boost a Planck-length ruler: an ensemble of boosted observers would measure on average the usual length contraction, but in practice each of them would measure the Planck length again, or sometimes a bigger value, or even 0 (the ruler becomes a point...). This is the route followed by non-commutative geometry.
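Here is a toy model of that idea (my own illustration, not an actual non-commutative geometry construction): a length observable whose spectrum is just {0, lp}, with the state tuned so that the expectation value reproduces the classical contracted length lp/gamma. Individual measurements only ever return 0 or lp, yet the average is the continuous Lorentz-contracted value:

```python
import math
import random

l_p = 1.0      # work in Planck units
beta = 0.8     # boost speed v/c
gamma = 1.0 / math.sqrt(1.0 - beta**2)

# Toy "length operator" with discrete spectrum {0, l_p}: choose the
# probability of the outcome l_p so that the *expectation value*
# equals the classical contracted length l_p/gamma.
p_lp = (l_p / gamma) / l_p

random.seed(0)
outcomes = [l_p if random.random() < p_lp else 0.0 for _ in range(100_000)]

print(sum(outcomes) / len(outcomes))  # close to l_p/gamma = 0.6
print(set(outcomes))                  # individual results: only 0.0 or 1.0
```

Each observer sees a discrete outcome (the ruler is a Planck length, or a point), but the ensemble average recovers the smooth contraction law.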
But, at the end of the day, I think the best answer is, as someone said earlier, that when you do a length measurement, you always cut your curve into small straight pieces and end up measuring a polygon. So 2*Pi is only an idealization anyway.
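That idealization is easy to make explicit: a regular n-gon inscribed in a circle of radius r has perimeter 2*n*r*sin(pi/n), so the measured circumference/radius ratio is 2*n*sin(pi/n), which only tends to 2*Pi as the number of straight pieces goes to infinity (n = 96 is Archimedes' famous polygon):

```python
import math

def polygon_ratio(n):
    # circumference/radius for a regular n-gon inscribed in the circle
    return 2 * n * math.sin(math.pi / n)

for n in (6, 96, 10_000):
    print(n, polygon_ratio(n))  # approaches 2*pi from below as n grows
```

Every finite measurement sits strictly below 2*Pi; the circle's circumference is the limit, not something you ever measure directly.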