I am trying to coin a term for a function equipped on an image sensor. The term should express "the upper limit, specified in %, of the allowed fluctuation in image size." The percentage does not express the ratio of the enlarged image to the original image, so, for example, 150% does not mean that the sensor will only detect an image 150% the size of the original. Instead, it expresses the range of size fluctuation the sensor will accept when detecting the shape (image), so 150% means the sensor will detect the original image upscaled at any rate from 101% to 150% of the original size, such as 102%, 117%, or 142%.

Do any of the following describe the concept well? If not, what is the problem?

> - Maximum size fluctuation allowance percentage
> - Maximum size volatility allowance percentage

Or:

> - Size fluctuation allowance percentage upper limit
> - Size volatility allowance percentage upper limit

Or would "rate" be better than "percentage"? Also, all of the candidates seem redundant. Is there a term that combines some of these words?

Thank you in advance.
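In case it helps clarify the behavior I am describing, here is a minimal sketch in Python. The function and parameter names (`within_tolerance`, `max_size_tolerance_pct`) are hypothetical placeholders, not anything the sensor actually exposes:

```python
# Hypothetical illustration of the setting described above.
# max_size_tolerance_pct = 150 means the sensor accepts a detected
# image whose size is anywhere from 101% to 150% of the original.

def within_tolerance(detected_pct: float, max_size_tolerance_pct: float) -> bool:
    """Return True if a detected image's size (as a % of the original)
    falls inside the accepted upscaling range."""
    return 100.0 < detected_pct <= max_size_tolerance_pct

# With the limit set to 150%, scales of 102%, 117%, and 142% are all accepted,
# while 165% is rejected.
print(all(within_tolerance(p, 150) for p in (102, 117, 142)))
print(within_tolerance(165, 150))
```

So the term I am after names the `max_size_tolerance_pct` value: the upper bound of the accepted range, not a single target scale.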