# Term to express a range of fluctuation

I am trying to make a term for a function equipped on an image sensor.
The term is to express "the upper limit of fluctuation allowance in image size which is specified in %"

The percentage does not express the ratio of a single enlarged image to the original image, so, for example, 150% does not mean that the sensor will only detect an image exactly 150% the size of the original.

Instead, it expresses the range of size fluctuation the sensor will accept when detecting the shape (image). So 150% means the sensor will detect the original image upscaled at any rate from 101% to 150%, e.g. an image 102%, 117%, or 142% the size of the original.

Do any of the following describe the concept well?
If not, what is the problem?

- Maximum size fluctuation allowance percentage
- Maximum size volatility allowance percentage

Or:
- Size fluctuation allowance percentage upper limit
- Size volatility allowance percentage upper limit

Or maybe "percentage" should be "rate"?
Also, all of the candidates seem redundant. Is there a term that combines some of these words?

ScottSalley
I think you are looking for the term 'scale factor'. Most APIs I've seen would express '150%' as a scale factor of '1.5'.

berkeman
Mentor
> I am trying to make a term for a function equipped on an image sensor. The term is to express "the upper limit of fluctuation allowance in image size which is specified in %" [...] Also, all of the candidates seem redundant. Any term to combine some words?

I'm having a hard time understanding what you are asking about. Could you perhaps give a concrete example of what you are asking about? Do you have a camera system and image processing software in mind for this question? Can you post some images that illustrate what you are asking about? Thank you.

> I think you are looking for the term 'scale factor'. Most APIs I've seen would express '150%' as a scale factor of '1.5'.
Thank you ScottSalley,
Though when I look up "scale factor" on Wikipedia, it says it is a "coefficient". Does it also imply a "range"? The substance of the concept I am trying to express is the "limit of a range (allowance)".

> I'm having a hard time understanding what you are asking about. Could you perhaps give a concrete example of what you are asking about? Do you have a camera system and image processing software in mind for this question? Can you post some images that illustrate what you are asking about? Thank you.
Thank you berkeman, and sorry for the lack of context.

Yes, I do have a camera system and image processing software in mind. It is about an image sensor.

The software memorizes a sample image and can then inspect another image for its presence. To find the sample image when it appears as part of another image at a smaller or bigger size than when it was memorized, the software rescales the image internally.

The user can specify the range of rescaling as a percentage, because he or she may only want to find the image at no bigger than 130%. What's important here is that specifying 130% includes scaling anywhere between 100% and 130% (images at 101%, 122%, and 130% would all be detected). This "user-defined range of rescaling percentage" is what I am looking for a term for.
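Not an answer to the naming question, but to illustrate the concept concretely: a minimal Python sketch (all names are hypothetical, not from any real API) of how a single user-supplied maximum scale factor could be expanded into the set of scales a matcher would try:

```python
def candidate_scales(max_scale, step=0.01):
    """Return the scale factors the matcher should try, from 1.0
    (original size) up to max_scale inclusive, in steps of `step`.

    E.g. a "130%" setting is passed as max_scale=1.3 and yields
    1.00, 1.01, ..., 1.30 -- the whole allowed range, not just
    the single upper-limit scale.
    """
    # Use integer step counts to avoid floating-point drift
    # that could drop the final (upper-limit) scale.
    n_steps = int(round((max_scale - 1.0) / step))
    return [round(1.0 + i * step, 4) for i in range(n_steps + 1)]


scales = candidate_scales(1.3)  # 31 scales: 1.0 through 1.3
```

The point the code makes explicit: the parameter names an upper limit of a range, so "maximum scale factor" (or "scale factor range" with a stated upper bound) conveys that every intermediate scale is also accepted.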

berkeman
Mentor
> Yes, I do have a camera system and image processing software in mind. [...] This "user-defined range of percentage of rescaling" is what I am looking for a term for.
Ah, that helps a lot. Can there be any rotation of the object, or other distortion other than just size? Is the variation in size due to the object being farther away from or closer to the camera than the original memorized image of the object? Have you looked through "Image Recognition" software documentation to see if the term you are looking for is there?

> Ah, that helps a lot. Can there be any rotation of the object, or other distortion other than just size? Is the variation in size due to the object being farther away from or closer to the camera than the original memorized image of the object? Have you looked through "Image Recognition" software documentation to see if the term you are looking for is there?
Yes, the size fluctuation we are talking about here is mainly caused by the varying distance between the camera and the object, although it may also include fluctuation caused by other distortions. I haven't looked through image recognition documentation yet; I should. Do you have any good online resources in mind?

Maybe "scale factor range"? Still researching further, though.

I think "maximum/minimum scaling factor" will do.
