
Term to express a range of fluctuation

  1. Feb 19, 2015 #1
    I am trying to make a term for a function equipped on an image sensor.
    The term is to express "the upper limit of fluctuation allowance in image size which is specified in %"

    The percentage does not express the ratio of an enlarged image to the original image, so, for example, 150% does not mean that the sensor will only detect an image 150% bigger than the original.

    Instead, it expresses the range of size fluctuation, as a percentage, that the sensor will accept when detecting the shape (image). So 150% means the sensor will detect the original image upscaled at any rate from 101% to 150%, e.g. an image 102%, 117%, or 142% bigger than the original.

    Do any of the following describe the concept well?
    If not, what is the problem?

    - Maximum size fluctuation allowance percentage
    - Maximum size volatility allowance percentage
    - Size fluctuation allowance percentage upper limit
    - Size volatility allowance percentage upper limit

    Or maybe "percentage" should be "rate"?
    Also, all of the candidates seem redundant. Is there a term that combines some of the words?

    Thank you in advance.
  3. Feb 19, 2015 #2
    I think you are looking for the term 'scale factor'. Most APIs I've seen would express '150%' as a scale factor of '1.5'.
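    For what it's worth, the conversion between the two conventions is just a division (a trivial sketch; the variable names are mine, not from any particular API):

    ```python
    # '150%' as most APIs would express it: a scale factor of 1.5
    percent = 150
    scale_factor = percent / 100.0
    print(scale_factor)  # 1.5
    ```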
  4. Feb 19, 2015 #3



    Staff: Mentor

    I'm having a hard time understanding what you are asking about. Could you perhaps give a concrete example of what you are asking about? Do you have a camera system and image processing software in mind for this question? Can you post some images that illustrate what you are asking about? Thank you.
  5. Feb 19, 2015 #4
    Thank you ScottSalley,
    Though, when I look up "scale factor" on Wikipedia, it says it is a "coefficient". Does it also imply a "range"? Because the substance of the concept I am trying to express is a "limit of a range (allowance)".
  6. Feb 19, 2015 #5
    Thank you berkeman, and sorry for the lack of context.

    Yes, I do have a camera system and image processing software in mind. It is about an image sensor.

    The software memorizes a sample image and can then inspect another image for its presence. To find the sample image when it appears as part of another image at a smaller or larger size than it was when memorized, the software rescales the image internally.

    The user can specify the range of rescaling as a percentage, because he or she may only want to find the image at no more than 130% of its size. What's important here is that when the user specifies 130%, it includes scaling anywhere between 100% and 130% (images at 101%, 122%, and 130% would all be detected). This "user-defined range of rescaling percentage" is what I am looking for a term for.
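    A minimal sketch of that idea, assuming the software enumerates candidate scale factors up to the user-specified limit (the function name and the 1% step size are my own assumptions, not the actual software's API):

    ```python
    def candidate_scale_factors(max_scale_percent, step_percent=1):
        """Enumerate scale factors from 100% up to the user-specified limit.

        E.g. max_scale_percent=130 yields 1.00, 1.01, ..., 1.30, so a
        sample image found at 101%, 122%, or 130% of its memorized size
        would all be tried during the search.
        """
        return [p / 100.0 for p in range(100, max_scale_percent + 1, step_percent)]

    factors = candidate_scale_factors(130)
    print(factors[0], factors[-1])  # 1.0 1.3
    ```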
    Last edited: Feb 19, 2015
  7. Feb 19, 2015 #6



    Staff: Mentor

    Ah, that helps a lot. Can there be any rotation of the object, or other distortion other than just size? Is the variation in size due to the object being farther away from or closer to the camera than the original memorized image of the object? Have you looked through "Image Recognition" software documentation to see if the term you are looking for is there?
  8. Feb 19, 2015 #7
    Yes, the size fluctuation we are talking about here is mainly caused by varying distance between the camera and the object, although it may also include fluctuation caused by other distortions. I haven't looked through image recognition documentation; I should. Do you have any good online resources in mind?
  9. Feb 19, 2015 #8
    Maybe "scale factor range"? Researching further, though.
  10. Feb 19, 2015 #9
    I think maximum/minimum scaling factor will do.
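    A sketch of how such a minimum/maximum pair might be used, assuming hypothetical parameter names (`min_scale_factor` and `max_scale_factor` are my own, not from any real API):

    ```python
    # Hypothetical parameters for the user-defined rescaling range.
    search_params = {
        "min_scale_factor": 1.0,  # do not search below the memorized size
        "max_scale_factor": 1.3,  # the user-specified 130% upper limit
    }

    def in_allowed_range(detected_scale, params):
        """Return True if a detected scale falls inside the allowed range."""
        return params["min_scale_factor"] <= detected_scale <= params["max_scale_factor"]

    print(in_allowed_range(1.22, search_params))  # True
    print(in_allowed_range(1.42, search_params))  # False
    ```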