I have a question about a seemingly illogical and strange aspect of multiplication and unit conversion that I had never noticed until now. It concerns finding the area of a square or rectangle when the length and width are expressed as decimals or fractions.

Ordinarily, when you find the area of a square, the area is a bigger number than the side lengths being multiplied. That makes perfect sense: if you multiply two numbers that are both greater than one, you get a bigger number. But when the lengths are decimals less than one, they end up being bigger than the area (the product). That still makes sense to me, because you are really taking a fraction of a fraction.

What really puzzles me is what happens when you convert to different units of measurement. For example, take a square that is 0.5 x 0.5 inches, which has an area of 0.25 square inches. To convert that to square centimeters, you multiply 0.25 by 2.54 x 2.54 (since each linear inch is 2.54 centimeters), which gives 1.6129 square centimeters. Now if I want to recover the original side lengths in centimeters, I take the square root of the area, which gives 1.27 x 1.27 centimeters.

To be more concise, this is what I have:

0.5 x 0.5 = 0.25 square inches
1.27 x 1.27 = 1.6129 square centimeters

What puzzles me is that when using inches, the side lengths are bigger than the area, but when you switch to centimeters, the sides are smaller than the area, just as you would expect with numbers greater than one. So the usual behavior of the numbers reverses in my example, and yet the two descriptions are still equivalent. Somehow it works out, but I don't see how that is possible. What is going on here? Am I missing something?
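In case I've made an arithmetic mistake somewhere, here is a minimal Python sketch of the numbers above (the variable names are just my own; the only fact assumed is the standard conversion factor of 2.54 centimeters per inch):

```python
import math

CM_PER_IN = 2.54            # standard definition: 1 inch = 2.54 cm

side_in = 0.5               # side length in inches
area_in2 = side_in ** 2     # 0.25 square inches: here side > area

# Converting an area multiplies by the conversion factor squared.
area_cm2 = area_in2 * CM_PER_IN ** 2   # 0.25 * 6.4516 = 1.6129 sq cm

# Recovering the side length in centimeters via the square root.
side_cm = math.sqrt(area_cm2)          # 1.27 cm: now side < area

# Rounded output to hide floating-point noise: 0.25 1.6129 1.27
print(f"{area_in2} {area_cm2:.4f} {side_cm:.2f}")
```

So the numbers themselves check out; my confusion is only about why the side/area relationship flips between the two unit systems.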