A strange inconsistency when calculating area with decimals

  • #1
I have a question about a seemingly illogical aspect of multiplication and unit conversion that I had never noticed until now. It concerns finding the area of a square or rectangle when the length and width are expressed as decimals or fractions.

Ordinarily, when you find the area of a square, the area is a bigger number than the side lengths being multiplied. That makes perfect sense: if you multiply two numbers that are both greater than one, you get a bigger number. But when the lengths are decimals less than one, they end up being bigger than the area (the product). That still makes sense to me, because you are really taking a fraction of a fraction.

What really puzzles me is what happens when you convert to different units of measurement. Take, for example, a square that is 0.5 x 0.5 inches, which gives an area of 0.25 square inches. To convert to centimeters, you multiply 0.25 by 2.54 x 2.54, which gives 1.6129 square centimeters. Now, to recover the original side lengths in centimeters, I take the square root of the area, which gives 1.27 x 1.27 centimeters. To be concise, this is what I have:

0.5 x 0.5 = 0.25 square inches

1.27 x 1.27 = 1.6129 square centimeters

What puzzles me is that when using inches, the side lengths are bigger (as numbers) than the area, but when you switch to centimeters, the sides are smaller than the area, just as you would expect for numbers greater than one. So the usual behavior of the numbers reverses in my example, and yet the two results describe the same square. Somehow it works out, but I don't see how that is possible. What is going on here? Am I missing something?
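
A quick numerical check confirms the figures in the question (a minimal Python sketch; the conversion factor 2.54 cm per inch is exact by definition):

import math

CM_PER_INCH = 2.54                    # exact by definition

side_in = 0.5
area_in2 = side_in * side_in          # 0.25 square inches

area_cm2 = area_in2 * CM_PER_INCH**2  # 1.6129 square centimeters
side_cm = math.sqrt(area_cm2)         # 1.27 centimeters

print(side_in > area_in2)             # True:  in inches, side number > area number
print(side_cm > area_cm2)             # False: in centimeters, side number < area number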
 

Answers and Replies

  • #2
Orodruin
You cannot simply compare the numbers as this is a meaningless exercise. In both cases, the numbers come with units and the units are important. An area does not have the same physical dimension as a length and it is therefore completely arbitrary to try to compare them.
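
To make the arbitrariness concrete, here is an illustrative sketch (Python; the particular unit list is just an example) expressing the same 0.5-inch square in several units. Whether the side's number exceeds the area's number depends only on whether the side is longer or shorter than one unit:

# The same physical square, measured in different length units.
units_per_inch = {"inches": 1.0, "centimeters": 2.54, "millimeters": 25.4, "feet": 1.0 / 12.0}

for unit, factor in units_per_inch.items():
    side = 0.5 * factor    # side length expressed in this unit
    area = side * side     # area expressed in this unit squared
    print(f"{unit:11s} side={side:8.4f}  area={area:8.4f}  side > area? {side > area}")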
 
  • #3
Ordinarily, when you find the area of a square, the area is a bigger number than the side lengths being multiplied.
Only if the lengths are longer than the length unit. A completely arbitrary comparison.
when using inches, the side lengths are bigger (as numbers) than the area.
Comparing the numbers has no proper meaning. It's like saying "3 apples is more than 2 kilometers because 3 is larger than 2".
 
  • #4
To be concise, this is what I have:

0.5 x 0.5 = 0.25 square inches

1.27 x 1.27 = 1.6129 square centimeters

What puzzles me is that when using inches, the side lengths are bigger (as numbers) than the area, but when you switch to centimeters, the sides are smaller than the area, just as you would expect for numbers greater than one. So the usual behavior of the numbers reverses in my example, and yet the two results describe the same square. Somehow it works out, but I don't see how that is possible. What is going on here? Am I missing something?
This has nothing to do with the units, and is purely a result of the numbers involved. For the first area, the side lengths are less than 1 (inch), and squaring a number less than 1 gives a result smaller than the number being squared. For the second area, the side length is greater than 1 (centimeter), and squaring a number greater than 1 gives a result larger than the number being squared. That's all that is going on.
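
A one-line derivation pins down where the flip happens. For a side of length $x$ (in whatever unit), multiplying the inequality $x < 1$ or $x > 1$ through by the positive number $x$ gives

$$0 < x < 1 \implies x^2 < x, \qquad x > 1 \implies x^2 > x.$$

The pivot is $x = 1$, i.e. a side equal to exactly one unit of length. Converting units moves the number $x$ across that pivot, which is why the comparison can reverse while the square itself is unchanged.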
 
