bballwaterboy
This is kind of a "dumb" question, but why do we multiply the numerators and denominators when multiplying fractions? For example:
1/5 x 2/3 = 2/15
Intuitively, I know why we need a common denominator when adding and subtracting fractions: we need to add apples to apples and oranges to oranges for it to logically make sense. But why do we suddenly not need a common denominator when multiplying fractions? Wouldn't the same analogy apply here? Don't we need to do an apples-to-apples kind of operation?
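For what it's worth, the arithmetic in the example can be sanity-checked with Python's `fractions` module (this only confirms the rule's result, not the reasoning behind it):

```python
from fractions import Fraction

# The example from the question: 1/5 x 2/3
a = Fraction(1, 5)
b = Fraction(2, 3)

# Fraction multiplication multiplies numerators and denominators,
# then reduces to lowest terms automatically.
product = a * b
print(product)  # -> 2/15
```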