#1 devinedj
An example:
Both conceptually and computationally, it is easier to see that
2.5 * 5.2 = 13
than it is to see that
13 / 5.2 = 2.5
The same holds in programming: a loop that divides takes more CPU time than an equivalent loop that multiplies.
Why is this?
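One way to see the cost difference for yourself is a micro-benchmark that compares a division loop against the common workaround of multiplying by a precomputed reciprocal. A minimal Python sketch (the function names and loop sizes are just illustrative, and the two loops can differ in the last bit due to floating-point rounding):

```python
import timeit

def div_loop(xs, d):
    # one division per element
    return [x / d for x in xs]

def mul_loop(xs, d):
    # divide once, then multiply every element by the reciprocal
    r = 1.0 / d
    return [x * r for x in xs]

xs = [float(i) for i in range(10_000)]
t_div = timeit.timeit(lambda: div_loop(xs, 5.2), number=200)
t_mul = timeit.timeit(lambda: mul_loop(xs, 5.2), number=200)
print(f"division loop:       {t_div:.4f} s")
print(f"multiplication loop: {t_mul:.4f} s")
```

The reciprocal trick is exactly why optimizing compilers rewrite division by a constant into multiplication when they can prove the result is acceptable.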