pyroknife
I'm reading some code that basically says: if either of two variables (x and y) is less than 0, then their least common multiple is 0.
So with x = -2 and y = +3, the code says their least common multiple is 0.
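Here's a minimal sketch of what the code does, in Python (my own paraphrase of the logic, not the exact source I'm reading):

```python
import math

def lcm(x: int, y: int) -> int:
    # This is the part that confuses me: if either input is
    # negative, the function just returns 0 instead of a real LCM.
    if x < 0 or y < 0:
        return 0
    # Otherwise it's the usual definition: lcm(x, y) = x*y / gcd(x, y).
    return x * y // math.gcd(x, y)

print(lcm(-2, 3))  # prints 0 -- why is this the answer?
print(lcm(4, 6))   # prints 12, as I'd expect
```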
I don't get it. I know what a least common multiple is, but this definition isn't intuitive to me.
Can someone explain?