I know why it would equal 0 if it were (0*0). But what about an actual number? Why does (100*0) equal 0? You're not multiplying it by anything, so shouldn't it still equal 100? If I have 100 cookies on the table and I don't multiply them by anything, why do I suddenly have zero cookies on the table? I'm just trying to gain a conceptual understanding of the zero-factor algebraic property.
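
To put the same confusion in symbols (my own framing of it), the contrast I don't understand is:

$$100 \times 1 = 100 \quad \text{(multiplying by 1 leaves my 100 cookies alone)}$$
$$100 \times 0 = 0 \quad \text{(multiplying by 0 somehow removes all of them)}$$

My intuition says "multiplying by nothing" should behave like the first line, not the second.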