butterfli said:
1) A^2 = A
2) AA = A
3) AA - A = A - A <- Here, I subtracted A from both sides
4) AA - A = 0 <- property of the Zero matrix (A+(-A) = 0)
5) A(A-I) = 0 <- I factored out A here, not sure if this is legal or not in matrices.
This derivation is perfectly good. (I fixed a minor typo in your step 5.) The part you were worried about is just the distributive property -- i.e. X(Y+Z) = XY+XZ and (Y+Z)X = YX+ZX -- and matrix arithmetic does have the distributive property.
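Spelled out, the factoring in step 5 is just distributivity plus the identity A = AI:

```latex
% Step 5 in full: rewrite the lone A as AI, then factor A out on the left.
\[
AA - A \;=\; AA - AI \;=\; A(A - I) \;=\; 0
\]
```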
A(I-A) = 0 implies that either I-A = 0 or A = 0;
This part is not fine. This is the cancellation property -- that xy=xz implies y=z, and yx=zx implies y=z -- and matrix arithmetic does not have the cancellation property: a product of matrices can be the zero matrix even when neither factor is.
Counterexamples are easy to find (try diagonal matrices; they're simple); one is worked out below. Understanding when a product is zero is one of the interesting problems of linear algebra.
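Here is one such counterexample, built from an idempotent diagonal matrix: with A = diag(1, 0), we get A(I-A) = 0 even though neither factor is the zero matrix.

```latex
% A = diag(1,0) is idempotent (A^2 = A), with A != 0 and I - A != 0,
% yet the product A(I - A) is still the zero matrix.
\[
\underbrace{\begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix}}_{A}
\underbrace{\begin{pmatrix} 0 & 0 \\ 0 & 1 \end{pmatrix}}_{I - A}
=
\begin{pmatrix} 0 & 0 \\ 0 & 0 \end{pmatrix}
\]
```

(This same matrix shows the claim "A=0 or I-A=0" is outright false for idempotent matrices, not just unjustified.)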
That said, your problem doesn't seem to be linear algebra -- your problem seems to be with proof, or more accurately, understanding logical statements. You seem to have gotten the meaning of "or" and "and" exactly backwards.
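As a refresher, here are the truth tables for "and" and "or" side by side (T = true, F = false). Note that "or" is inclusive: "X or Y" is true whenever at least one side is true.

```latex
% Truth tables for conjunction ("and") and inclusive disjunction ("or").
\[
\begin{array}{cc|c|c}
X & Y & X \text{ and } Y & X \text{ or } Y \\
\hline
T & T & T & T \\
T & F & F & T \\
F & T & F & T \\
F & F & F & F
\end{array}
\]
```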
Error 1:
In the proof you stated above, you claimed that "A=0 or I-A=0". For the sake of argument, I'm going to assume this claim is valid. Your next step used A=0 -- but that's not justified: your claim wasn't A=0, it was "A=0 or I-A=0", which is weaker.
If your claim was "A=0 and I-A=0", then it would be correct to continue on by using A=0, because that's (part of) what the claim said.
However, the claim is "A=0 or I-A=0". If you want to use either of those terms, you have to split your proof into two cases: in one case you use A=0 and prove the thing you're trying to prove, and in the other case you use I-A=0 and prove the thing you're trying to prove. If you cannot do both, then you have failed to prove the thing you're trying to prove. (A skeleton of this structure is sketched below.)
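A minimal skeleton of that structure, with the actual work left as \dots for you to fill in:

```latex
% Proof by cases from the hypothesis "A = 0 or I - A = 0".
% Both branches must reach the same goal; neither may be skipped.
\textbf{Case 1:} $A = 0$. Then \dots, so the goal holds. \\
\textbf{Case 2:} $I - A = 0$, i.e.\ $A = I$. Then \dots, so the goal holds.
```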
Error 2:
The problem you're trying to solve asks you to show that "det(A)=0 or det(A)=1".
Your method was to first attempt to prove det(A)=1, and then attempt to prove that det(A)=0. However, that's wrong -- that would prove "det(A)=0 and det(A)=1". (And here, it leads to a contradiction*, because if det(A)=0 and det(A)=1, then 0=det(A)=1, and thus 0=1.)
What you need to prove is that "det(A)=0 or det(A)=1". If you could prove det(A)=1, then you're done. ("X or Y" doesn't have to mean that X is actually possible -- if Y is always true, then "X or Y" is true, even if X is always false)
More commonly, this means somewhere in the problem, you have to split into two or more cases. In each of those cases, you prove det(A)=0 or you prove det(A)=1. If you fail to do that in any of the cases, then you have failed to prove "det(A)=0 or det(A)=1".
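For what it's worth, one standard way the case split arises here (this may or may not be the route your course intends) is to take determinants on both sides of A^2 = A. A determinant is a scalar, and for scalars the zero-product property does hold:

```latex
% det is multiplicative, so A^2 = A yields a scalar equation:
\[
\det(A)^2 \;=\; \det(A^2) \;=\; \det(A)
\quad\Longrightarrow\quad
\det(A)\bigl(\det(A) - 1\bigr) = 0 .
\]
% A product of scalars is zero only if some factor is zero,
% so det(A) = 0 or det(A) = 1 -- exactly the two cases to consider.
```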
*: A contradiction doesn't necessarily mean your proof is wrong -- only that it's probably wrong. If your proof were right, that would mean there's an error in the foundations of mathematics someplace, but if that were true, surely someone would have found it by now!