# Is $A^{0} = I$ defined for any matrix A?

1. Nov 19, 2012

### Bipolarity

It is defined that for any matrix A, $A^{0}$ is equal to the identity matrix, I.

My question: Does this definition also hold if A is the zero matrix?

In the real numbers, $0^{0}=1$ by definition, but I was wondering whether this definition extends to matrices.
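As a quick numerical aside (my own check, not part of the original question): NumPy's `matrix_power` happens to adopt the convention $A^{0} = I$ for every square matrix, including the zero matrix, and Python's scalar `**` likewise evaluates `0**0` to `1`. A library's behavior is of course just one convention, not a mathematical verdict:

```python
import numpy as np

# NumPy follows the convention A^0 = I for any square matrix,
# including the singular zero matrix.
Z = np.zeros((2, 2))
print(np.linalg.matrix_power(Z, 0))  # identity, even though Z is singular

# Python's scalar arithmetic likewise uses 0**0 == 1.
print(0 ** 0)    # 1
print(0.0 ** 0)  # 1.0
```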

Thanks.

BiP

2. Nov 19, 2012

### micromass

Staff Emeritus
Re: $0^{0}$

If the definition talks about "any matrix A", then yes, it also holds for the zero matrix.
Where did you find this definition?

Not every mathematician accepts this definition, by the way.

3. Nov 20, 2012

### HallsofIvy

Staff Emeritus
Re: $0^{0}$

I think you will find that $A^0= I$ is only defined for A NOT the 0 matrix.

If that is true "by definition", don't you have a problem with $\lim_{x\to 0} 0^x$?

4. Nov 20, 2012

### Dickfore

Re: $0^{0}$

Wrong. The matrix needs to be non-singular.

5. Nov 20, 2012

### Bipolarity

Re: $0^{0}$

I found the matrix definition in Anton's Linear Algebra, in section 1.4, where he defines inverse matrices and proves/defines their properties. In his book he mentions any square matrix A, so I assume he also includes the zero matrix, since he does not exclude it from the definition. Perhaps he did not consider this case, which is why I want to confirm it.

It seems there is some controversy over this. The statement $0^{0}=1$ is what Google's calculator returns, though Google is not a mathematician. I had this question a while ago, and it was clarified by http://www.askamathematician.com/20...ematicians-and-high-school-teachers-disagree/ whose arguments I found quite convincing.

I see no reason why $0^{0}=1$ interferes with $\lim_{x\to 0} 0^x$ since the latter need not be defined at x=0 for the limit to exist. I see no reason why mathematicians would want that function to be continuous at x=0 either...

According to the link above, the definition $0^{0} = 1$ allows for a more elegant enunciation of the binomial theorem, among a few other things.
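To make the binomial-theorem point concrete, here is a small check of my own (not from the link): expanding $(a+b)^n = \sum_{k=0}^{n} \binom{n}{k} a^{n-k} b^{k}$ with $a = 0$ works only because the $k = n$ term evaluates $0^0$ as $1$, which Python's `**` happens to do:

```python
from math import comb

def binomial_expansion(a, b, n):
    # Relies on 0**0 == 1 so the k = n term gives b**n when a == 0.
    return sum(comb(n, k) * a**(n - k) * b**k for k in range(n + 1))

# With a = 0, (0 + b)**n should equal b**n; every term except k = n
# vanishes, and that term is comb(n, n) * 0**0 * b**n = b**n.
print(binomial_expansion(0, 3, 4))  # 81
print((0 + 3) ** 4)                 # 81
```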

I am confused about the question on matrices. Micro, you seem to say that the zeroth power of any matrix (including the zero matrix) is the identity matrix, but Ivy says it holds only for non-zero matrices as bases.

And Dickfore claims it is only true for invertible matrices; I don't know where he's getting this information.

Who is right? Is there any international authority on this subject? For example, in chemistry these conventions would be verified by the IUPAC, I was curious if there is a mathematical authority that can officially define these contentious terms.

Thanks to everyone!

BiP

Last edited: Nov 20, 2012
6. Nov 21, 2012

### Dickfore

Re: $0^{0}$

The binomial theorem relies implicitly on the commutativity of multiplication of ordinary numbers. That is why there is a combinatorial coefficient in front of $a^{n - k} b^{k}$: you clump all products of the type, say:
$$a b a b a, \ a a b b a, a b a a b, \ldots$$
together.

But, matrix multiplication is non-commutative, so I don't see the benefit of defining $A^{0} = I$ to simplify the matrix binomial theorem, when the binomial theorem is not simple for non-commutative matrices in the first place.

I am starting from a general definition of a function of a (square) matrix. Namely, suppose we can diagonalize a (square) matrix:
$$A = U \, \Lambda \, U^{-1}$$
where $\Lambda$ is a diagonal matrix containing the eigenvalues of A on the main diagonal. Then, one defines:
$$f(A) = U \, f(\Lambda) \, U^{-1}$$
where $f(\Lambda)$ is simply a diagonal matrix again whose diagonal elements are the values $f(\lambda_\alpha)$ for each corresponding eigenvalue of A, and U is the same matrix as above.

Then, the problem of defining $A^0$ for a matrix goes down to the problem for defining it for a (complex) number $\lambda^0$. Taking powers of complex numbers is given by:
$$u^{v} = \exp \left(v \, \mathrm{Log}(u) \right)$$
If you take $v = 0$, it seems the argument of the exponential is always zero, and $\exp(0) = 1$. But, that is true provided that $\mathrm{Log}(u)$ exists! Otherwise the r.h.s. is not defined. And, Log is not defined for $u = 0$.

A matrix has a zero eigenvalue if and only if its determinant, being the product of its eigenvalues, is zero, i.e. if and only if the matrix is singular. This is why I required the determinant to be non-zero: then you have only non-zero eigenvalues, and their exponentiation (even by 0) is defined.
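The recipe above can be sketched numerically (a rough illustration with NumPy, assuming the matrix is actually diagonalizable and ignoring conditioning issues):

```python
import numpy as np

def matrix_function(A, f):
    """Apply f to a diagonalizable square matrix via A = U Lambda U^{-1},
    so f(A) = U f(Lambda) U^{-1}. Fails if A is not diagonalizable."""
    eigvals, U = np.linalg.eig(A)
    return U @ np.diag(f(eigvals)) @ np.linalg.inv(U)

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])  # eigenvalues 3 and 1, both nonzero

# With f(lambda) = lambda**0 = 1 for every (nonzero) eigenvalue,
# f(A) = U I U^{-1} = I, recovering A^0 = I for this invertible A.
print(np.round(matrix_function(A, lambda lam: lam ** 0), 10))
```

Sanity check on the recipe itself: with $f(\lambda)=\lambda^2$ the same construction reproduces $A^2$.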

Last edited: Nov 21, 2012
7. Nov 21, 2012

### D H

Staff Emeritus
Re: $0^{0}$

The non-commutativity of matrix multiplication is a bit of a red herring here. While matrix multiplication is not commutative in general, a positive power of some matrix $A$ does commute with the matrix $A$.

The reason for denoting $A^0=I$ for any square matrix (singular or not) is the same reason one denotes $x^0=1$. It's an extremely convenient abuse of notation.

Strictly speaking, expressing some function $f(x)$ as a power series $f(x) = \sum_{n=0}^{\infty} a_n x^n$ is incorrect because of the $0^0$ problem. To be pedantically correct, that power series should be written as $f(x) = a_0 + \sum_{n=1}^{\infty} a_n x^n$. Denoting $0^0=1$ as a convenient abuse of notation is what lets us get away with the more compact representation.

The same applies to matrices. The exponential of a square matrix $A$ is typically written as $\exp(A) = \sum_{n=0}^{\infty} \frac 1 {n!} A^n$. The pedantically correct version is $\exp(A) = I + \sum_{n=1}^{\infty} \frac 1 {n!} A^n$. The compact representation arises by denoting $A^0=I$ for all square matrices $A$, including singular ones. It's an abuse of notation, but a very convenient one.
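D H's two forms of the series can be compared directly (a small NumPy sketch of my own; the truncation at 20 terms is an arbitrary choice that is ample for a small matrix):

```python
import numpy as np
from math import factorial

def expm_series(A, terms=20):
    """exp(A) as I + sum_{n>=1} A^n / n!  -- the 'pedantically correct' form."""
    result = np.eye(A.shape[0])
    power = np.eye(A.shape[0])
    for n in range(1, terms):
        power = power @ A
        result = result + power / factorial(n)
    return result

def expm_series_compact(A, terms=20):
    """exp(A) as sum_{n>=0} A^n / n!, leaning on matrix_power(A, 0) = I."""
    return sum(np.linalg.matrix_power(A, n) / factorial(n)
               for n in range(terms))

A = np.array([[0.0, 1.0],
              [0.0, 0.0]])  # singular (nilpotent), yet A^0 = I still works

print(expm_series(A))          # [[1, 1], [0, 1]], since A^2 = 0
print(expm_series_compact(A))  # same result
```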

8. Nov 21, 2012

### Erland

Re: $0^{0}$

I too think we should accept $0^0=1$ (also for zero matrices). It is true that $0^x$ becomes discontinuous at $0$ then, but this is only a minor inconvenience, IMHO.

My main reason for this is that it is logical for natural numbers, since, for positive integers $m$ and $n$, $m^n$ is the number of functions from a set with $n$ elements to a set with $m$ elements. If we want this to hold also for $m=0$ and/or $n=0$, we must put $m^0=1$ for all natural numbers $m$, including $m=0$, and $0^n=0$ for all natural numbers $n$ except $n=0$. So, $0^0=1$.

What do I mean?

Recall that a function $f$ from set $A$ to a set $B$ (more precisely, the graph of such a function), is a subset of the cartesian product $A\times B$ such that to each $a\in A$, there is exactly one $b\in B$ such that $(a,b)\in f$. We then write $f(a)=b$ instead of $(a,b)\in f$.
It is easy to see that there are $m^n$ functions from $A$ to $B$, if $A$ has $n$ elements and $B$ has $m$ elements, and $m$ and $n$ are positive integers.

Now, if $n=0$, that is, if $A$ is the empty set, $\varnothing$, then $A\times B=\varnothing\times B=\varnothing$. This set has only one subset: $\varnothing$ itself. But the condition "For each $a\in A=\varnothing$ there is exactly one $b\in B$ such that $(a,b)\in f=\varnothing$" is vacuously satisfied for $f=\varnothing$, since there are no $a\in A=\varnothing$. This means that $f=\varnothing$ is (the graph of) a function from $A=\varnothing$ to $B$, and it is the only (graph of a) function of this kind, since $A\times B=\varnothing$ has no more subsets.
Thus, there is exactly one function from a set $A$ with $0$ elements to a set $B$ with $m$ elements. It is thus logical to set $m^0=1$ for all natural numbers $m$.
Notice that the argument above is valid also for $m=0$, that is, for $B=\varnothing$. Consequently, we set $0^0=1$.

Assume instead that $m=0$ and $n>0$. Then $A\neq\varnothing$ and $B=\varnothing$. Again, we have $A\times B=A\times\varnothing=\varnothing$, and this has only one subset, $f=\varnothing$ itself. But in this case, the condition "For each $a\in A$ there is exactly one $b\in B=\varnothing$ such that $(a,b)\in f=\varnothing$" is not satisfied, since there are elements $a\in A$ but no pairs $(a,b)\in f=\varnothing$. So $f=\varnothing$ is not a (graph of a) function from $A$ to $B=\varnothing$, and since $A\times B=\varnothing$ has no other subsets, there are no functions of this kind at all.
Thus, there are no functions from a nonempty set $A$ to $B=\varnothing$. It is therefore logical to set $0^n=0$ for all natural numbers $n>0$.
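Erland's counting argument can be checked by brute force (my own illustration): `itertools.product(range(m), repeat=n)` enumerates exactly the functions from an $n$-element set to an $m$-element set, each read as a tuple of $n$ values.

```python
from itertools import product

def count_functions(n, m):
    # Each function {0..n-1} -> {0..m-1} is a tuple of n values drawn from m.
    return sum(1 for _ in product(range(m), repeat=n))

print(count_functions(3, 2))  # 2**3 = 8
print(count_functions(0, 5))  # the empty function is the only one: 1
print(count_functions(0, 0))  # still just the empty function: 1, i.e. 0^0 = 1
print(count_functions(4, 0))  # nonempty set into the empty set: 0
```

Note that `product(anything, repeat=0)` yields exactly one item, the empty tuple, which is precisely the empty function of the argument above.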

Last edited: Nov 21, 2012
9. Nov 21, 2012

### micromass

Staff Emeritus
Re: $0^{0}$

Bipolarity, as you can see, there is no agreement about $0^0$ among mathematicians. All in all, it is just a definition; we could define $0^0=10$ if we wanted, and it wouldn't change much (but it's not like anybody defines it like that).

So, the definition of $0^0$ (as matrices or as numbers), depends crucially on the author. Anton seems to use the definition $0^0=1$. There is nothing wrong with that.
Other authors might leave it undefined. You have to check with each author and see what he says about it.

So there is no right and wrong answer here. It is merely a definition. So pick the definition that you like best.

10. Nov 21, 2012

### Bipolarity

Re: $0^{0}$

Thanks!

I've decided to define $0^{0} = \pi + e + 1$ so as to make the binomial theorem complicated for future generations!

Just kidding.

BiP

11. Nov 21, 2012

### arildno

Re: $0^{0}$

"It is true that 0x becomes discontionuous at 0 then, but this is only a minor inconvenience, IMHO."

Well, what do you think the limit of $x^{k(x)/\ln x}$ is as $x$ goes to $0$, given that the limit of $k(x)$ as $x$ goes to zero is some finite $k$?

Last edited: Nov 21, 2012
12. Nov 21, 2012

### Erland

Re: $0^{0}$

$e^k$. So...?

13. Nov 21, 2012

### arildno

Re: $0^{0}$

Meaning it is as silly and nonsensical to define $0^0$ to have some value as it is to define $0/0$ to have some value.

14. Nov 21, 2012

### micromass

Staff Emeritus
Re: $0^{0}$

Uuuh, how does that follow from the limit being $e^k$?

It makes a lot of sense to define $0^0=1$. A lot of mathematicians make that definition. Saying it is silly and nonsensical is really not justified.

15. Nov 21, 2012

### Erland

Re: $0^{0}$

We have seen some examples where the definition $0^0=1$ makes sense and is useful. That suffices for me. I have not seen any examples where it is particularly meaningful to define $0/0$ as a particular value.

16. Nov 21, 2012

### arildno

Re: $0^{0}$

You might as well define it to be 2.
You might even find some uses for that as well.

17. Nov 21, 2012

### micromass

Staff Emeritus
Re: $0^{0}$

Sure.

I doubt you'll find many uses. Defining $0^0=1$ is really useful and simplifies a lot of formulas. It is obviously much more useful than defining it to be 2.

18. Nov 21, 2012

### arildno

Re: $0^{0}$

And in those formulas, you evidently provide the EXPLICIT definition of $0^0$ being 1, thus making it a particular case valid for that formula, rather than any general result or definition.

19. Nov 21, 2012

### micromass

Staff Emeritus
Re: $0^{0}$

When somebody writes down the binomial theorem, I have never seen them state explicitly that we assume $0^0=1$.

But in general, you certainly do make a general definition.
If $\kappa$ and $\kappa^\prime$ are cardinal numbers, X is a set with cardinality $\kappa$, and Y has cardinality $\kappa^\prime$, then we say by definition that $X^Y$ has cardinality $\kappa^{\kappa^\prime}$.

If you take $\kappa=\kappa^\prime=0$ in that definition, then by definition the cardinal $0^0$ is the size of the set $\emptyset^\emptyset$. This size is one.
So at least for cardinal numbers, we do make the explicit and general definition $0^0=1$.