Function over matrices, continuous and differentiable?

brunob
Hi there!

How can I prove that a function which takes an ##n\times n## matrix and returns that matrix cubed is continuous? Also, how can I analyze whether the function is differentiable or not?

For the continuity, I took a generic matrix A and considered the matrix A + h, where h is a real number tending to zero. Then I wrote out the product of two matrices A and B, where the result is a matrix with a sum in each entry. The product of the matrices A + h and B + h then comes out as A.B plus some terms tending to zero. However, I'm not sure that's enough to prove continuity.

Any help with this and the differentiation?

Thanks!
 
What do you mean by ##A + h##? Are you adding the real number ##h## to every element of ##A##? If so, I don't think that is sufficient for proving continuity.

Start by specifying what matrix norm you want to use. In principle, it doesn't matter which one you use, because they all induce the same topology, hence the same continuous functions. However, some norms are easier to work with than others. For example, if you use an induced norm, then it is submultiplicative (##\|AB\| \leq \|A\|\|B\|##), which is not necessarily true for other kinds of norms. So let's assume we use an induced norm, say the induced 2-norm to be specific.

The goal is to prove that if ##\|A - B\|## is small, then ##\|A^3 - B^3\|## is small. Rather than proving this directly, consider proving something more general: if ##f## and ##g## are continuous functions, then ##fg## (the pointwise product of ##f## and ##g##) is also continuous. This, along with the fact that the identity map is clearly continuous, will give you what you want: take ##f## to be the identity map, then ##f^2## is continuous, and therefore ##f^3 = ff^2## is also continuous.

So let's try to prove that ##fg## is continuous whenever ##f## and ##g## are continuous. Fix ##A##. To show that ##fg## is continuous at ##A##, we need to show that if ##\|A - B\|## is small, then ##\|f(A)g(A) - f(B)g(B)\|## is also small. So let's work with that expression:

$$\begin{align}
\|f(A)g(A) - f(B)g(B)\| &= \|f(A)g(A) - f(A)g(B) + f(A)g(B) - f(B)g(B)\| \\
& \leq \|f(A)g(A) - f(A)g(B)\| + \|f(A)g(B) - f(B)g(B)\| \\
&= \|f(A)(g(A) - g(B))\| + \|(f(A) - f(B))g(B)\| \\
&\leq \|f(A)\|\|g(A) -g(B)\| + \|f(A) - f(B)\|\|g(B)\| \\
\end{align}$$
I'll let you finish the proof. Certainly ##\|f(A)\|## causes no problem since it is constant. You just need to argue that ##\|g(A) - g(B)\|## and ##\|f(A) - f(B)\|## decrease to zero and ##\|g(B)\|## remains bounded as ##\|A - B\|## decreases to zero.
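
By the way, if you want to see this behavior numerically before proving it, here is a quick NumPy sketch (a sanity check on random data, not a proof; `np.linalg.norm(..., ord=2)` computes the induced 2-norm):
Code:
import numpy as np

# Sanity check (not a proof): as ||A - B|| shrinks, ||A^3 - B^3||
# shrinks too, measured in the induced 2-norm.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
for eps in (1e-1, 1e-3, 1e-6):
    B = A + eps * rng.standard_normal((4, 4))
    dist = np.linalg.norm(A - B, ord=2)
    image_dist = np.linalg.norm(
        np.linalg.matrix_power(A, 3) - np.linalg.matrix_power(B, 3), ord=2)
    print(f"||A - B|| = {dist:.2e},  ||A^3 - B^3|| = {image_dist:.2e}")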
 
Thanks for your answer! It seems to be easier than I thought.
I don't understand what you did in the last step here:

jbunniii said:
$$\begin{align}
\|f(A)g(A) - f(B)g(B)\| &= \|f(A)g(A) - f(A)g(B) + f(A)g(B) - f(B)g(B)\| \\
& \leq \|f(A)g(A) - f(A)g(B)\| + \|f(A)g(B) - f(B)g(B)\| \\
&= \|f(A)(g(A) - g(B))\| + \|(f(A) - f(B))g(B)\| \\
&\leq \|f(A)\|\|g(A) -g(B)\| + \|f(A) - f(B)\|\|g(B)\| \\
\end{align}$$
I don't see how you get that inequality.

Also, I don't understand what you mean by this:
jbunniii said:
and ##\|g(B)\|## remains bounded as ##\|A - B\|## decreases to zero.
One more question: in the proof, are you assuming that ##f## is the identity map and ##g## is a generic continuous function?

Thanks again!
 
brunob said:
Thanks for your answer! It seems to be easier than I thought.
I don't understand what you did in the last step here:
Do you mean this step?
$$\|f(A)(g(A) - g(B))\| + \|(f(A) - f(B))g(B)\| \leq \|f(A)\|\|g(A) -g(B)\| + \|f(A) - f(B)\|\|g(B)\|$$
I used the submultiplicative property of the induced norm: if ##M## and ##N## are any two matrices, then ##\|MN\| \leq \|M\|\|N\|##. I applied this property to both terms on the left hand side.

To see why the submultiplicative property is true for induced norms, recall the definition:
$$\|M\| = \sup_{x} \frac{\|Mx\|}{\|x\|}$$
where the supremum is taken over all nonzero vectors ##x## in the domain. Then, for any ##x## with ##Nx \neq 0## (if ##Nx = 0##, then ##MNx = 0## and the bound below holds trivially),
$$\frac{\|MNx\|}{\|Nx\|} = \frac{\|M(Nx)\|}{\|Nx\|} \leq \|M\|$$
and so, multiplying by ##\|Nx\|## we obtain
$$\|MNx\| \leq \|M\|\|Nx\|$$
Dividing by ##\|x\|## gives us
$$\frac{\|MNx\|}{\|x\|} \leq \|M\| \frac{\|Nx\|}{\|x\|} \leq \|M\| \|N\|$$
Taking the supremum of both sides, we conclude that ##\|MN\| \leq \|M\| \|N\|##.
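
As a quick empirical check of that property (again NumPy with the induced 2-norm; this only tests random samples, it is not a proof):
Code:
import numpy as np

# Test ||MN|| <= ||M|| ||N|| (induced 2-norm) on random matrices.
rng = np.random.default_rng(1)
for _ in range(1000):
    M = rng.standard_normal((5, 5))
    N = rng.standard_normal((5, 5))
    lhs = np.linalg.norm(M @ N, ord=2)
    rhs = np.linalg.norm(M, ord=2) * np.linalg.norm(N, ord=2)
    assert lhs <= rhs + 1e-9  # tolerance for floating-point error
print("||MN|| <= ||M|| ||N|| held on all samples")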

brunob said:
Also, I don't understand what you mean by this:

jbunniii said:
and ##\|g(B)\|## remains bounded as ##\|A - B\|## decreases to zero.
Well, you want to show that ##\|f(A)\|\|g(A) -g(B)\| + \|f(A) - f(B)\|\|g(B)\|## decreases to zero as ##\|A - B\|## decreases to zero.

Since ##\|f(A)\|## is fixed, ##\|f(A)\|\|g(A) -g(B)\|## will decrease to zero as long as ##\|g(A) -g(B)\|## does.

But ##\|g(B)\|## is not fixed, so ##\|f(A) - f(B)\|\|g(B)\|## might not decrease to zero even if ##\|f(A) - f(B)\|## does, because ##\|g(B)\|## might grow to infinity at the same time. You need to show that this doesn't happen.

One more question: in the proof, are you assuming that ##f## is the identity map and ##g## is a generic continuous function?
No, in the proof they are both generic continuous functions. (Continuous at ##A## to be exact.) Once you have proved the generic result, then simply set ##f = g = \text{identity}## to conclude that ##\text{identity}^2## is continuous. Then set ##f = \text{identity}## and ##g = \text{identity}^2## to conclude that ##\text{identity}^3## (i.e. the map ##A \mapsto A^3##) is continuous.
 
I made this proof that \|g(B)\| does not increase to infinity:

\|g(A) -g(B)\| decreases to zero and it's positive.
\|g(A) -g(B)\| \leq \|g(A)\| - \|g(B)\|
\|g(A)\| is constant because A was fixed, \|g(A)\| - \|g(B)\| is bounded by 0 and \|g(B)\| is positive so it can't increase to infinity; in fact \|g(B)\| is bounded by \|g(A)\|.

Is it ok ?
 
brunob said:
I made this proof that \|g(B)\| does not increase to infinity:

\|g(A) -g(B)\| decreases to zero and it's positive.
\|g(A) -g(B)\| \leq \|g(A)\| - \|g(B)\|
\|g(A)\| is constant because A was fixed, \|g(A)\| - \|g(B)\| is bounded by 0 and \|g(B)\| is positive so it can't increase to infinity; in fact \|g(B)\| is bounded by \|g(A)\|.
Hmm, I think you have the right idea but I didn't really follow your logic. Try writing ##g(B) = g(B) - g(A) + g(A)## and applying the triangle inequality.
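Spelled out, that decomposition plus the triangle inequality gives
$$\|g(B)\| = \|(g(B) - g(A)) + g(A)\| \leq \|g(B) - g(A)\| + \|g(A)\|,$$
so once ##\|g(B) - g(A)\| \leq 1##, say, you get ##\|g(B)\| \leq \|g(A)\| + 1##, which is the bound you need.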

P.S. wrap your tex commands like so:
Code:
##\|g(A) -g(B)\|##
for an inline expression and
Code:
$$\|g(A) -g(B)\|$$
to display it on its own line.
 
Ok, got it!
Any idea for the differentiability?

Thank you so much.
 
Some things to consider: an ##N\times M## matrix can always be identified with an ##NM##-component vector, i.e. an ##NM\times 1## matrix, by simply reordering the elements. So a function of the form ##f:\mathbb{R}^{N\times M}\to\mathbb{R}## can be identified with some function of the form ##f:\mathbb{R}^{NM\times 1}\to\mathbb{R}##. This way the continuity and differentiability questions reduce to vector calculus.

Suppose ##f:\mathbb{R}^N\to\mathbb{R}^N## is a function such that each component ##f_n:\mathbb{R}^N\to\mathbb{R}## is continuous. Then ##f## is also continuous, assuming the topology on the range is defined by some norm. This is because

$$\|f(x)-f(x')\| \leq C\max_{1\leq n\leq N} |f_n(x)-f_n(x')|$$

holds with some constant ##C## that depends on the norm used (for the Euclidean norm one can take ##C = \sqrt{N}##).

If a function is defined as

$$f:\mathbb{R}^{N\times N}\to\mathbb{R}^{N\times N},\quad f(A)=A^3$$

then all components ##f_{nn'}(A)## are polynomials in the entries of ##A##.

Functions defined by polynomials are continuous (at least when the topologies come from norms).
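
For a concrete look at those polynomials, here is a short SymPy sketch (just an illustration of that claim, for ##N = 2##):
Code:
import sympy as sp

# Every entry of A^3 is a polynomial in the entries of A,
# illustrated here for N = 2.
a, b, c, d = sp.symbols('a b c d')
A = sp.Matrix([[a, b], [c, d]])
A3 = (A**3).expand()
print(A3[0, 0])  # prints a**3 + 2*a*b*c + b*c*d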
 
Differentiability?

$$\frac{\partial}{\partial A_{nn'}} (A^3)_{mm'} = \delta_{nm}(A^2)_{n'm'} + A_{mn}A_{n'm'} + (A^2)_{mn}\delta_{n'm'}$$

No problems. Don't you know how to compute partial derivatives of polynomials? :wink:
 
jostpuur said:
Differentiability?

$$\frac{\partial}{\partial A_{nn'}} (A^3)_{mm'} = \delta_{nm}(A^2)_{n'm'} + A_{mn}A_{n'm'} + (A^2)_{mn}\delta_{n'm'}$$

Sorry, but I don't understand how you get that. Could you explain clearly how you did it?

Thanks.
 
$$\frac{\partial}{\partial A_{nn'}} (A^3)_{mm'} = \sum_{k,k'} \frac{\partial}{\partial A_{nn'}}\big(A_{mk} A_{kk'} A_{k'm'}\big)$$
$$= \sum_{k,k'}\Big(\Big(\frac{\partial}{\partial A_{nn'}} A_{mk}\Big) A_{kk'}A_{k'm'} + A_{mk}\Big(\frac{\partial}{\partial A_{nn'}} A_{kk'}\Big) A_{k'm'} + A_{mk}A_{kk'}\Big(\frac{\partial}{\partial A_{nn'}}A_{k'm'}\Big)\Big)$$

Then use

$$\frac{\partial}{\partial A_{nn'}} A_{ii'} = \delta_{ni}\delta_{n'i'}$$
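
For what it's worth, the resulting formula can be sanity-checked numerically with a finite difference (NumPy sketch; the specific indices are arbitrary, with `n2` and `m2` standing for ##n'## and ##m'##):
Code:
import numpy as np

# Finite-difference check of
#   d(A^3)_{m m'} / dA_{n n'}
#     = delta_{nm} (A^2)_{n'm'} + A_{mn} A_{n'm'} + (A^2)_{mn} delta_{n'm'}
rng = np.random.default_rng(2)
N = 3
A = rng.standard_normal((N, N))
n, n2, m, m2 = 1, 2, 0, 2  # arbitrary fixed indices; n2 = n', m2 = m'
h = 1e-6

E = np.zeros((N, N))
E[n, n2] = h  # perturb only the (n, n') entry
numeric = (np.linalg.matrix_power(A + E, 3)
           - np.linalg.matrix_power(A, 3))[m, m2] / h

A2 = A @ A
analytic = (n == m) * A2[n2, m2] + A[m, n] * A[n2, m2] + A2[m, n] * (n2 == m2)
print(numeric, analytic)  # should agree to about 1e-5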
 
