Existence of Directional Derivative in Normed Linear Space

SUMMARY

The discussion centers on the existence of a directional derivative in finite-dimensional normed linear spaces, specifically questioning whether for every point \( x_0 \in L \), there exists a direction \( \delta \in L \) such that \( \lVert x_0 + t\delta \rVert \geqslant \lVert x_0 \rVert \) for all \( t \in \mathbb{R} \). Participants conclude that this is not universally true, especially when considering norms that are not differentiable, such as the supremum norm \( p = \infty \). The conversation also touches on the properties of singular matrices and the continuity of determinants, indicating that the density of regular matrices is norm-dependent.

PREREQUISITES
  • Understanding of finite-dimensional normed linear spaces
  • Familiarity with concepts of directional derivatives
  • Knowledge of matrix theory, particularly singular and regular matrices
  • Basic principles of continuity in mathematical functions
NEXT STEPS
  • Study the properties of directional derivatives in various norms
  • Explore the implications of the Hahn-Banach theorem in normed spaces
  • Investigate the continuity of determinants and its relation to singular matrices
  • Learn about convex analysis and its applications in normed linear spaces
USEFUL FOR

Mathematicians, students of functional analysis, and anyone interested in the properties of normed linear spaces and matrix theory.

Gear300
Given a finite-dimensional normed linear space ##(L,\lVert \cdot \rVert)##, is there anything that suggests that at every point ##x_0 \in L##, there exists a direction ##\delta \in L## such that ##\lVert x_0 + t\delta \rVert \geqslant \lVert x_0 \rVert## for all ##t \in \mathbb{R}##?
 
Gear300 said:
Given a finite-dimensional normed linear space ##(L,\lVert \cdot \rVert)##, is there anything that suggests that at every point ##x_0 \in L##, there exists a direction ##\delta \in L## such that ##\lVert x_0 + t\delta \rVert \geqslant \lVert x_0 \rVert## for all ##t \in \mathbb{R}##?
No. ##1=|1|=|2+ (-1)\cdot 1|=\lVert x_0 + t\delta \rVert < \lVert x_0 \rVert = |2|=2##
 
fresh_42 said:
No. ##1=|1|=|2+ (-1)\cdot 1|=\lVert x_0 + t\delta \rVert < \lVert x_0 \rVert = |2|=2##
Point taken. But what if the dimension of the space is ##n \geqslant 2##? I figured there should be ##n - 1## such directions. The reason I ask is because of the attached question. Finding a minimum matrix ##B## has been simple enough for p-like norms, but I haven't found one for arbitrary norms in general. Proving the posted statement felt like it would suffice to settle the inquiry. If the norm were differentiable, I could take directions orthogonal to the gradient and inspect the Hessian, but I suspect differentiability is not granted at all ##x_0 \in L##, as with the supremum norm ##p = \infty##.
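The supremum-norm case in ##\mathbb{R}^2## can at least be probed numerically. A minimal sketch (the point ##x_0 = (1,1)##, a corner where ##\lVert\cdot\rVert_\infty## is not differentiable, and the candidate direction ##\delta = (1,-1)## are my own illustrative choices):

```python
# Check ||x0 + t*d||_inf >= ||x0||_inf over a grid of t values,
# at a corner point where the sup norm is not differentiable.

def sup_norm(v):
    return max(abs(c) for c in v)

x0 = (1.0, 1.0)   # corner of the sup-norm sphere of radius 1
d = (1.0, -1.0)   # candidate direction: ||x0 + t*d||_inf = 1 + |t|

ok = all(
    sup_norm((x0[0] + t * d[0], x0[1] + t * d[1])) >= sup_norm(x0)
    for t in (k / 100.0 for k in range(-500, 501))
)
print(ok)  # True: this direction never decreases the sup norm at x0
```

Here the inequality even holds exactly, since ##\max(|1+t|, |1-t|) = 1 + |t|##; sampling is of course only evidence, not a proof.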
 

Attachments

  • pic1.png
Gear300 said:
Point taken. But what if the dimension of the space is ##n \geqslant 2##? I figured there should be ##n - 1## such directions.
Same example: ##||(2,2) - (1,1)||= \sqrt{2} < 2\sqrt{2} = ||(2,2)||\,.##
Gear300 said:
The reason I ask is because of the attached question. Finding a minimum matrix ##B## has been simple enough for p-like norms, but I haven't found one for arbitrary norms in general. Proving the posted statement felt like it would suffice to settle the inquiry. If the norm were differentiable, I could take directions orthogonal to the gradient and inspect the Hessian, but I suspect differentiability is not granted at all ##x_0 \in L##, as with the supremum norm ##p = \infty##.
The point is that regular matrices form a dense subset, so the distance to another regular matrix is arbitrarily small, but finding the nearest singular one is not as trivial. The singular matrices form a closed subset, and the condition number indicates how close a given matrix is to that set. Since we measure a distance, this depends a lot on the norm.
 
fresh_42 said:
The point is that regular matrices form a dense subset, so the distance to another regular matrix is arbitrarily small, but finding the nearest singular one is not as trivial. The singular matrices form a closed subset, and the condition number indicates how close a given matrix is to that set. Since we measure a distance, this depends a lot on the norm.
Might this be provable by inspecting the continuity of the determinant as a polynomial function? The zeros would then correspond to a closed subset of singular matrices. I'm just wondering how this is normally proved.
 
Gear300 said:
Might this be provable by inspecting the continuity of the determinant as a polynomial function? The zeros would then correspond to a closed subset of singular matrices. I'm just wondering how this is normally proved.
Yes. The singular matrices are the preimage of the closed set ##\{\,0\,\}## under the determinant.
 
fresh_42 said:
Yes. The singular matrices are the preimage of the closed set ##\{\,0\,\}## under the determinant.
Alright then. Would you mind recommending a book on this? :biggrin: The book I'm using here is Rainer Kress's Numerical Analysis, and I've gotten up to exactly that question with what I know about Banach spaces, so I wouldn't mind additional references.
 
Gear300 said:
So that's how it is normally proved? (Apologies for the persistence, but just being sure.)
Well, we need the determinant to be continuous and that ##\{\,0\,\} \subseteq \mathbb{R}## is closed. Continuity is equivalent to "preimages of closed sets are closed". So ##\{\,M\in \mathbb{M}_n(\mathbb{R})\,|\,\det M = 0\,\} \subseteq \mathbb{M}_n(\mathbb{R})## is closed. At the same time, this means that ##GL_n(\mathbb{R}) = \{\,M\in \mathbb{M}_n(\mathbb{R})\,|\,\det M \neq 0\,\}## is open. Density might depend on the norm used, but usually, if we change a few matrix entries by arbitrarily small amounts, we will always get a regular matrix. It probably won't work with the discrete metric, though.
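The "arbitrarily small perturbation" remark is easy to illustrate. A minimal sketch in pure Python (the 2×2 matrix and the diagonal perturbation ##\varepsilon I## are just one convenient choice):

```python
def det2(M):
    # Determinant of a 2x2 matrix: ad - bc.
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

A = [[1.0, 1.0],
     [1.0, 1.0]]          # equal rows, so det A = 0: A is singular
assert det2(A) == 0.0

eps = 1e-9                # arbitrarily small diagonal perturbation
B = [[A[0][0] + eps, A[0][1]],
     [A[1][0],       A[1][1] + eps]]

# det(A + eps*I) = (1 + eps)^2 - 1 = 2*eps + eps^2 != 0 for eps != 0,
# so the perturbed matrix is regular.
print(det2(B) != 0.0)
```

The same perturbation works for any ##\varepsilon \neq 0##, which is exactly the density statement in this tiny case.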
 
fresh_42 said:
Well, we need the determinant to be continuous and that ##\{\,0\,\} \subseteq \mathbb{R}## is closed. Continuity is equivalent to "preimages of closed sets are closed". So ##\{\,M\in \mathbb{M}_n(\mathbb{R})\,|\,\det M = 0\,\} \subseteq \mathbb{M}_n(\mathbb{R})## is closed. At the same time, this means that ##GL_n(\mathbb{R}) = \{\,M\in \mathbb{M}_n(\mathbb{R})\,|\,\det M \neq 0\,\}## is open. Density might depend on the norm used, but usually, if we change a few matrix entries by arbitrarily small amounts, we will always get a regular matrix. It probably won't work with the discrete metric, though.
An ##n \times n## matrix lives in a linear space of dimension ##n^2##. Since all norms on a finite-dimensional space are equivalent, all matrix norms are equivalent to the Frobenius norm.

That aside, like you intimated, I think my assertion is true. Funny it took me so long, but for a norm function ##p(x)##, we can consider the surface of constant norm ##p(x_0)##. Since it is convex, we should always be able to come up with a tangent hyperplane of dimension ##n-1## that satisfies the desired properties, correct?
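For the Euclidean norm the tangent-hyperplane picture can be checked directly: at ##x_0 = (2,2)## the direction ##\delta = (1,-1)## is orthogonal to ##x_0##, and ##\lVert x_0 + t\delta \rVert_2^2 = 8 + 2t^2 \geqslant 8##. A quick numerical sketch (these particular ##x_0## and ##\delta## are my own illustrative choices):

```python
import math

def norm2(v):
    return math.sqrt(sum(c * c for c in v))

x0 = (2.0, 2.0)
d = (1.0, -1.0)   # orthogonal to x0, i.e. tangent to the norm sphere at x0

# ||x0 + t*d||^2 = 8 + 2*t^2 is minimized at t = 0, so the norm
# along this direction never drops below ||x0||.
ok = all(
    norm2((x0[0] + t * d[0], x0[1] + t * d[1])) >= norm2(x0) - 1e-12
    for t in (k / 100.0 for k in range(-500, 501))
)
print(ok)
```

The small tolerance only absorbs floating-point rounding at ##t = 0##, where equality holds exactly.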
 
  • #10
I still doubt that it works for all ##t\in \mathbb{R}##, but I'm not sure. You claim that each point is a norm minimum along at least one direction, but with arbitrary ##t## that means two opposite directions at once. I cannot really imagine such a situation except for ##x_0=0##.

There is a separation theorem for hypersurfaces in Hilbert spaces, IIRC, but we don't have a norm induced by an inner product. I might have been thinking too much in terms of a metric, which is too weak.
 
  • #11
fresh_42 said:
I still doubt that it works for all ##t\in \mathbb{R}##, but I'm not sure. You claim that each point is a norm minimum along at least one direction, but with arbitrary ##t## that means two opposite directions at once. I cannot really imagine such a situation except for ##x_0=0##.

There is a separation theorem for hypersurfaces in Hilbert spaces, IIRC, but we don't have a norm induced by an inner product. I might have been thinking too much in terms of a metric, which is too weak.
So I went back through Kolmogorov's Real Analysis text and found Problem 5.c on p. 141. The interior of the unit ball in any normed space is an open convex set, so for every point on the unit sphere, a supporting hyperplane should exist in the sense that it never touches the interior.
 

Attachments

  • 128-129.png
  • 130-131.png
  • 132-133.png
  • 134-135.png
  • 136-137.png
  • 138-139.png
  • 140-141.png
  • #12
fresh_42 said:
No. ##1=|1|=|2+ (-1)\cdot 1|=\lVert x_0 + t\delta \rVert < \lVert x_0 \rVert = |2|=2##
I don't understand; doesn't the OP ask for _a_ direction and not for _every_ direction? Then you can consider ##\lVert 2 + 1\cdot 1 \rVert = \lVert 3 \rVert > 2##?
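The quantifier is the crux: the inequality has to hold for all ##t \in \mathbb{R}##, not just ##t = 1##, and in one dimension every direction fails at ##t = -x_0/\delta##, where the point hits the origin. A quick sketch:

```python
x0 = 2.0

# For each nonzero direction d, the choice t = -x0/d sends x0 + t*d to 0,
# so the norm drops strictly below |x0| = 2.
fails = [abs(x0 + (-x0 / d) * d) < abs(x0) for d in (1.0, -1.0, 0.5)]
print(fails)  # [True, True, True]
```

So in one dimension no direction works for all ##t##, which is the content of the counterexample in the second post.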
 
  • #13
Gear300 said:
An ##n \times n## matrix lives in a linear space of dimension ##n^2##. Since all norms on a finite-dimensional space are equivalent, all matrix norms are equivalent to the Frobenius norm.

That aside, like you intimated, I think my assertion is true. Funny it took me so long, but for a norm function ##p(x)##, we can consider the surface of constant norm ##p(x_0)##. Since it is convex, we should always be able to come up with a tangent hyperplane of dimension ##n-1## that satisfies the desired properties, correct?
I think this is one of the versions of Hahn-Banach.
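Indeed; with a supporting functional the argument is one line. A sketch of the reasoning (assuming ##x_0 \neq 0##): Hahn-Banach gives a linear functional ##f## with ##f(x_0) = \lVert x_0 \rVert## and ##|f(x)| \leqslant \lVert x \rVert## for all ##x##. For any ##\delta## in ##\ker f##, which has dimension ##n-1##, one then gets

```latex
\lVert x_0 + t\delta \rVert
  \;\geqslant\; |f(x_0 + t\delta)|
  \;=\; |f(x_0) + t\,f(\delta)|
  \;=\; |f(x_0)|
  \;=\; \lVert x_0 \rVert
  \qquad \text{for all } t \in \mathbb{R},
```

which is exactly the ##(n-1)##-dimensional space of directions conjectured above.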
 
