How Does the Toeplitz-Hausdorff Theorem Apply to Convexity in Linear Operators?

Summary:
The discussion centers on the application of the Toeplitz-Hausdorff theorem to demonstrate the convexity of the set defined by linear operators. Participants explore the relationship between the convexity of the set W(A) and the transformed set W(μA + γI), emphasizing the importance of linearity and normalization of vectors. The proof involves showing that specific transformations allow for the existence of vectors x₀ and x₁ that satisfy certain conditions, thereby reducing the problem to a simpler case. Clarifications are made regarding the definitions of constants c₀ and c₁, which are derived from the inner products involving the operator A. The conversation concludes with the understanding that affine transformations do not alter the convexity of the sets involved.
Bashyboy

Homework Statement


Here is a link to the paper I am working through: http://www.ams.org/journals/proc/1970-025-01/S0002-9939-1970-0262849-9/S0002-9939-1970-0262849-9.pdf

Homework Equations

The Attempt at a Solution


I am working on the first line of the proof. This is what I understand thus far: first, they are relying on the fact that ##W(A)## is convex if and only if ##W(\mu A + \gamma I)## is. Here is where I am unsure of things. I believe the first sentence is saying that we can stretch (or contract) the set by a factor ##\mu## and translate it by ##\gamma## so that there exist vectors ##x_0## and ##x_1## such that ##\langle (\mu A + \gamma I)x_0,x_0 \rangle = 0## and ##\langle (\mu A + \gamma I)x_1,x_1 \rangle = 1##. If this is true, then the problem reduces to assuming that we have an operator ##A## such that ##\langle Ax_0,x_0 \rangle = 0## and ##\langle Ax_1,x_1 \rangle = 1##.

Is that a correct interpretation? I ask because I am interested in justifying this step, and I want to know precisely what I am proving.
 
Bashyboy said:
I am working on the first line of the proof. This is what I understand thus far: first, they are relying on the fact that ##W(A)## is convex if and only if ##W(\mu A + \gamma I)## is.
No. It relies on the fact that ##A## is linear and ##||x|| = 1##.
Set ##\langle Ax'_0,x'_0 \rangle = c_0 \cdot \langle x'_0,x'_0 \rangle = c_0## and ##\langle Ax'_1,x'_1 \rangle = c_1 \cdot \langle x'_1,x'_1 \rangle = c_1##; then you can define ##\mu = (c_1 - c_0)^{-1}## and ##\gamma = c_0 (c_0 - c_1)^{-1}## to get the points ##x_0## and ##x_1##.
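As a numerical sanity check (my own sketch, not from the paper or the thread), these choices of ##\mu## and ##\gamma## do send the two numerical-range values ##c_0## and ##c_1## to ##0## and ##1## whenever ##c_0 \ne c_1##. The matrix ##A## and the unit vectors below are arbitrary illustrative picks:

```python
# Sketch: verify that mu = 1/(c1 - c0) and gamma = c0/(c0 - c1) normalize
# the two numerical-range values to 0 and 1. A, x0p, x1p are arbitrary picks.

def inner(u, v):
    """Complex inner product <u, v> = sum_i u_i * conj(v_i)."""
    return sum(a * b.conjugate() for a, b in zip(u, v))

def apply(M, x):
    """Matrix-vector product for a 2x2 matrix given as nested lists."""
    return [M[0][0] * x[0] + M[0][1] * x[1],
            M[1][0] * x[0] + M[1][1] * x[1]]

A = [[1 + 2j, 0.5], [0.3j, -1.0]]
x0p = [1.0, 0.0]                 # unit vector x'_0
x1p = [0.6, 0.8]                 # unit vector x'_1

c0 = inner(apply(A, x0p), x0p)   # <A x'_0, x'_0>
c1 = inner(apply(A, x1p), x1p)   # <A x'_1, x'_1>, here != c0

mu = 1 / (c1 - c0)
gamma = c0 / (c0 - c1)

def numerical_range_point(x):
    """<(mu*A + gamma*I) x, x> for a unit vector x."""
    Ax = apply(A, x)
    y = [mu * Ax[0] + gamma * x[0], mu * Ax[1] + gamma * x[1]]
    return inner(y, x)

print(abs(numerical_range_point(x0p)))      # ~0 (up to roundoff)
print(abs(numerical_range_point(x1p) - 1))  # ~0 (up to roundoff)
```

So after the rescaling the two chosen vectors play the role of the ##x_0, x_1## in the special case of the proof.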
 
Bashyboy said:
I believe the first sentence is saying that we can stretch (or contract) the set by ##\mu## and translate it by ##\gamma## so that there exist vectors ##x_0## and ##x_1## such that ##\langle (\mu A + \gamma I)x_0,x_0 \rangle = 0## and ##\langle (\mu A + \gamma I)x_1,x_1 \rangle = 1##. [...] Is that a correct interpretation?
Take arbitrary but different ##(Ax_1,x_1)## and ##(Ax_2,x_2)## in ##W(A)##.
You can easily find ##\mu## and ##\gamma## such that ##((\mu A+\gamma I)x_1,x_1)=0## and ##((\mu A+\gamma I)x_2,x_2)=1##.
Since ##\mu W(A)+\gamma=W(\mu A + \gamma I)##, showing that the straight line segment joining ##((\mu A+\gamma I)x_1,x_1)## and ##((\mu A+\gamma I)x_2,x_2)## lies in ##W(\mu A + \gamma I)## (the convexity condition) will prove the corresponding statement for ##(Ax_1,x_1)## and ##(Ax_2,x_2)## in ##W(A)##.

EDIT: fresh_42 was faster. :)
 
fresh_42 said:
No. It relies on the fact that ##A## is linear and ##||x|| = 1##. [...]

Samy_A said:
Take arbitrary but different ##(Ax_1,x_1)## and ##(Ax_2,x_2)## in ##W(A)##. [...]

These two posts appear to contradict each other, but hopefully one of you will correct me if I am wrong. Samy_A appears to be saying that we are indeed relying on the fact that ##\mu W(A) + \gamma = W(\mu A + \gamma I)## is convex iff ##W(A)## is.

fresh_42: Why is ##<Ax'_0,x'_0> = c_0 \cdot <x'_0,x'_0>## true? Is this some property of linear operators? I ask because I couldn't find any such property in my searching on the internet.
 
The way I understand their proof is as follows.
They must show that the straight line segment joining ##(Ax_1,x_1)## and ##(Ax_2,x_2)## lies in ##W(A)## (here both vectors have norm 1).

1) They prove it for the case that ##(Ax_1,x_1)=0## and ##(Ax_2,x_2)=1##.
2a) They note that for the general case, one can find ##\gamma, \mu## such that ##((\mu A+\gamma I)x_1,x_1)=0##, ##((\mu A+\gamma I)x_2,x_2)=1##. Applying 1) to the operator ##\mu A+\gamma I##, it follows that the straight line segment joining ##((\mu A+\gamma I)x_1,x_1)## and ##((\mu A+\gamma I)x_2,x_2)## lies in ##W(\mu A+\gamma I)##.
2b) ##((\mu A+\gamma I)x_1,x_1)=\mu (Ax_1,x_1)+\gamma##, ##((\mu A+\gamma I)x_2,x_2)=\mu (Ax_2,x_2)+\gamma##. It follows from 2a) that the straight line segment joining ##(Ax_1,x_1)## and ##(Ax_2,x_2)## lies in ##W(A)##.

The difference between this outline and the paper is that they mention 2a) and 2b) first, and based on that they claim that it is sufficient to prove the particular case 1).
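Step 2b) above is just sesquilinearity of the inner product together with ##\langle x,x \rangle = 1##. A small numerical check (my own sketch, with arbitrary illustrative choices of ##A##, ##\mu##, ##\gamma##) confirms the identity ##((\mu A+\gamma I)x,x)=\mu (Ax,x)+\gamma## for random unit vectors:

```python
import random

# Sketch: check ((mu*A + gamma*I)x, x) = mu*(Ax, x) + gamma for unit vectors x.
# A, mu, gamma are arbitrary illustrative choices, not taken from the paper.

def inner(u, v):
    """Complex inner product <u, v> = sum_i u_i * conj(v_i)."""
    return sum(a * b.conjugate() for a, b in zip(u, v))

def apply(M, x):
    return [sum(M[i][j] * x[j] for j in range(2)) for i in range(2)]

A = [[2 - 1j, 0.7j], [0.1, 1 + 1j]]
mu, gamma = 0.5 - 0.25j, 1.5 + 2j

random.seed(0)
max_err = 0.0
for _ in range(100):
    v = [complex(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(2)]
    n = abs(inner(v, v)) ** 0.5
    x = [c / n for c in v]                      # normalize so ||x|| = 1
    Ax = apply(A, x)
    lhs = inner([mu * a + gamma * b for a, b in zip(Ax, x)], x)
    rhs = mu * inner(Ax, x) + gamma             # uses <x, x> = 1
    max_err = max(max_err, abs(lhs - rhs))
print(max_err)  # tiny (floating-point roundoff only)
```

This is exactly why every point of ##W(\mu A+\gamma I)## is the image of a point of ##W(A)## under the affine map ##z \mapsto \mu z + \gamma##.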
 
Bashyboy said:
These two posts appear to contradict each other [...] Why is ##<Ax'_0,x'_0> = c_0 \cdot <x'_0,x'_0>## true? Is this some property of linear operators?

They don't contradict each other.

What I tried to show is the explicit reduction from the general assertion to the proved statement. From arbitrary ##x'_i## to those with the desired properties.
##<Ax'_i,x'_i> = c_i \cdot <x'_i,x'_i>## is not "true". I defined the ##c_i## by it in order to find actual values for ##μ## and ##γ## (step (1) of Samy's answer).
I was simply too lazy to type the fraction syntax into it: ##c_i := \frac{<Ax'_i,x'_i>}{<x'_i,x'_i>}## - Thank you for forcing me to do it anyway :wink:. And I hopefully made no calculation error.

Samy further mentioned that the convexity condition doesn't change under affine transformations. They don't affect convexity because straight lines, which are used to define convexity, undergo the same affine transformation and remain straight lines.
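To illustrate this last point (my own sketch, with arbitrary constants): the affine map ##z \mapsto \mu z + \gamma## of the plane carries convex combinations to convex combinations, so it maps the segment between ##w_1## and ##w_2## exactly onto the segment between their images.

```python
# Sketch: z -> mu*z + gamma maps line segments onto line segments, because
# it commutes with convex combinations. mu, gamma, w1, w2 are arbitrary picks.
mu, gamma = 2 - 1j, 0.5 + 3j
w1, w2 = 1 + 1j, -2 + 0.5j          # endpoints of a segment in the plane

def affine(z):
    return mu * z + gamma

for t in [0.0, 0.25, 0.5, 0.75, 1.0]:
    image_of_point = affine((1 - t) * w1 + t * w2)           # interpolate, then map...
    point_of_images = (1 - t) * affine(w1) + t * affine(w2)  # ...vs map, then interpolate
    assert abs(image_of_point - point_of_images) < 1e-12
print("segment maps onto segment")
```

Since the map is invertible for ##\mu \ne 0##, a set is convex exactly when its affine image is.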
 
