## [SOLVED] Pseudo orthogonal group

Arnold Neumaier wrote:

> In particular, SO(p,q) is connected if the product pq is even,
> and has two connected components otherwise.

This is not true: SO(p,q) always has two connected components when pq > 0,
at least with the standard definition of SO(p,q) that I repeat in another
message in this thread. In fact, it is very easy to see that there are
always *at least* two connected components; cf. my other message.

> The book by
> R. Gilmore,
> Lie groups, Lie algebras, and some of their applications
> Wiley, New York 1974
> contains a lot of material about specific classical groups.
> The above result is stated there (without proof) on p. 199.

I haven't looked it up. Maybe a clash of notation?

-- Marc Nardmann
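
As a quick numerical illustration of the "at least two components" claim (my own sketch, not from the thread): in SO(1,1) with metric eta = diag(1,-1), the relation A'.eta.A = eta forces A[0,0]^2 = 1 + A[1,0]^2 >= 1, so the sign of A[0,0] is continuous and locally constant on the group; it takes both values, hence SO(1,1) cannot be connected.

```python
import numpy as np

eta = np.diag([1.0, -1.0])  # metric of signature (1,1)

def in_SO11(A, tol=1e-12):
    """Check A' eta A = eta and det A = 1."""
    return (np.allclose(A.T @ eta @ A, eta, atol=tol)
            and np.isclose(np.linalg.det(A), 1.0, atol=tol))

t = 0.7
c, s = np.cosh(t), np.sinh(t)
A_plus  = np.array([[ c, s], [s,  c]])   # in the identity component
A_minus = np.array([[-c, s], [s, -c]])   # upper-left entry negative

assert in_SO11(A_plus) and in_SO11(A_minus)
# A[0,0]^2 - A[1,0]^2 = 1 for every A in SO(1,1), so |A[0,0]| >= 1 and
# sign(A[0,0]) is locally constant; it takes both values here, hence
# SO(1,1) has at least two connected components.
print(np.sign(A_plus[0, 0]), np.sign(A_minus[0, 0]))  # 1.0 -1.0
```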

---

I wrote:

> Now we show that for every p x q matrix F (e.g. F = B(t)), the matrix
> 1 - F(1+F'F)^{-1}F' is positive definite: Let v be in R^p. Then v = Fx + w
> for some x in R^q and some w in R^p which is orthogonal to the image of F.
> Let G denote the positive semidefinite matrix F'F. We have to show that
>
> <v,v> - <F(1+G)^{-1}F'v, v>
>
> is positive if v is nonzero. Since F'w = 0, we obtain
>
> <v,v> - <F(1+G)^{-1}F'v, v> = <Fx,Fx> + <w,w> - <(1+G)^{-1}Gx, Gx> ,
>
> so it suffices to prove that <Fx,Fx> - <(1+G)^{-1}Gx, Gx> is positive if
> x is nonzero. Let x be nonzero, y = (1+G)^{-1}x. Then
>
> 0 < <Gy,y> + <Gy,Gy>
>   = <Gy, (1+G)y>
>   = <Gy, x>
>   = <Gx, y>
>   = <Fx,Fx> - <(1+G)^{-1}Gx, Gx> ,
>
> as claimed. Thus 1 - F(1+F'F)^{-1}F' is indeed positive definite.

Oops, not quite. What I should have said is this:

> [...] so it suffices to prove that <Fx,Fx> - <(1+G)^{-1}Gx, Gx> is
> positive if Fx is nonzero. Let Fx be nonzero, y = (1+G)^{-1}x. Then
> Gx is nonzero because <Gx,x> = <Fx,Fx> is nonzero. Hence
>
> <Gy,Gx> = <G(1+G)^{-1}x, Gx> = <(1+G)^{-1}Gx, Gx> > 0 ,
>
> so Gy is nonzero. Thus
>
> 0 < <Fy,Fy> + <Gy,Gy> = <Gy,y> + <Gy,Gy>
>
> [...]

Near the end of the message, there was a typo:

> C(1) sqrt[1+C(1)'C(1)]^{-1} = D'^{-1}B' U = C sqrt[1+C'C] , hence

should be:

> C(1) sqrt[1+C(1)'C(1)]^{-1} = D'^{-1}B' U = C sqrt[1+C'C]^{-1} , hence

-- Marc Nardmann
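
A quick numerical sanity check of the claim being proved above (my own sketch, with random F): for any real p x q matrix F, the symmetric matrix 1 - F(1+F'F)^{-1}F' should have strictly positive eigenvalues.

```python
import numpy as np

rng = np.random.default_rng(0)

for p, q in [(2, 3), (3, 2), (4, 4)]:
    F = rng.standard_normal((p, q))
    G = F.T @ F                                    # F'F, positive semidefinite
    # M = 1 - F (1+G)^{-1} F'; solve() applies (1+G)^{-1} without inverting
    M = np.eye(p) - F @ np.linalg.solve(np.eye(q) + G, F.T)
    # M is symmetric, so positive definiteness <=> all eigenvalues > 0
    eigvals = np.linalg.eigvalsh(M)
    assert np.all(eigvals > 0), (p, q, eigvals)

print("1 - F(1+F'F)^{-1}F' positive definite in all test cases")
```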