Orthogonal Transformations: Benson and Grove on Finite Reflection Groups


Discussion Overview

The discussion revolves around the properties of orthogonal transformations in a two-dimensional real Euclidean vector space, specifically focusing on the implications of the transformation's action on basis vectors as described in Grove and Benson's book on Finite Reflection Groups. Participants are exploring the mathematical reasoning behind specific statements regarding the transformation of basis vectors and the orthonormality of the transformation matrix.

Discussion Character

  • Technical explanation
  • Mathematical reasoning

Main Points Raised

  • Peter seeks clarification on why the transformation of the second basis vector, \( T e_2 \), results in \( \pm (-\nu, \mu) \) given that \( Te_1 = (\mu, \nu) \) and \( \mu^2 + \nu^2 = 1 \).
  • Fredrik explains that the columns of an orthogonal matrix are orthonormal, which leads to the conclusion that \( \mu^2 + \nu^2 = 1 \) and that the second column must also have a norm of 1 and be orthogonal to the first column.
  • Fredrik further suggests that to prove \( T e_2 = \pm (-\nu, \mu) \), one must show that if \( (\alpha, \beta) \cdot (\mu, \nu) = 0 \) and \( |(\alpha, \beta)| = 1 \), then \( (\alpha, \beta) \) must equal \( \pm (-\nu, \mu) \).

Areas of Agreement / Disagreement

Participants are engaged in a collaborative exploration of the mathematical concepts, with no explicit consensus reached on the proof of the transformation of \( T e_2 \). The discussion remains open as Peter continues to seek clarification.

Contextual Notes

The discussion is limited to the case of \( V = \mathbb{R}^2 \) and relies on the properties of orthogonal matrices, including the orthonormality of their columns. There are unresolved steps in the proof regarding the specific transformation of the second basis vector.

Math Amateur
I am reading Grove and Benson's book on Finite Reflection Groups and am struggling with some of the basic linear algebra.

Some terminology from Grove and Benson:


V is a real Euclidean vector space

A transformation of V is understood to be a linear transformation

The group of all orthogonal transformations of V will be denoted O(V)


Then in chapter 2, Grove and Benson write the following:

If ##T \in O(V)##, then ##T## is completely determined by its action on the basis vectors ##e_1 = (1,0)## and ##e_2 = (0,1)##.

If ##Te_1 = (\mu, \nu)##, then ##\mu^2 + \nu^2 = 1## and ##Te_2 = \pm(-\nu, \mu)##.

Can someone please help me by proving why the last statement is true?

Peter
 
Since your basis vectors only have two components, I assume that we're now dealing with the case ##V=\mathbb{R}^2##.

$$\begin{pmatrix}\mu\\ \nu\end{pmatrix}=Te_1=\begin{pmatrix}T_{11} & T_{12}\\ T_{21} & T_{22}\end{pmatrix}\begin{pmatrix}1\\ 0\end{pmatrix}=\begin{pmatrix}T_{11}\\ T_{21}\end{pmatrix}$$ This is the first column of the matrix T. The columns of an orthogonal matrix are orthonormal. This is easy to see from the condition ##T^T T=1##. So the orthogonality of T implies that ##\mu^2+\nu^2=1##, and also that the second column has norm 1 and is orthogonal to the first.
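The claim that ##T^T T=1## forces orthonormal columns, and hence pins down the second column up to sign, can be checked numerically. This is my own illustrative sketch, not from the thread: I parametrize ##(\mu,\nu)## by an arbitrary angle and test both kinds of orthogonal map on ##\mathbb{R}^2##, a rotation and a reflection.

```python
import numpy as np

theta = 0.7  # arbitrary angle; then (mu, nu) = (cos theta, sin theta) is a unit vector
mu, nu = np.cos(theta), np.sin(theta)

# The two kinds of orthogonal maps on R^2 with first column (mu, nu):
rotation = np.array([[mu, -nu],
                     [nu,  mu]])
reflection = np.array([[mu,  nu],
                       [nu, -mu]])

for T in (rotation, reflection):
    # T^T T = I is equivalent to the columns being orthonormal
    assert np.allclose(T.T @ T, np.eye(2))
    # first column is Te_1 = (mu, nu); second column Te_2 is +(-nu, mu) or -(-nu, mu)
    second = T @ np.array([0.0, 1.0])
    assert (np.allclose(second, [-nu, mu])
            or np.allclose(second, [nu, -mu]))
```

The rotation realizes the ##+(-\nu,\mu)## case and the reflection the ##-(-\nu,\mu)## case, which is why both signs occur in the book's statement.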
 
Thanks Fredrik

Definitely dealing with the case ##\mathbb{R}^2##

Will just reflect on what you wrote!

Still puzzling a bit about showing that ##Te_2 = \pm(-\nu, \mu)##

Peter
 
Math Amateur said:
Thanks Fredrik

Definitely dealing with the case ##\mathbb{R}^2##

Will just reflect on what you wrote!

Still puzzling a bit about showing that ##Te_2 = \pm(-\nu, \mu)##

Peter
You know that Te2 is the second column of T, that ##(\mu,\nu)## is the first column, and that the columns are orthonormal. So you only have to prove that if ##(\alpha,\beta)\cdot (\mu,\nu)=0## and ##|(\alpha,\beta)|=1##, then ##(\alpha,\beta)=\pm(-\nu,\mu)##.
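One way to finish the step Fredrik outlines (this completion is mine, not from the thread) is to note that ##(-\nu,\mu)## is itself a unit vector orthogonal to ##(\mu,\nu)##, and that in ##\mathbb{R}^2## the orthogonal complement of a nonzero vector is one-dimensional:

```latex
% (-\nu,\mu) is a unit vector orthogonal to (\mu,\nu):
(-\nu,\mu)\cdot(\mu,\nu) = -\nu\mu + \mu\nu = 0,
\qquad |(-\nu,\mu)|^2 = \nu^2 + \mu^2 = 1.

% Since (\mu,\nu) \neq 0, its orthogonal complement in \mathbb{R}^2 is a line,
% so any (\alpha,\beta) orthogonal to (\mu,\nu) is a scalar multiple of (-\nu,\mu):
(\alpha,\beta) = t(-\nu,\mu), \qquad t \in \mathbb{R}.

% The unit-norm condition pins down the scalar:
1 = |(\alpha,\beta)|^2 = t^2(\nu^2 + \mu^2) = t^2
\;\Longrightarrow\; t = \pm 1,
\qquad\text{so } Te_2 = (\alpha,\beta) = \pm(-\nu,\mu).
```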
 
Thanks for the clarification Fredrik

Appreciate your help

Peter, Math Hobbyist
 
