Orthogonal Matrices Explained: Examples & More

In summary, an orthogonal matrix is a square real matrix [itex]X[/itex] satisfying [itex]XX^t = I[/itex]. A "rotation matrix" in [itex]R^n[/itex] is a special case of a special orthogonal matrix: a block diagonal matrix with one two by two block (an ordinary two-dimensional rotation) and ones down the rest of the diagonal. Geometrically, it fixes pointwise a codimension-two subspace and acts like a two-dimensional rotation in the orthogonal two-plane. Every special orthogonal matrix can be written as a product of such rotation matrices respecting a common orthogonal direct sum decomposition of [itex]R^n[/itex] into two-dimensional subspaces.
  • #1
captain
what exactly are orthogonal matrices? can someone give me an example of what they would look like?
 
  • #2
An orthogonal matrix is a square real matrix [itex]X[/itex] that satisfies [itex]XX^t = I[/itex]. You can generate examples yourself very easily.
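For instance, here is a minimal numerical check in Python with NumPy (the angle is an arbitrary choice, just for illustration):

[code]
import numpy as np

# A 2x2 rotation matrix is a standard example of an orthogonal matrix.
theta = 0.7  # arbitrary angle, just for illustration
X = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# Orthogonality means X X^t = I, i.e. the transpose is the inverse.
print(np.allclose(X @ X.T, np.eye(2)))  # True
[/code]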
 
  • #3
The one theorem everyone should know concerning orthogonal matrices

Everyone knows that a surprising fact about three dimensional special orthogonal matrices is that they fix pointwise a one-dimensional subspace and act in the orthogonal two-dimensional subspace just like a two-dimensional rotation. But what is the n-dimensional generalization?

Consider a block diagonal matrix with one two by two block, which is an ordinary two-dimensional rotation matrix, plus ones down the diagonal. Call this a "rotation matrix"; it is a special case of a "special orthogonal matrix", i.e. an element of [itex]SO(n)[/itex]. Geometrically, it fixes pointwise a codimension two subspace orthogonal to the two-plane, and acts like a two-dimensional rotation orthogonal to this "axis". It is possible to decompose a special orthogonal matrix as a product of "rotation matrices" all respecting a particular orthogonal direct sum decomposition of the vector space [itex]R^n[/itex] into two-dimensional subspaces (with a one-dimensional pointwise fixed subspace left over in case of odd dimension). In general, two distinct special orthogonal matrices will require two distinct orthogonal direct sum decompositions; this observation generalizes the fact that two elements of [itex]SO(3)[/itex] will usually have "rotation axes" pointing in different directions. However, the elements of [itex]SO(n)[/itex] which do share such a decomposition form an abelian subgroup. This in fact gives a large conjugacy class of abelian subgroups :wink:
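A small numerical illustration of this in Python (the helper plane_rotation and the angles are my own, just for illustration): two "rotation matrices" acting in the two planes of the same decomposition of [itex]R^4[/itex] are each in [itex]SO(4)[/itex] and commute with each other.

[code]
import numpy as np

def plane_rotation(n, i, j, theta):
    """A "rotation matrix" in the sense above: the identity except for a
    2x2 rotation block acting in the (e_i, e_j) coordinate plane of R^n."""
    R = np.eye(n)
    c, s = np.cos(theta), np.sin(theta)
    R[i, i], R[i, j] = c, -s
    R[j, i], R[j, j] = s, c
    return R

# Two rotations respecting the same decomposition R^4 = span(e1,e2) + span(e3,e4).
A = plane_rotation(4, 0, 1, 0.3)
B = plane_rotation(4, 2, 3, 1.1)

print(np.allclose(A @ A.T, np.eye(4)), np.isclose(np.linalg.det(A), 1.0))  # A is in SO(4)
print(np.allclose(A @ B, B @ A))  # they commute (an abelian subgroup)
[/code]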

An interesting example: apply this idea to the permutation matrix corresponding to an n-cycle in [itex]R^n[/itex].
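A quick way to see what happens in this example is to look at the eigenvalues of the permutation matrix: they are the n-th roots of unity, so their arguments give the rotation angles in the two-dimensional invariant subspaces. A Python sketch for the 5-cycle (the construction of the matrix is mine):

[code]
import numpy as np

# Permutation matrix of the 5-cycle acting on R^5: e_0 -> e_1 -> ... -> e_4 -> e_0.
P = np.roll(np.eye(5), 1, axis=0)

# Its eigenvalues are the fifth roots of unity: one eigenvalue 1 (a pointwise
# fixed line) and two conjugate pairs, corresponding to rotations by 1/5 and
# 2/5 of a turn in two orthogonal two-dimensional subspaces.
angles = np.angle(np.linalg.eigvals(P)) / (2 * np.pi)
print(np.sort(np.abs(angles)))  # approximately [0, 0.2, 0.2, 0.4, 0.4]
[/code]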

Once this theorem is established, it is easy to see how to modify it to obtain a similar decomposition for any element of [itex]O(n)[/itex].

I just checked two dozen books which discuss the orthogonal group, including Birkhoff and Mac Lane, A Survey of Modern Algebra (chapter 9), Armstrong, Groups and Symmetry (chapters 9 and 19), Neumann, Stoy and Thompson, Groups and Geometry (chapters 14,15), Jacobson, Basic Algebra I (chapter 9), and Artin, Geometric Algebra (chapter III) and unfortunately was unable to find any mention of this. Wikipedia doesn't mention anything like this either. Yet it is a quite well known nineteenth century theorem. Go figure...

This is perhaps the most elementary thing one can say in discussing what elements of the orthogonal group [itex]O(n)[/itex] look like and how they act on [itex]R^n[/itex].

(Edit: finally found a citation for you. The theorem is stated without proof in Senechal, Quasicrystals and Geometry, Prop 2.12, p. 47; see p. 63 for the decomposition of a five-cycle in [itex]R^5[/itex]. Geometrically speaking, the effect of this element of [itex]SO(5)[/itex] respects an orthogonal direct sum decomposition into one pointwise fixed line plus two two-dimensional subspaces; it acts like a one-fifth turn in one of these, and like a two-fifth turn in the other. By linearity this description extends to all of [itex]R^5[/itex]. Senechal cites P. Engel, Geometric Crystallography, Reidel, 1986.)
 
  • #4
an orthogonal real matrix is one which defines a length preserving self transformation of R^n which fixes the origin, such as a composition of rotations and reflections about fixed sets containing the origin. The theorem classifying them is one of the few things in herstein's topics in algebra that is not in most other books.
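To make the "length preserving" characterization concrete, here is a small Python check using a Householder reflection (the vector v and the random test vector are arbitrary choices of mine):

[code]
import numpy as np

rng = np.random.default_rng(0)

# A Householder reflection about the hyperplane orthogonal to v: one concrete
# length-preserving linear transformation of R^3 that fixes the origin.
v = np.array([1.0, 2.0, -1.0])
H = np.eye(3) - 2 * np.outer(v, v) / (v @ v)

x = rng.standard_normal(3)
print(np.allclose(H @ H.T, np.eye(3)))                       # H is orthogonal
print(np.isclose(np.linalg.norm(H @ x), np.linalg.norm(x)))  # lengths are preserved
print(np.isclose(np.linalg.det(H), -1.0))                    # a reflection, not a rotation
[/code]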
 
  • #5
if orthogonal matrices are for rotations, then what are unitary matrices (or unitary groups) for?
 
  • #6
i don't know, since i have little intuition for complex transformations.
 
  • #7
Topics in Algebra, just what I was looking for, plus a warning

mathwonk said:
The theorem classifying them is one of the few things in herstein's topics in algebra that is not in most other books.

Saw your other post, and I loved that book when I was an undergraduate! The textbook in question, Herstein, Topics in Algebra, does indeed cover not just Jordan form but also rational form, and... hrm... oh yes, I overlooked p. 348, which has the decomposition of an element of O(n). So there you go, that's the citation I was looking for. Thanks, mathwonk!

For the OP: in reading algebra textbooks, be very careful about left versus right actions. Zillions of unwary math students have fallen victim to this notational glitch, in part because very few authors even of textbooks bother to mention the issue!

To wit: Herstein uses right actions in discussing permutations in Topics, which means that you read composition left to right. This is just what you want if you use GAP (a symbolic computation package used by many for computational group theory, ring theory, and so on), or if you are "reading" a Cayley or Schreier graph (depicting via color-coded directed edges the effect of the generating elements in an action by a finitely generated group on some set). But if you are using "composition" [itex](f \circ g)(x) = f(g(x))[/itex], you must read composition right to left, which means you must use left actions. That's what Herstein does in his other textbook, Abstract Algebra, one of the very best "short" modern algebra textbooks.
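To make the pitfall concrete, here is a small Python sketch contrasting the two conventions (the particular permutations are arbitrary):

[code]
# Two permutations of {0, 1, 2}, written as tuples: p[i] is the image of i.
f = (1, 2, 0)
g = (0, 2, 1)

def compose_left(f, g):
    """Left-action convention: (f o g)(x) = f(g(x)); read right to left."""
    return tuple(f[g[x]] for x in range(len(f)))

def compose_right(f, g):
    """Right-action convention: first apply f, then g; read left to right."""
    return tuple(g[f[x]] for x in range(len(f)))

print(compose_left(f, g))   # (1, 0, 2)
print(compose_right(f, g))  # (2, 1, 0) -- a different product, which is exactly the trap
[/code]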

(I bet mathwonk can think of a well-known theorem from covering space theory which uses both right and left actions; also, there is a simple trick for converting left to right actions or vice versa, but the notational issue is genuine and not so easily evaded.)
 
  • #8
Unitary matrices: who needs 'em?

captain said:
if orthogonal matrices are for rotations, then what are unitary matrices (or unitary groups) for?

Or as Pauli once asked, wrinkling his nose at the appearance of the neutrino, "who ordered that?" A good question!

John Baez says he plans to devote future Weeks to discussing Kleinian geometry, particularly geometries related to the classical groups, which include the orthogonal, unitary, and symplectic groups. If so, no doubt he'll give a much better answer than I can hope to provide, but here's a "cheap shot":

The unitary group U(n) is a Lie group, i.e. a smooth manifold as well as a group. In fact it is a compact Lie group, so (by an important theorem in Lie theory/measure theory) it has a unique bi-invariant probability measure, Haar measure. (Bi-invariant means it respects left and right action by the group on itself by group multiplication--- speaking of left versus right, here's another place where this distinction is actually important, yet almost all authors ignore it as I said!) This is used to model the evolution over time of a quantum system having no particular symmetries. O(n) is a subgroup of U(n) and the coset space U(n)/O(n) inherits an invariant probability measure, which is used to model the evolution over time of a quantum system having time reversal symmetry.

For you computer programmers out there, at one time or another you probably have been required to generate random elements of the unitary group. And you've probably done it wrong. See Francesco Mezzadri, "How to Generate Random Matrices from the Classical Compact Groups", Notices of the AMS 54 (2007): 592--604, from which I got the above (clearly inadequate) physical interpretation.
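Here is a sketch of the QR-based recipe as I understand it from that article (Python with NumPy; my paraphrase, not code from the paper): take a complex Gaussian matrix, QR-factor it, and then correct the phases of the diagonal of R so the result really is Haar-distributed rather than merely "the Q of a random matrix".

[code]
import numpy as np

def random_unitary(n, rng=np.random.default_rng()):
    """Haar-distributed random element of U(n), following the QR-based recipe
    described in Mezzadri's article (a sketch, not verbatim code from it)."""
    # Complex Ginibre matrix: i.i.d. standard complex Gaussian entries.
    z = (rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))) / np.sqrt(2)
    q, r = np.linalg.qr(z)
    # Fix the phases coming from the diagonal of R; skipping this step is
    # the usual mistake, and gives a non-uniform distribution on U(n).
    d = np.diagonal(r)
    return q * (d / np.abs(d))

U = random_unitary(4)
print(np.allclose(U @ U.conj().T, np.eye(4)))  # unitary to machine precision
[/code]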
 

1. What is an orthogonal matrix?

An orthogonal matrix is a square matrix whose columns (equivalently, whose rows) form an orthonormal set: the dot product of any two distinct columns is 0, and each column has length 1.

2. How is an orthogonal matrix different from a regular matrix?

An orthogonal matrix is different from a general square matrix because it has the additional property of orthonormality. Its columns and rows are not only linearly independent, but also mutually perpendicular and of unit length.

3. What are some examples of orthogonal matrices?

Some examples of orthogonal matrices include rotation matrices, reflection matrices, and the identity matrix. These matrices have special properties that make them useful in many mathematical and scientific applications.

4. What is the significance of orthogonal matrices?

Orthogonal matrices have many important applications in fields such as linear algebra, signal processing, and quantum mechanics. They are particularly useful for solving systems of equations, performing transformations, and preserving distances and angles.

5. How can I determine if a matrix is orthogonal?

To determine if a matrix Q is orthogonal, check that its columns are orthonormal: the dot product of any two distinct columns (or rows) is 0, and each column has length 1. Equivalently, check that Q^T Q = I, i.e. that the transpose of the matrix equals its inverse.
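A small Python helper illustrating this test (the helper name and examples are mine, just for illustration):

[code]
import numpy as np

def is_orthogonal(Q, tol=1e-10):
    """Check orthogonality via the equivalent condition Q^T Q = I, which
    encodes both requirements: pairwise-orthogonal columns of unit length."""
    Q = np.asarray(Q, dtype=float)
    return Q.shape[0] == Q.shape[1] and np.allclose(Q.T @ Q, np.eye(Q.shape[0]), atol=tol)

# A reflection across the line y = x: orthogonal, with determinant -1.
print(is_orthogonal([[0, 1], [1, 0]]))   # True
# Columns are perpendicular but not unit length, so this is NOT orthogonal:
print(is_orthogonal([[2, 0], [0, 2]]))   # False
[/code]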
