Regular representations of finite dimensional algebras


Discussion Overview

The discussion revolves around the concept of regular representations of finite-dimensional algebras, particularly focusing on examples and clarifications related to the algebra of complex numbers as a two-dimensional algebra over the reals. Participants explore the implications of linear transformations, matrix representations, and specific exercises from a textbook on ring theory.

Discussion Character

  • Exploratory
  • Technical explanation
  • Homework-related

Main Points Raised

  • One participant seeks a simple example of representations of algebras, specifically referencing the complex numbers as a two-dimensional algebra over the reals.
  • Another participant explains the right multiplication in the context of complex numbers and provides a matrix representation corresponding to this operation.
  • There is a question about how a specific mapping $$\rho \ : \ A \to k_n$$ arises from the linear transformation defined by right multiplication in a finite-dimensional algebra.
  • A participant discusses the standard method for associating a matrix with a linear transformation and notes a potential confusion regarding the representation of the matrix.
  • Clarifications are made regarding the notation and the construction of the matrix representation, including a note about the orientation of rows and columns in the matrix.

Areas of Agreement / Disagreement

Participants express varying degrees of understanding regarding the examples and the notation used in the text. Some participants provide clarifications, while others raise questions about the implications of the definitions and representations, indicating that the discussion remains unresolved on certain points.

Contextual Notes

There are mentions of potential confusion regarding the orientation of matrices in relation to left and right regular representations, as well as the need for a basis to construct the matrix representation of linear transformations.

Math Amateur
I am reading "Introduction to Ring Theory" by P. M. Cohn (Springer Undergraduate Mathematics Series)

In Chapter 2: Linear Algebras and Artinian Rings we read the following on page 57:
https://www.physicsforums.com/attachments/3149

I am trying to gain an understanding of representations. I would welcome a simple example of representations of algebras as this would help a great deal …

Further, Exercise 2.1 (4) reads as follows:

"Verify that the regular representation (2.10 - see above text) is a homomorphism of $$A$$ as a right $$A$$-module.

How is the matrix $$( \rho_{ij} (a) )$$ affected by a change of basis?"

Can someone please help me get started on this problem?

Peter
 
Peter said:
I am trying to gain an understanding of representations. I would welcome a simple example of representations of algebras as this would help a great deal …
One very simple example would be the complex numbers, considered as a two-dimensional algebra over the field of real numbers.

If we take $\{1,i\}$ as a basis for $\mathbb{C}$ then every element of $\mathbb{C}$ can be expressed in the form $x+iy$ with $x,y \in\mathbb{R}$. Given an element $a+ib \in\mathbb{C}$, the right multiplication $\rho_{a+ib}$ is the map $x+iy \mapsto (x+iy)(a+ib) = (xa-yb) + i(xb+ya)$.

Now use the basis $\{1,i\}$ to represent the complex number $x+iy$ by the vector $[x,y]$. Then the equation $\rho_{a+ib}(x+iy) = (xa-yb) + i(xb+ya)$ corresponds to the matrix equation $$[xa-yb,\,xb+ya] = [x,\,y]\begin{bmatrix}a&b\\ -b&a \end{bmatrix}.$$ Thus the right regular representation of $\mathbb{C}$ (with respect to the basis $\{1,i\}$) takes the complex number $a+ib$ to the $2\times2$ matrix $\begin{bmatrix}a&b\\ -b&a \end{bmatrix}.$

To verify that the regular representation is a multiplicative mapping in this example, notice that the product of complex numbers $(a+ib)(c+id) = (ac-bd) + i(ad+bc)$ corresponds to the matrix equation $\begin{bmatrix}a&b\\ -b&a \end{bmatrix} \begin{bmatrix}c&d\\ -d&c \end{bmatrix} = \begin{bmatrix}ac -bd&ad+bc\\ -(ad+bc)&ac-bd \end{bmatrix} .$
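
For anyone who wants to check this numerically, here is a small sketch (my addition, not part of the post above; it assumes NumPy is available). It builds the matrix $\begin{bmatrix}a&b\\ -b&a \end{bmatrix}$ for a complex number $a+ib$ and verifies both the row-vector equation and the multiplicativity claim.

```python
import numpy as np

def rho(z):
    """Right regular representation of C over R: basis {1, i}, row-vector convention."""
    a, b = z.real, z.imag
    return np.array([[ a, b],
                     [-b, a]])

z, w = 2 + 3j, -1 + 4j

# Multiplicativity: the matrix of zw equals the product of the matrices of z and w.
assert np.allclose(rho(z * w), rho(z) @ rho(w))

# Right multiplication as "row vector times matrix": [x, y] rho(z) gives the
# coordinates of (x + iy)z in the basis {1, i}.
x = 5 - 2j
assert np.allclose(np.array([x.real, x.imag]) @ rho(z),
                   [(x * z).real, (x * z).imag])

print("both checks pass")
```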
 
Opalg said:
One very simple example would be the complex numbers, considered as a two-dimensional algebra over the field of real numbers.

If we take $\{1,i\}$ as a basis for $\mathbb{C}$ then every element of $\mathbb{C}$ can be expressed in the form $x+iy$ with $x,y \in\mathbb{R}$. Given an element $a+ib \in\mathbb{C}$, the right multiplication $\rho_{a+ib}$ is the map $x+iy \mapsto (x+iy)(a+ib) = (xa-yb) + i(xb+ya)$.

Now use the basis $\{1,i\}$ to represent the complex number $x+iy$ by the vector $[x,y]$. Then the equation $\rho_{a+ib}(x+iy) = (xa-yb) + i(xb+ya)$ corresponds to the matrix equation $$[xa-yb,\,xb+ya] = [x,\,y]\begin{bmatrix}a&b\\ -b&a \end{bmatrix}.$$ Thus the right regular representation of $\mathbb{C}$ (with respect to the basis $\{1,i\}$) takes the complex number $a+ib$ to the $2\times2$ matrix $\begin{bmatrix}a&b\\ -b&a \end{bmatrix}.$

To verify that the regular representation is a multiplicative mapping in this example, notice that the product of complex numbers $(a+ib)(c+id) = (ac-bd) + i(ad+bc)$ corresponds to the matrix equation $\begin{bmatrix}a&b\\ -b&a \end{bmatrix} \begin{bmatrix}c&d\\ -d&c \end{bmatrix} = \begin{bmatrix}ac -bd&ad+bc\\ -(ad+bc)&ac-bd \end{bmatrix} .$

Thanks so much for the helpful example, Opalg ...

Just working through the detail now …

Thanks again … examples are so helpful!

Peter
 
Peter said:
Thanks so much for the helpful example, Opalg ...

Just working through the detail now …

Thanks again … examples are so helpful!

Peter
A further question I have regarding representations is as follows:

In the text above we find the following text:

"Given a finite-dimensional algebra $$A$$ over a field $$k$$, of dimension $$n$$, say, consider the right multiplication by an element $$a \in A$$:

$$\rho_a \ : \ x \mapsto xa $$ where $$x \in A$$ … … … (2.10)

This is a linear transformation of $$A$$ and so can be represented by an $$n \times n$$ matrix over $$k$$. Thus we have a mapping $$\rho \ : \ A \to k_n$$, and this is easily seen to be a homomorphism. … … "
I cannot see how (2.10) leads to a mapping of the form $$\rho \ : \ A \to k_n$$?

Can someone please explain how this follows? (I am not sure I really understand the notation!)

Peter
 
Peter said:
A further question I have regarding representations is as follows:

In the text above we find the following text:

"Given a finite-dimensional algebra $$A$$ over a field $$k$$, of dimension $$n$$, say, consider the right multiplication by an element $$a \in A$$:

$$\rho_a \ : \ x \mapsto xa $$ where $$x \in A$$ … … … (2.10)

This is a linear transformation of $$A$$ and so can be represented by an $$n \times n$$ matrix over $$k$$. Thus we have a mapping $$\rho \ : \ A \to k_n$$, and this is easily seen to be a homomorphism. … … "
I cannot see how (2.10) leads to a mapping of the form $$\rho \ : \ A \to k_n$$?

Can someone please explain how this follows? (I am not sure I really understand the notation!)

Peter
I assume that $k_n$ is Cohn's notation for the set of $n\times n$ matrices over $k$. The procedure for representing the map $\rho_a$ by a matrix is just the standard method for associating a matrix with a linear transformation of a finite-dimensional vector space.

Given a linear transformation $T: V\to V$ from an $n$-dimensional vector space $V$ to itself, you first choose a basis $\{e_1,e_2,\ldots,e_n\}$ for $V$. Then you express each of the vectors $Te_1,Te_2,\ldots,Te_n$ as a linear combination of $e_1,\ldots,e_n$, and you use the coefficients in those combinations to form the rows (or is it the columns? – see Note below) of an $n\times n$ matrix. That is the procedure that Cohn is using to construct the regular representation of $A$.

In that extract from Cohn's book, the need to use a basis for $A$ in order to construct the matrix representation $\rho$ is explained in the paragraph leading up to equation (2.12).

[Note. In my previous comment, I initially had the rows and columns of the matrix $\begin{bmatrix}a&b\\ -b&a \end{bmatrix}$ the wrong way round (I have changed it now). My excuse is that as an analyst I am used to using the left regular representation, which in this example leads to the transpose matrix. Algebraists traditionally prefer the right regular representation (as Cohn does), which I find confusing.]
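
As a concrete illustration of that procedure (my own sketch, not from Cohn or the post above), the following code builds the matrix of $\rho_a$ row by row, exactly as described: take each basis vector $e_i$, compute $e_i a$, and write its coordinates as the $i$-th row. The helper names (`regular_rep_matrix`, `mult`, `coords`) are just for this example; it is checked on $\mathbb{C}$ over $\mathbb{R}$ with the basis $\{1,i\}$.

```python
import numpy as np

def regular_rep_matrix(a, basis, mult, coords):
    """Matrix of rho_a : x -> x*a; the i-th row holds the coordinates of e_i * a."""
    return np.array([coords(mult(e, a)) for e in basis])

# C as a two-dimensional algebra over R, basis {1, i}.
basis  = [1 + 0j, 1j]
mult   = lambda x, y: x * y
coords = lambda z: [z.real, z.imag]   # x + iy  <->  (x, y)

print(regular_rep_matrix(3 + 4j, basis, mult, coords))
# [[ 3.  4.]
#  [-4.  3.]]   i.e. [[a, b], [-b, a]] with a = 3, b = 4, as in the example above
```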
 
Opalg said:
One very simple example would be the complex numbers, considered as a two-dimensional algebra over the field of real numbers.

If we take $\{1,i\}$ as a basis for $\mathbb{C}$ then every element of $\mathbb{C}$ can be expressed in the form $x+iy$ with $x,y \in\mathbb{R}$. Given an element $a+ib \in\mathbb{C}$, the right multiplication $\rho_{a+ib}$ is the map $x+iy \mapsto (x+iy)(a+ib) = (xa-yb) + i(xb+ya)$.

Now use the basis $\{1,i\}$ to represent the complex number $x+iy$ by the vector $[x,y]$. Then the equation $\rho_{a+ib}(x+iy) = (xa-yb) + i(xb+ya)$ corresponds to the matrix equation $$[xa-yb,\,xb+ya] = [x,\,y]\begin{bmatrix}a&b\\ -b&a \end{bmatrix}.$$ Thus the right regular representation of $\mathbb{C}$ (with respect to the basis $\{1,i\}$) takes the complex number $a+ib$ to the $2\times2$ matrix $\begin{bmatrix}a&b\\ -b&a \end{bmatrix}.$

To verify that the regular representation is a multiplicative mapping in this example, notice that the product of complex numbers $(a+ib)(c+id) = (ac-bd) + i(ad+bc)$ corresponds to the matrix equation $\begin{bmatrix}a&b\\ -b&a \end{bmatrix} \begin{bmatrix}c&d\\ -d&c \end{bmatrix} = \begin{bmatrix}ac -bd&ad+bc\\ -(ad+bc)&ac-bd \end{bmatrix} .$

Hi Opalg,

Just a clarification … …

In your post above you write:

" … … Thus the right regular representation of $\mathbb{C}$ (with respect to the basis $\{1,i\}$) takes the complex number $a+ib$ to the $2\times2$ matrix $\begin{bmatrix}a&b\\ -b&a \end{bmatrix}.$ … … … "

However, Cohn writes that the right multiplication $$\rho_a$$ is called a regular representation and $$\rho_a$$ maps $$x$$ to $$xa$$, not $$a$$ to a matrix … can you clarify?

Peter
 
Opalg said:
I assume that $k_n$ is Cohn's notation for the set of $n\times n$ matrices over $k$. The procedure for representing the map $\rho_a$ by a matrix is just the standard method for associating a matrix with a linear transformation of a finite-dimensional vector space.

Given a linear transformation $T: V\to V$ from an $n$-dimensional vector space $V$ to itself, you first choose a basis $\{e_1,e_2,\ldots,e_n\}$ for $V$. Then you express each of the vectors $Te_1,Te_2,\ldots,Te_n$ as a linear combination of $e_1,\ldots,e_n$, and you use the coefficients in those combinations to form the rows (or is it the columns? – see Note below) of an $n\times n$ matrix. That is the procedure that Cohn is using to construct the regular representation of $A$.

In that extract from Cohn's book, the need to use a basis for $A$ in order to construct the matrix representation $\rho$ is explained in the paragraph leading up to equation (2.12).

[Note. In my previous comment, I initially had the rows and columns of the matrix $\begin{bmatrix}a&b\\ -b&a \end{bmatrix}$ the wrong way round (I have changed it now). My excuse is that as an analyst I am used to using the left regular representation, which in this example leads to the transpose matrix. Algebraists traditionally prefer the right regular representation (as Cohn does), which I find confusing.]
Hi Opalg … thanks for the help … …

You write:

"I assume that $k_n$ is Cohn's notation for the set of $n\times n$ matrices over $k$. The procedure for representing the map $\rho_a$ by a matrix is just the standard method for associating a matrix with a linear transformation of a finite-dimensional vector space."

Indeed … I checked carefully in Cohn's text and you are correct … …

Just reflecting on the rest of your post, now … …

Peter
 
Peter said:
Hi Opalg,

Just a clarification … …

In your post above you write:

" … … Thus the right regular representation of $\mathbb{C}$ (with respect to the basis $\{1,i\}$) takes the complex number $a+ib$ to the $2\times2$ matrix $\begin{bmatrix}a&b\\ -b&a \end{bmatrix}.$ … … … "

However, Cohn writes that the right multiplication $$\rho_a$$ is called a regular representation and $$\rho_a$$ maps $$x$$ to $$xa$$, not $$a$$ to a matrix … can you clarify?

Peter
It's true that the terminology gets a bit slippery here. The map $a\mapsto \rho_a$ takes $a\in A$ to $\rho_a$, which is a linear transformation of $A$. However, a linear transformation (of a finite-dimensional space) is often described by a matrix, and Cohn goes on to say "Such a homomorphism into a full matrix ring is called a matrix representation or simply a representation of $A$". So at that stage Cohn is identifying the linear transformation $\rho_a$ with its associated matrix. More precisely, he is defining a map $\rho:A\to k_n$ by saying that $\rho(a)$ is the matrix associated with the linear transformation $\rho_a$.
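
To spell out the "easily seen to be a homomorphism" step in the quoted passage, here is one way to see why the map $\rho : A \to k_n$ is multiplicative (a sketch of the standard argument, not a quotation from Cohn). Write $[x]$ for the coordinate row of $x \in A$ in the chosen basis, so that $[\rho_a(x)] = [x]\,\rho(a)$ by the definition of the matrix of a linear transformation in the row-vector convention. Since $\rho_{ab}(x) = x(ab) = (xa)b = \rho_b(\rho_a(x))$, we get, for every $x \in A$,
$$[x]\,\rho(ab) = [\rho_{ab}(x)] = [\rho_b(\rho_a(x))] = [\rho_a(x)]\,\rho(b) = [x]\,\rho(a)\,\rho(b),$$
and hence $\rho(ab) = \rho(a)\rho(b)$. Linearity of $\rho$ follows in the same way from $\rho_{\lambda a + b} = \lambda\rho_a + \rho_b$.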
 
