MHB Understanding One-Sided Ideals in Mathematics

  • Thread starter: cbarker1
Summary
The discussion centers on one-sided ideals in the ring of matrices over a commutative ring. Specifically, it examines the set \( L_j \) of \( n \times n \) matrices with arbitrary entries in the \( j \)th column and zeroes elsewhere, showing that \( L_j \) is a left ideal of \( M_n(R) \) but not a right ideal. The proof shows that the product \( TA \) of any matrix \( T \in M_n(R) \) with a matrix \( A \in L_j \) remains in \( L_j \), while a specific example shows that \( AT \) need not belong to \( L_1 \). The thread also answers a request to see the matrices written out explicitly in symbols. Overall, the example illustrates the distinction between left and right ideals in matrix algebra.
cbarker1
Dear Everyone,

I am reading Abstract Algebra by Dummit and Foote, and I am confused by this example of one-sided ideals. Here is the example:
Let $R$ be a commutative ring with $1 \ne 0$ and let \( n\in \mathbb{Z} \) with $n\ge 2$. For each $j\in \{1,2,\dots, n\}$, let $L_j$ be the set of all $n \times n$ matrices in $M_n(R)$ with arbitrary entries in the $j$th column and zeroes in all other columns. It is clear that $L_j$ is closed under subtraction. It follows directly from the definition of matrix multiplication that for any matrix $T \in M_n(R)$ and any $A \in L_j$, the product $TA$ has zero entries in the $i$th column for all $i\ne j$. This shows $L_j$ is a left ideal of $M_n(R)$. However, $L_j$ is not a right ideal.

What does this example look like with math symbols?

Thanks,
Cbarker1
 
Hi Cbarker1,

It's not entirely clear to me what your question is exactly. If you mean you'd like to see verification of the left ideal/non-right ideal claim, that would look something like this.

Proof that $L_{j}$ is a Left Ideal
Let $A\in L_{j},$ $T\in M_{n}(R),$ and let $a_{j}$ be the $j$th column of $A$. Then $$TA = T\left[\begin{array}{c|c|c|c|c} 0 & \ldots & a_{j} & \ldots & 0 \end{array}\right] = \left[\begin{array}{c|c|c|c|c} 0 & \ldots & Ta_{j} & \ldots & 0 \end{array}\right],$$ which shows that $TA\in L_{j}$. Since $L_{j}$ is also closed under subtraction, $L_{j}$ is a left ideal of $M_{n}(R).$
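To make this concrete (a small instance of my own, not taken from Dummit and Foote): with $n = 2$ and $j = 1$, $$TA = \begin{bmatrix} t_{11} & t_{12}\\ t_{21} & t_{22}\end{bmatrix}\begin{bmatrix} a_{11} & 0\\ a_{21} & 0\end{bmatrix} = \begin{bmatrix} t_{11}a_{11}+t_{12}a_{21} & 0\\ t_{21}a_{11}+t_{22}a_{21} & 0\end{bmatrix} \in L_{1},$$ so left multiplication only mixes entries within the first column and never produces a nonzero entry outside it.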

Proof that $L_{1}$ is not a Right Ideal
Let $a_{1} = \begin{bmatrix}1\\ 0\\ \vdots\\ 0 \end{bmatrix},$ $A = \left[\begin{array}{c|c|c|c} a_{1} & 0 &\ldots & 0 \end{array} \right],$ and $T = \left[\begin{array}{c|c|c|c|c} a_{1} & a_{1} & 0 &\ldots & 0 \end{array} \right].$ Then $AT = T\notin L_{1}.$ Hence, $L_{1}$ is not a right ideal of $M_{n}(R).$ This example can be generalized to any $j$, if desired.
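For $n = 2$ this counterexample reads (again, a concrete instance I am adding for illustration): $$AT = \begin{bmatrix} 1 & 0\\ 0 & 0\end{bmatrix}\begin{bmatrix} 1 & 1\\ 0 & 0\end{bmatrix} = \begin{bmatrix} 1 & 1\\ 0 & 0\end{bmatrix} = T,$$ which has a nonzero entry in the second column, so $AT \notin L_{1}$.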
 
I am trying to see the symbols written out, like this: \[ \begin{pmatrix} 1 & 2 & 3\\ a & b & c \end{pmatrix} \].
 
OK, let's see if this helps.

A matrix in $L_{j}$ would be written as $$A = \begin{bmatrix} 0 & \ldots & a_{1j} & \ldots & 0\\ 0 & \ldots & a_{2j} & \ldots & 0\\ \vdots & \ddots & \vdots & \ddots & \vdots\\ 0 & \ldots & a_{nj} & \ldots & 0\end{bmatrix},$$ and a matrix $T\in M_{n}(R)$ would have the form $$T = \begin{bmatrix} t_{11} & \ldots & t_{1j} & \ldots & t_{1n}\\ t_{21} & \ldots & t_{2j} & \ldots & t_{2n}\\ \vdots & \ddots & \vdots & \ddots & \vdots\\ t_{n1} & \ldots & t_{nj} & \ldots & t_{nn}\end{bmatrix},$$ where all the $a$'s and $t$'s are elements of $R$. Using these forms for $A$ and $T$, the product $TA$ would be $$TA = \begin{bmatrix}0 & \ldots & t_{11}a_{1j} + t_{12}a_{2j}+ \ldots + t_{1n}a_{nj} & \ldots & 0\\ 0 & \ldots & \vdots & \ldots & 0\\ \vdots & \ddots & \vdots & \ddots & \vdots\\ 0 & \ldots & \vdots & \ldots & 0 \end{bmatrix},$$ where I have left rows $2$ through $n$ of the $j$th column of $TA$ for you to fill in yourself.
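If it helps, here is a quick numeric sanity check of both claims for $n = 2$ over $\mathbb{Z}$. This is a sketch I am adding, not part of the book's example; the helper name `in_L1` is my own.

```python
# Sanity check for n = 2 over the integers: L_1 absorbs left
# multiplication but not right multiplication.
import numpy as np

def in_L1(M):
    # A matrix lies in L_1 exactly when every column past the first is zero.
    return not M[:, 1:].any()

A = np.array([[1, 0],
              [0, 0]])  # A in L_1: arbitrary first column, zeros elsewhere
T = np.array([[1, 1],
              [0, 0]])  # T = [a_1 | a_1] from the counterexample above

print(in_L1(T @ A))  # True:  TA stays in L_1, consistent with a left ideal
print(in_L1(A @ T))  # False: AT = T has a nonzero second column
```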

Does this answer your question?
 
Yes.
 