Prove the theorem for the matrix

In summary, every square real matrix X can be written in a unique way as the sum of a symmetric matrix A and a skew-symmetric matrix B. Symmetry of A and skew-symmetry of B are verified by taking transposes, and uniqueness is proved by contradiction.
  • #1
frostshoxx

Homework Statement



Prove that every square real matrix X can be written in a unique way as the sum of a symmetric matrix A and a skew-symmetric matrix B.

Homework Equations



X = A + B
A = [tex]\frac{X+X^{T}}{2}[/tex]
B = [tex]\frac{X-X^{T}}{2}[/tex]
X = [tex]\frac{X+X^{T}}{2}[/tex] + [tex]\frac{X-X^{T}}{2}[/tex]
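Not a proof, but as a quick numerical sanity check of these formulas, one could verify them for a randomly chosen X with numpy (a minimal sketch; the size 4 is arbitrary):

[code]
import numpy as np

# Quick numerical sanity check of the decomposition formulas (not a proof).
X = np.random.rand(4, 4)      # an arbitrary square real matrix
A = (X + X.T) / 2             # candidate symmetric part
B = (X - X.T) / 2             # candidate skew-symmetric part

print(np.allclose(A + B, X))  # True: A + B recovers X
print(np.allclose(A, A.T))    # True: A is symmetric
print(np.allclose(B, -B.T))   # True: B is skew-symmetric
[/code]

Running it prints True three times (up to floating-point tolerance) for any real square X.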


The Attempt at a Solution



So I tried simplifying [tex]\frac{X+X^{T}}{2}[/tex] + [tex]\frac{X-X^{T}}{2}[/tex] and it does reduce to X. However, how can I know that A is symmetric and B is skew-symmetric? Any idea?
 
  • #2
Take the transpose of A and the transpose of B. You also need to prove uniqueness, which I would do by contradiction.
 
  • #3
Can this be done symbolically? Also, what do you mean by contradiction? Could you give some examples?

Thank you for your time.
 
  • #4
Yes, why not? If you take the transpose of A, you get A back, so A is symmetric; and the transpose of B is -B because of the negative sign, so B is skew-symmetric.
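Spelled out symbolically, using [tex](Y+Z)^{T}=Y^{T}+Z^{T}[/tex] and [tex](X^{T})^{T}=X[/tex]:

[tex]A^{T} = \left(\frac{X+X^{T}}{2}\right)^{T} = \frac{X^{T}+X}{2} = A[/tex]

[tex]B^{T} = \left(\frac{X-X^{T}}{2}\right)^{T} = \frac{X^{T}-X}{2} = -B[/tex]

so A is symmetric and B is skew-symmetric.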

Example of uniqueness: let e be a real number such that [tex]a \cdot a^{-1}=e[/tex] and [tex]a\cdot e=a \quad \forall a \in \mathbb{R}[/tex]. Then e is unique.

Proof:
Fix a real number a and assume e is not unique: then [tex] a\cdot e=a[/tex] and [tex] a\cdot e'=a[/tex] for some [tex]e\neq e'[/tex] (and similarly for the inverses). Now
[tex] a\cdot e \cdot e'=a \cdot e'=a[/tex]
and multiplying through by [tex]a^{-1}[/tex] gives [tex] e \cdot e'=e[/tex] and [tex] e \cdot e'=e'[/tex],
thus [tex]e=e'[/tex], which contradicts the assumption, so e must be unique.
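The same contradiction strategy carries over to the matrix statement (a sketch, writing [tex]A', B'[/tex] for a hypothetical second decomposition): suppose
[tex]X = A + B = A' + B'[/tex]
with [tex]A, A'[/tex] symmetric, [tex]B, B'[/tex] skew-symmetric, and [tex](A,B)\neq(A',B')[/tex]. Then [tex]A - A' = B' - B[/tex]. The left side is symmetric, the right side is skew-symmetric, and a matrix M that is both satisfies [tex]M = M^{T} = -M[/tex], hence [tex]M = 0[/tex]. So [tex]A = A'[/tex] and [tex]B = B'[/tex], contradicting the assumption.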

Hope that helps
 

1. What is a matrix?

A matrix is a rectangular array of numbers or symbols arranged in rows and columns. It is commonly used in mathematics, physics, and computer science to represent linear transformations and systems of equations.

2. What is a theorem for a matrix?

A theorem for a matrix is a mathematical statement that has been proven to hold for all matrices, or for a specified class of matrices. It is a fundamental result that can be used to solve problems involving matrices.

3. How do you prove a theorem for a matrix?

To prove a theorem for a matrix, you need to use logical reasoning and mathematical techniques such as induction, contradiction, or direct proof. It is important to understand the properties and operations of matrices to successfully prove a theorem.

4. Why is it important to prove theorems for matrices?

Proving theorems for matrices is important because it allows us to understand the behavior and properties of matrices. It also helps us to apply matrix operations and transformations in practical applications such as computer graphics, quantum mechanics, and data analysis.

5. What are some common theorems for matrices?

Some common theorems for matrices include the determinant theorem, inverse matrix theorem, eigenvalue theorem, and rank-nullity theorem. These theorems are essential in solving problems involving matrix operations, transformations, and systems of linear equations.
