Is A Skew Symmetric?

SUMMARY

The discussion centers on proving that a real matrix A is skew-symmetric, specifically that A^t = -A, given the condition <Ax, x> = 0 for all x in R^n. The proof involves expressing <Ax, x> in terms of the matrix components a_{ij} and manipulating the resulting equations. Key steps include substituting specific vectors and demonstrating that a_{ji} = -a_{ij} for all indices i and j, confirming the skew-symmetry of matrix A.

PREREQUISITES
  • Understanding of scalar products in R^n
  • Familiarity with matrix transposition and properties
  • Knowledge of linear algebra concepts, specifically skew-symmetric matrices
  • Ability to manipulate summations and indices in mathematical proofs
NEXT STEPS
  • Study the properties of skew-symmetric matrices in linear algebra
  • Learn about the implications of the adjoint operator in matrix theory
  • Explore the relationship between eigenvalues and skew-symmetric matrices
  • Investigate applications of skew-symmetric matrices in physics and engineering
USEFUL FOR

Mathematicians, students of linear algebra, and anyone interested in the properties of matrices and their applications in various fields.

TTob
Let A be an n x n real matrix.
For every x in R^n we have <Ax, x> = 0, where < , > is the scalar product.
Prove that A^t = -A (A is a skew-symmetric matrix).
 
TTob said:
Let A be an n x n real matrix.
For every x in R^n we have <Ax, x> = 0, where < , > is the scalar product.
Prove that A^t = -A (A is a skew-symmetric matrix).

Hi TTob! :smile:

Hint: just write out <Ax,x>=0 in terms of Aij etc …

then jiggle it around a bit! :smile:

What do you get?
 
Thank you.

note x=(x_1,...,x_n) and A=(a_{ij}).
then (Ax)_i = \sum_{1 \leq j \leq n} a_{ij} x_j
then
<Ax,x> = \sum_{\substack{1 \leq i \leq n \\ 1 \leq j \leq n}} a_{ij} x_i x_j = 0

When you put x = e_i you get a_{ii} = 0 for all i.
When you put x = e_i + e_j you get a_{ii} + a_{ij} + a_{ji} + a_{jj} = 0 for all i, j.

Since a_{ii} = a_{jj} = 0, it follows that a_{ji} = -a_{ij} for all i, j, and hence A^t = -A.
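As a sanity check (not part of the proof), the substitution argument can be verified numerically for a sample skew-symmetric matrix; the helper names `dot`, `matvec`, and `quad` below are illustrative, not from the thread:

```python
# Verify the test vectors used in the proof on a small skew-symmetric matrix.

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def matvec(A, x):
    return [dot(row, x) for row in A]

def quad(A, x):
    """The quadratic form <Ax, x>."""
    return dot(matvec(A, x), x)

# A sample skew-symmetric matrix (A^t = -A).
A = [[0, 2, -1],
     [-2, 0, 3],
     [1, -3, 0]]

n = len(A)
e = [[1 if i == j else 0 for j in range(n)] for i in range(n)]

# x = e_i gives <Ax, x> = a_ii, which must vanish.
assert all(quad(A, e[i]) == 0 for i in range(n))

# x = e_i + e_j gives <Ax, x> = a_ii + a_ij + a_ji + a_jj, which must vanish.
for i in range(n):
    for j in range(n):
        x = [e[i][k] + e[j][k] for k in range(n)]
        assert quad(A, x) == A[i][i] + A[i][j] + A[j][i] + A[j][j] == 0
```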
 
Hi TTob! :smile:

Yes … or if you want to avoid using components …

0 = <A(x+y), x+y> - <Ax, x> - <Ay, y> = <Ax, y> + <Ay, x> = <(A + A^T)x, y> :smile:
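The component-free identity above can also be checked numerically for an arbitrary integer matrix (a quick sketch; the names are illustrative, and exact integer arithmetic avoids rounding issues):

```python
# Check: <A(x+y), x+y> - <Ax, x> - <Ay, y> == <(A + A^T)x, y> for arbitrary A.
import random

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def matvec(A, x):
    return [dot(row, x) for row in A]

n = 4
A = [[random.randint(-5, 5) for _ in range(n)] for _ in range(n)]
x = [random.randint(-5, 5) for _ in range(n)]
y = [random.randint(-5, 5) for _ in range(n)]

xy = [a + b for a, b in zip(x, y)]
lhs = dot(matvec(A, xy), xy) - dot(matvec(A, x), x) - dot(matvec(A, y), y)

# (A + A^T) applied to x, then dotted with y.
A_plus_AT = [[A[i][j] + A[j][i] for j in range(n)] for i in range(n)]
rhs = dot(matvec(A_plus_AT, x), y)

assert lhs == rhs
```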
 
Or use the fact that <Ax, y> = <x, A^T y>. (That is a more general definition of "adjoint".)

From that, <Ax, x> = <x, A^T x>. If A is skew-symmetric, <x, A^T x> = -<Ax, x>, so <Ax, x> = -<Ax, x> and <Ax, x> = 0.
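The adjoint identity <Ax, y> = <x, A^T y> used here is easy to confirm numerically on sample integer data (a minimal sketch with illustrative helper names):

```python
# Confirm <Ax, y> == <x, A^T y> for a sample matrix and vectors.

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def matvec(A, x):
    return [dot(row, x) for row in A]

def transpose(A):
    return [list(col) for col in zip(*A)]

A = [[1, 2, 0],
     [4, -1, 3],
     [0, 5, 2]]
x = [1, -2, 3]
y = [2, 0, -1]

assert dot(matvec(A, x), y) == dot(x, matvec(transpose(A), y))
```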
 
HallsofIvy said:
… <Ax, x> = <x, A^T x>. If A is skew-symmetric, <x, A^T x> = -<Ax, x>, so <Ax, x> = -<Ax, x> and <Ax, x> = 0.

Hi HallsofIvy! :smile:

erm …
I think you've proved the transpose of the original theorem! :wink:
 
