Is A Skew Symmetric?


Discussion Overview

The discussion revolves around proving that a real n x n matrix A is skew symmetric, specifically under the condition that <Ax,x> = 0 for every vector x in R^n. The focus is on mathematical reasoning and proofs related to linear algebra properties of matrices.

Discussion Character

  • Mathematical reasoning
  • Technical explanation
  • Exploratory

Main Points Raised

  • Participants work from the hypothesis that <Ax,x> = 0 for all x in R^n to conclude that A must be skew symmetric, i.e. A^t = -A.
  • One participant suggests writing out <Ax,x> in terms of matrix components a_{ij} and manipulating the expression to derive conditions on the elements of A.
  • Another participant mentions an alternative approach that avoids using components, suggesting to analyze the expression <A(x+y), x+y> to derive properties of A.
  • There is a reference to the adjoint definition, where <Ax, y> = <x, A^t y> is used to relate properties of A and its transpose.
  • One participant humorously notes that a previous argument may have inadvertently proven the transpose of the original theorem instead of the intended result.

Areas of Agreement / Disagreement

Participants generally agree on the approach to proving that A is skew symmetric, but there are multiple methods proposed, and the discussion remains exploratory without a definitive consensus on the preferred method.

Contextual Notes

Some steps in the mathematical reasoning are not fully resolved, and assumptions about the properties of the scalar product and matrix elements are implicit in the discussion.

TTob
Let A be an n x n real matrix.
For every x in R^n we have <Ax,x>=0, where < , > is the scalar product.
Prove that A^t=-A (A is a skew symmetric matrix).
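As a quick numerical illustration of the statement (a plain-Python sketch; the skew-symmetric matrix below is an arbitrary example, not from the thread):

```python
# Sanity check: for a skew-symmetric A (A^t = -A), <Ax, x> = 0 for every x.
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def matvec(A, x):
    return [dot(row, x) for row in A]

# An arbitrary 3x3 skew-symmetric example: zero diagonal, a_ji = -a_ij.
A = [[0, 2, -1],
     [-2, 0, 3],
     [1, -3, 0]]

for x in ([1, 0, 0], [1, 1, 0], [2, -1, 5]):
    print(dot(matvec(A, x), x))  # each prints 0
```

The thread's question is the converse: if <Ax,x> vanishes for *every* x, must A be skew symmetric?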
 
TTob said:
Let A be an n x n real matrix.
For every x in R^n we have <Ax,x>=0, where < , > is the scalar product.
Prove that A^t=-A (A is a skew symmetric matrix).

Hi TTob! :smile:

Hint: just write out <Ax,x>=0 in terms of a_{ij} etc …

then jiggle it around a bit! :smile:

What do you get?
 
Thank you.

note x=(x_1,...,x_n) and A=(a_{ij}).
then (Ax)_i = \sum_{j=1}^{n} a_{ij}x_j
then
<Ax,x> = \sum_{i=1}^{n} \sum_{j=1}^{n} a_{ij} x_i x_j = 0

when you put x=e_i you get a_{ii}=0 for all i.
when you put x=e_i+e_j you get a_{ii}+a_{ij}+a_{ji}+a_{jj}=0 for all i,j .

so, since a_{ii}=a_{jj}=0, we get a_{ij}+a_{ji}=0, i.e. a_{ji}=-a_{ij} for all i,j, and then A^t=-A.
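The probing step above can be checked numerically: for any matrix, plugging e_i into the quadratic form q(x) = <Ax,x> picks out a_{ii}, and plugging in e_i+e_j picks out a_{ii}+a_{ij}+a_{ji}+a_{jj}, so q ≡ 0 forces exactly the constraints used. A plain-Python sketch (the matrix is an arbitrary example):

```python
# The quadratic form q(x) = <Ax, x>, written out with components.
def q(A, x):
    n = len(A)
    return sum(A[i][j] * x[i] * x[j] for i in range(n) for j in range(n))

def e(i, n):
    # Standard basis vector e_i of R^n (0-indexed here).
    return [1 if k == i else 0 for k in range(n)]

A = [[1, 4, -2],
     [7, 0, 3],
     [5, -6, 2]]
n = 3

# q(e_i) picks out the diagonal entry a_ii ...
assert all(q(A, e(i, n)) == A[i][i] for i in range(n))

# ... and q(e_i + e_j) - q(e_i) - q(e_j) picks out a_ij + a_ji,
# so q == 0 everywhere forces a_ii = 0 and a_ij + a_ji = 0.
x = [a + b for a, b in zip(e(0, n), e(1, n))]
print(q(A, x) - q(A, e(0, n)) - q(A, e(1, n)))  # prints a_01 + a_10 = 4 + 7 = 11
```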
 
Hi TTob! :smile:

Yes … or if you want to avoid using components …

0\ =\ <A(x+y),x+y>\ -\ <Ax,x>\ -\ <Ay,y>\ \ =\ \ <Ax,y>\ +\ <Ay,x>\ \ =\ \ <(A + A^T)x,y> :smile:
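This polarization identity can also be checked numerically, using <Ay,x> = <A^T x, y> to collapse the middle expression. A plain-Python sketch (the matrix and vectors are arbitrary examples):

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def matvec(A, x):
    return [dot(row, x) for row in A]

A = [[1, 4, -2],
     [7, 0, 3],
     [5, -6, 2]]
x, y = [1, 2, 3], [-1, 0, 4]

# Left side: <A(x+y), x+y> - <Ax, x> - <Ay, y>.
xy = [a + b for a, b in zip(x, y)]
lhs = dot(matvec(A, xy), xy) - dot(matvec(A, x), x) - dot(matvec(A, y), y)

# Right side: <Ax, y> + <Ay, x> collapses to <(A + A^T)x, y>.
AplusAT = [[A[i][j] + A[j][i] for j in range(3)] for i in range(3)]
rhs = dot(matvec(AplusAT, x), y)
print(lhs == rhs)  # prints True
```

Since y is arbitrary, <(A + A^T)x, y> = 0 for all x, y forces A + A^T = 0, which is the claim.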
 
Or use the fact that <Ax, y> = <x, A^T y>. (That is a more general definition of "adjoint".)

From that, <Ax, x> = <x, A^T x>. If A is skew-symmetric, <x, A^T x> = -<x, Ax> = -<Ax, x>, so <Ax, x> = -<Ax, x> and <Ax, x> = 0.
 
HallsofIvy said:
… <Ax, x> = <x, A^T x>. If A is skew-symmetric, <x, A^T x> = -<Ax, x>, so <Ax, x> = -<Ax, x> and <Ax, x> = 0.

Hi HallsofIvy! :smile:

erm …
I think you've proved the transpose of the original theorem! :wink:
 
