Exploring Subspaces in Matrices: Symmetric vs. Skew-Symmetric

In summary, the conversation discussed the proof that the set of all 2x2 symmetric matrices is a subspace of Mat2x2(ℝ), and whether the same proof applies to skew-symmetric matrices (A^T = -A). An example involving squaring a skew-symmetric matrix was raised, but it was then pointed out that closure under matrix multiplication is not one of the subspace axioms: the zero matrix is skew-symmetric, and skew-symmetric matrices are closed under scalar multiplication and addition, so they do form a subspace. The conversation ended with the suggestion to view both sets as kernels of linear maps, since the kernel of any linear map is a subspace.
  • #1
mang733
I was able to show that the set of all 2x2 symmetric matrices is a subspace of Mat2x2(ℝ) using the three subspace axioms. However, I wasn't able to do the same with skew-symmetric matrices (A^T = -A). Can anyone help? Thanks.
 
  • #2
There's a reason you can't! If A is the matrix with rows [0 1] and [-1 0], it is skew-symmetric. What is A²?
 
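A quick numerical check of this example (a sketch using NumPy; the code is my own illustration, not from the thread):

```python
import numpy as np

# The example from post #2: A is skew-symmetric, i.e. A^T = -A
A = np.array([[0, 1],
              [-1, 0]])
assert np.array_equal(A.T, -A)

# But A squared is -I, which is symmetric rather than skew-symmetric,
# so skew-symmetric matrices are not closed under multiplication.
A2 = A @ A
assert np.array_equal(A2, -np.eye(2, dtype=int))
assert np.array_equal(A2.T, A2)          # symmetric
assert not np.array_equal(A2.T, -A2)     # not skew-symmetric
```

As the next post points out, though, closure under matrix multiplication is not one of the subspace axioms, so this does not settle the subspace question.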
  • #3
It's a subspace question, Halls; squaring doesn't matter, since closure under multiplication is not a subspace axiom.

0^T = 0 = -0, so the zero matrix is skew-symmetric.

If A^T = -A, then (kA)^T = kA^T = k(-A) = -(kA) for any scalar k.

(A+B)^T = A^T + B^T, so if A and B are skew-symmetric this is -A - B = -(A+B).

So where did your proof go wrong?
 
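The three checks above can be verified numerically (a sketch with NumPy; the specific matrices are my own examples):

```python
import numpy as np

def is_skew(M):
    """A matrix M is skew-symmetric when M^T = -M."""
    return np.allclose(M.T, -M)

A = np.array([[0.0, 2.0], [-2.0, 0.0]])
B = np.array([[0.0, -5.0], [5.0, 0.0]])

assert is_skew(np.zeros((2, 2)))   # axiom 1: the zero matrix is in the set
assert is_skew(3.7 * A)            # axiom 2: closed under scalar multiplication
assert is_skew(A + B)              # axiom 3: closed under addition
assert not is_skew(np.eye(2))      # sanity check: the identity is not skew-symmetric
```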
  • #4
The kernel of any linear map is a subspace.

Consider the map taking A to A + A^T; its kernel is exactly the skew-symmetric matrices. For the symmetric ones, consider the map taking A to A - A^T.
 
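This kernel argument can be sketched numerically as well (NumPy and the example matrices are my own illustration):

```python
import numpy as np

def f(A):
    # Linear map whose kernel is the set of skew-symmetric matrices
    return A + A.T

A = np.array([[0.0, 4.0], [-4.0, 0.0]])  # skew-symmetric
S = np.array([[1.0, 2.0], [2.0, 3.0]])   # symmetric, not skew-symmetric

assert np.allclose(f(A), 0)       # skew-symmetric matrices lie in the kernel
assert not np.allclose(f(S), 0)   # symmetric nonzero matrices do not

# Linearity of f: f(cA + B) = c*f(A) + f(B)
B = np.array([[1.0, -1.0], [0.5, 2.0]])
assert np.allclose(f(2 * A + B), 2 * f(A) + f(B))
```

Since f is linear and its kernel is a subspace, the skew-symmetric matrices form a subspace with no axiom-checking needed.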
  • #5
thanks for the help guys.
 

FAQ: Exploring Subspaces in Matrices: Symmetric vs. Skew-Symmetric

What is linear algebra?

Linear algebra is a branch of mathematics that deals with the study of linear equations, vectors, matrices, and linear transformations. It is used to solve systems of equations and model real-world phenomena.

What are the basic concepts of linear algebra?

The basic concepts of linear algebra include vector spaces, linear transformations, matrices, determinants, and eigenvalues and eigenvectors. Vector spaces are sets of objects that can be added together and multiplied by scalars. Linear transformations are functions that map vectors from one space to another while preserving addition and scalar multiplication. Matrices are rectangular arrays of numbers. Determinants are values that describe the scaling factor of a linear transformation. An eigenvector of a matrix is a nonzero vector whose direction is unchanged when multiplied by that matrix, and the corresponding eigenvalue is the factor by which it is scaled.
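The eigenvalue relation Av = λv can be checked directly (a minimal NumPy sketch; the example matrix is my own):

```python
import numpy as np

# A diagonal matrix scales the coordinate axes by 2 and 3,
# so its eigenvalues are 2 and 3.
A = np.array([[2.0, 0.0],
              [0.0, 3.0]])

vals, vecs = np.linalg.eig(A)   # columns of vecs are eigenvectors

# Each eigenvector v satisfies A v = lambda v
for lam, v in zip(vals, vecs.T):
    assert np.allclose(A @ v, lam * v)

assert np.allclose(sorted(vals), [2.0, 3.0])
```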

What are the real-life applications of linear algebra?

Linear algebra has numerous real-life applications, including computer graphics, data compression and encryption, image processing, economics and finance, physics, and statistics. It is also used in machine learning and artificial intelligence to solve problems and make predictions.

How do I solve a system of linear equations?

To solve a system of linear equations, you can use different methods such as substitution, elimination, or matrix operations. The goal is to find the values of the variables that satisfy all equations in the system. This can be done by manipulating the equations or using matrices to represent the system.
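As a concrete sketch of the matrix approach (the example system is my own; NumPy's solver is one of several ways to do this):

```python
import numpy as np

# Solve the system  2x + y = 5,  x - y = 1  as the matrix equation A v = b
A = np.array([[2.0, 1.0],
              [1.0, -1.0]])
b = np.array([5.0, 1.0])

v = np.linalg.solve(A, b)   # expected solution: x = 2, y = 1
assert np.allclose(v, [2.0, 1.0])
assert np.allclose(A @ v, b)   # the solution satisfies every equation
```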

What is the importance of linear algebra in data science?

Linear algebra is a fundamental component of data science. It is used to create and manipulate datasets, perform data analysis, and build predictive models. It provides the necessary tools and techniques to handle large and complex datasets, make sense of the data, and extract valuable insights from it.
