Matrix multiplication, Orthogonal matrix, Independent parameters

In summary, the count comes from dimension counting: the map ##A\mapsto A^TA## sends the ##n^2##-dimensional space of all matrices onto the ##n(n+1)/2##-dimensional space of symmetric matrices, and the orthogonal matrices form the fiber over the identity. Subtracting the target dimension from ##n^2## gives ##n(n-1)/2## independent parameters for an ##n\times n## orthogonal matrix.
  • #1
LagrangeEuler
Matrix multiplication is defined by
[tex](AB)_{ij}=\sum_{k}a_{ik}b_{kj}[/tex] where ##a_{ik}## and ##b_{kj}## are entries of the matrices ##A## and ##B##. In the definition of an orthogonal matrix I saw
[tex]\sum_{k=1}^n a_{ki}a_{kj}=\delta_{ij}[/tex]
This is because ##A^TA=I##. How do I know how many independent parameters there are in the case of an ##n\times n## orthogonal matrix? In other words, how many entries do you need to give me in order to determine all the others?
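A quick numerical check of that condition (a minimal NumPy sketch; the rotation matrix is just a convenient example of an orthogonal matrix, not something from the thread):
[code=python]
import numpy as np

# A 2x2 rotation matrix: the standard example of an orthogonal matrix.
theta = 0.7
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# sum_k a_{ki} a_{kj} = delta_{ij} is exactly the statement A^T A = I.
print(np.allclose(A.T @ A, np.eye(2)))  # True
[/code]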
 
  • #2
The number of independent elements will equal the number of elements minus the number of (independent) equations of constraint.
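For the orthogonal case this count can be made explicit: ##A^TA=I## is an equation between symmetric matrices, so it imposes only ##n(n+1)/2## independent constraints on the ##n^2## entries, leaving
[tex]n^2-\frac{n(n+1)}{2}=\frac{n(n-1)}{2}.[/tex]
For example, with ##n=3## that is ##9-6=3##, the three Euler angles of a rotation.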
 
  • #3
Maybe think geometrically? You want to choose n vectors, each of length one and perpendicular to all the previous ones. So in (real) n-space the first vector's head must be chosen on the unit sphere, which has dimension n-1. Then we want the next vector chosen from the unit sphere in the hyperplane orthogonal to the first vector, so that is a unit sphere in (n-1)-space and has dimension n-2. So after choosing two vectors we have ranged over (n-1)+(n-2) dimensions. It seems, by the principle of what else could it be, that the answer is the sum of the first n-1 positive integers, or n(n-1)/2 dimensions for the space of all such matrices. And the geometry of that matrix space seems to be a product of spheres of those dimensions. Does this seem to make sense?
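Spelling out that sum (the last vector is forced up to sign, so it contributes a sphere of dimension zero):
[tex](n-1)+(n-2)+\cdots+1+0=\frac{n(n-1)}{2}.[/tex]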

Here is a rough "check" on it: look at the map from all matrices to symmetric matrices, sending a matrix A to the product A*A, where A* is the transpose of A. (Do you see why A*A is symmetric?) Symmetric matrices have dimension 1+2+...+n = (n+1)n/2 since the diagonal terms and those above determine those below. Thus we have a (smooth) map from a space of dim n^2 to a space of dim (n+1)n/2 and we want the dimension of the preimage of the single point Id. Assuming this map is a nice one, say surjective and all fibers of the same dimension, we would get the dimension of a fiber by subtracting those dimensions, which gives n(n-1)/2. Of course this leaves out a lot of details, but is comforting nonetheless, at least to me. By the way, this second calculation is the one suggested by hutchphd.
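That fiber dimension can even be checked numerically. The derivative of ##f(A)=A^TA## at ##A=I## in the direction ##H## is ##H+H^T##, whose kernel is the skew-symmetric matrices. Here is a minimal NumPy sketch (the function name jacobian_of_gram_map is my own, purely for illustration) confirming that ##n^2-\operatorname{rank}=n(n-1)/2##:
[code=python]
import numpy as np

def jacobian_of_gram_map(A):
    # Jacobian of f(A) = A^T A at A, as an n^2 x n^2 matrix.
    # Directional derivative: Df(A)[H] = A^T H + H^T A.
    n = A.shape[0]
    J = np.zeros((n * n, n * n))
    for idx in range(n * n):
        H = np.zeros((n, n))
        H.flat[idx] = 1.0          # perturb one entry at a time
        J[:, idx] = (A.T @ H + H.T @ A).ravel()
    return J

for n in range(2, 6):
    J = jacobian_of_gram_map(np.eye(n))
    fiber_dim = n * n - np.linalg.matrix_rank(J)
    print(n, fiber_dim, n * (n - 1) // 2)  # the last two columns agree
[/code]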
 
  • #4
mathwonk said:
by the principle of what else could it be
:biggrin:
 

1. What is matrix multiplication and how is it performed?

Matrix multiplication is an operation where two matrices are combined to create a new matrix. It is defined only when the number of columns of the first matrix equals the number of rows of the second. Each entry of the product is formed by multiplying the entries of a row of the first matrix with the corresponding entries of a column of the second matrix and summing the results: ##(AB)_{ij}=\sum_k a_{ik}b_{kj}##. The resulting matrix has the same number of rows as the first matrix and the same number of columns as the second.
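A short sketch comparing the explicit triple-loop definition with NumPy's built-in product (the matrices are arbitrary examples):
[code=python]
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])   # 2x3
B = np.array([[7.0, 8.0],
              [9.0, 10.0],
              [11.0, 12.0]])      # 3x2

# (AB)_{ij} = sum_k a_{ik} b_{kj}, written out explicitly.
C = np.zeros((A.shape[0], B.shape[1]))
for i in range(A.shape[0]):
    for j in range(B.shape[1]):
        for k in range(A.shape[1]):
            C[i, j] += A[i, k] * B[k, j]

print(np.allclose(C, A @ B))  # True: matches the built-in product
[/code]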

2. What is an orthogonal matrix and what are its properties?

An orthogonal matrix is a square matrix whose columns (equivalently, rows) form an orthonormal set: the dot product of any two distinct columns is 0, and each column is a unit vector with magnitude 1. Compactly, ##Q^TQ=I##, which immediately gives the key property that the inverse of an orthogonal matrix equals its transpose, ##Q^{-1}=Q^T##.
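A minimal check of these properties, using the orthogonal factor of a QR decomposition as a convenient source of a random orthogonal matrix (an assumption of this sketch, not something from the thread):
[code=python]
import numpy as np

rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))  # Q is orthogonal

print(np.allclose(Q.T @ Q, np.eye(4)))            # orthonormal columns
print(np.allclose(np.linalg.norm(Q, axis=0), 1))  # unit-length columns
print(np.allclose(np.linalg.inv(Q), Q.T))         # inverse = transpose
[/code]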

3. How do independent parameters relate to matrix multiplication?

The entries of a matrix are its parameters, but they are not always independent. In matrix multiplication, the product of an m×n matrix and an n×p matrix has m×p entries, each a sum of products of entries of the two factors, so the product's entries are generally not free parameters. The question in this thread is the constrained case: an n×n matrix has n² entries, but the orthogonality condition ##A^TA=I## imposes n(n+1)/2 independent constraints, leaving n(n-1)/2 independent parameters.
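A one-line tabulation of that count for small n (a sketch; the formula is the one derived in the thread):
[code=python]
# independent parameters of an n x n orthogonal matrix: n(n-1)/2
for n in range(2, 7):
    print(n, n * n, n * (n + 1) // 2, n * (n - 1) // 2)
# columns: n, total entries, constraints, free parameters
[/code]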

4. Can orthogonal matrices be used to solve systems of linear equations?

Yes. Because the inverse of an orthogonal matrix is its transpose, a system ##Qx=b## with ##Q## orthogonal is solved directly by ##x=Q^Tb##, with no elimination or matrix inversion required. More generally, a system ##Ax=b## can be solved by factoring ##A=QR## with ##Q## orthogonal and ##R## upper triangular, then solving ##Rx=Q^Tb## by back substitution; multiplying by an orthogonal matrix preserves lengths, which is why this QR approach is numerically well behaved.
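A sketch of both ideas (the matrices here are arbitrary examples, not from the thread):
[code=python]
import numpy as np

rng = np.random.default_rng(1)
b = np.array([1.0, 2.0, 3.0])

# Orthogonal system: Qx = b is solved by x = Q^T b.
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))
x = Q.T @ b
print(np.allclose(Q @ x, b))   # True

# General system via QR: Ax = b  ->  Rx = Q^T b.
A = rng.standard_normal((3, 3))
Q, R = np.linalg.qr(A)
x = np.linalg.solve(R, Q.T @ b)  # R is triangular
print(np.allclose(A @ x, b))     # True
[/code]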

5. What are some real-world applications of matrix multiplication and orthogonal matrices?

Matrix multiplication and orthogonal matrices have many real-world applications, including computer graphics, data compression, signal processing, and solving systems of linear equations. They are also used in machine learning algorithms and in the analysis of networks and graphs. Additionally, orthogonal matrices are used in physics and engineering for rotations, reflections, and other transformations.
