Application of Matrices and Determinants


Discussion Overview

The discussion revolves around the application of matrices and determinants, particularly in the context of vector spaces and cross products. Participants explore the conceptual underpinnings of these mathematical tools, their relationships, and their applications in physics and mathematics.

Discussion Character

  • Exploratory
  • Technical explanation
  • Conceptual clarification
  • Debate/contested
  • Mathematical reasoning

Main Points Raised

  • Some participants express confusion about the role of unit vectors (i, j, k) in the determinant used for calculating the cross product, questioning the rationale behind their placement in the matrix.
  • One participant describes the cross product calculation as using a "pseudo-determinant," noting that it yields a vector, whereas a true determinant evaluates to a scalar.
  • Another participant suggests that the matrix represents a grid of values that can express a vector in three-dimensional space, but questions whether this reasoning is correct.
  • Some participants argue that the determinant used for cross products is not a true determinant but rather a useful calculation device.
  • There is a suggestion that matrices and determinants may have been developed specifically for applications in physics, although this is not universally accepted.
  • One participant reflects on the simplicity of matrices, realizing that they can represent mathematical expressions in various ways, including algebraically.
  • A specific transformation example is provided, illustrating how a matrix can map vectors from one space to another, although the details of deriving the matrix are not discussed.

Areas of Agreement / Disagreement

Participants exhibit a mix of agreement and disagreement regarding the nature of determinants and their applications. While some concepts are understood, the discussion reveals uncertainty about the connections between matrices, determinants, and vector operations, with no consensus reached on the foundational ideas.

Contextual Notes

Participants express limitations in their understanding of the connections between matrices, determinants, and vector products, indicating a lack of formal education in linear algebra. The discussion also highlights the potential for multiple interpretations of the role and significance of cross products and determinants.

Obliv
Hello I was learning about determinants and matrices. I learned the generalization of getting the determinant of an n by n matrix. I then applied this to vector space (i + j + k) via a cross product and noticed that you leave the i j and k in their own columns in the first row of the matrix. What's the idea behind this? Can you give an example when you leave this part in elsewhere? Does this happen only when applying matrices to vector space? I didn't take linear algebra yet and if you could clear this up I'd really appreciate it. Thank you
 
Obliv said:
Hello I was learning about determinants and matrices. I learned the generalization of getting the determinant of an n by n matrix. I then applied this to vector space (i + j + k) via a cross product and noticed that you leave the i j and k in their own columns in the first row of the matrix. What's the idea behind this?
When you calculate the cross product of two three-dimensional vectors, you use a pseudo-determinant, with the unit vectors i, j, and k in the first row, as you described. It's a pseudo-determinant because a real determinant evaluates to a scalar (a number), whereas the cross product calculation yields a vector.
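Written out, the mnemonic looks like this, with cofactor expansion along the first row:
##\mathbf{u} \times \mathbf{v} = \begin{vmatrix} \mathbf{i} & \mathbf{j} & \mathbf{k} \\ u_1 & u_2 & u_3 \\ v_1 & v_2 & v_3 \end{vmatrix} = (u_2 v_3 - u_3 v_2)\mathbf{i} - (u_1 v_3 - u_3 v_1)\mathbf{j} + (u_1 v_2 - u_2 v_1)\mathbf{k}##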
Obliv said:
Can you give an example when you leave this part in elsewhere? Does this happen only when applying matrices to vector space?
I'm not aware of any other determinant that is a mix of vectors and numbers. The cross product is useful, but limited mostly to vectors in three-dimensional space (although I've heard there is a counterpart in seven dimensions). A more generally useful product is the dot product (or inner product), which can be applied to a much wider range of vector spaces.
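For comparison, the dot product in ##\mathbb{R}^3## really does evaluate to a scalar:
##\mathbf{u} \cdot \mathbf{v} = u_1 v_1 + u_2 v_2 + u_3 v_3##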
BTW, you don't "apply" matrices to a vector space. A transformation from one space to another can be represented by a matrix. A transformation is like a function, but more general.
Obliv said:
I didn't take linear algebra yet and if you could clear this up I'd really appreciate it. Thank you
 
What is the matrix transforming the vector space into though? I figure that a matrix is like a grid of values that represents an expression or the value of some system. So I imagine using a matrix as a means of representing the value of a vector in 3-D vector space. Then, finding the determinant of the matrix of two vectors yields the normal vector, otherwise known as the cross product of the two vectors. (is this reasoning correct?)

This seems to work really well and makes me think there is some correlation between the two ideas. It makes me wonder if matrices/determinants were created specifically for this kind of math in physics.

I understand how to do determinants and cross-products but it's annoying me that I don't know what the connection is or from what ideas this math and application of math spawned from. I guess what I'm asking is, where did the idea of this pseudo-determinant come from?
 
Obliv said:
What is the matrix transforming the vector space into though?
The matrix isn't "transforming a vector space into something," the transformation maps (or pairs) vectors in one space to vectors in the same or a different vector space. A transformation can have many different matrix representations, depending on the basis that is used in each of the two spaces.
Obliv said:
I figure that a matrix is like a grid of values that represents an expression or the value of some system. So I imagine using a matrix as a means of representing the value of a vector in 3-D vector space.
No. The value of a vector is determined by its components.
Obliv said:
Then, finding the determinant of the matrix of two vectors yields the normal vector, otherwise known as the cross product of the two vectors. (is this reasoning correct?)
I get what you're saying, but I think you're reading more importance into it than it deserves. The determinant that is used for calculating a cross product isn't really a determinant. It's just a useful device for doing the calculation.
Obliv said:
This seems to work really well and makes me think there is some correlation between the two ideas. It makes me wonder if matrices/determinants were created specifically for this kind of math in physics.

I understand how to do determinants and cross-products but it's annoying me that I don't know what the connection is or from what ideas this math and application of math spawned from. I guess what I'm asking is, where did the idea of this pseudo-determinant come from?
You got me. Look up "cross product" in Wikipedia. There might be some info there.
 
Obliv said:
I understand how to do determinants and cross-products but it's annoying me that I don't know what the connection is or from what ideas this math and application of math spawned from. I guess what I'm asking is, where did the idea of this pseudo-determinant come from?

Cross products are a useful kludge that don't have much of anything to do with determinants.
The meaning of cross products is not explained in lower-division courses, so don't try to make sense of them yet. Just use them to solve problems and pass the test.
 
Mark44, can you expand on this transformation idea? I'll search it up when I get home too.

Actually, now that I analyze more of what matrices actually are, it seems like they're not as complicated as I thought. They seem like just another way of writing mathematical expressions. I could write out matrices and take the determinant to find vector products, or I could just write it out algebraically and find the same vector products. I did not realize you could do this until I looked further into cross products. I had thought taking the determinant of the matrix was the only way of getting the product, so I wanted to know the intuition. It looks like I was asking the wrong question.
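A quick numerical sketch (using NumPy purely as an illustration) shows the two routes agreeing: expanding the pseudo-determinant by hand gives the same components as a library cross product.

```python
import numpy as np

def cross_manual(u, v):
    # Cofactor expansion along the first row of the "pseudo-determinant"
    # | i  j  k ; u1 u2 u3 ; v1 v2 v3 |
    return np.array([
        u[1] * v[2] - u[2] * v[1],     # i-component
        -(u[0] * v[2] - u[2] * v[0]),  # j-component (minus sign from the expansion)
        u[0] * v[1] - u[1] * v[0],     # k-component
    ])

u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, 5.0, 6.0])

print(cross_manual(u, v))  # same components as np.cross(u, v)
print(np.cross(u, v))
```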
 
Obliv said:
Mark44, can you expand on this transformation idea? I'll search it up when I get home too.
Here's an example:
##T: \mathbb{R}^3 \to \mathbb{R}^2##, defined by ##T\left(\begin{bmatrix} a \\ b \\ c\end{bmatrix}\right) = \begin{bmatrix} a + b \\ b + c\end{bmatrix}##
It turns out that a matrix representation of T is ##\begin{bmatrix} 1 & 1 & 0 \\ 0 & 1 & 1 \end{bmatrix}##
It would take too much space to describe how I got the matrix, so I won't. The transformation T maps vectors in ##\mathbb{R}^3## (space) to vectors in ##\mathbb{R}^2## (the plane).

You can play with this by making up a vector in space (3-D), using it as input to the transformation, and seeing what comes out.
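For instance, applying T to ##\begin{bmatrix} 1 \\ 2 \\ 3\end{bmatrix}## directly gives ##\begin{bmatrix} 1 + 2 \\ 2 + 3\end{bmatrix} = \begin{bmatrix} 3 \\ 5\end{bmatrix}##, and multiplying by the matrix gives the same thing:
##\begin{bmatrix} 1 & 1 & 0 \\ 0 & 1 & 1 \end{bmatrix} \begin{bmatrix} 1 \\ 2 \\ 3\end{bmatrix} = \begin{bmatrix} 3 \\ 5\end{bmatrix}##
(Each column of the matrix is just T applied to one of the standard basis vectors of ##\mathbb{R}^3##.)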
 
