
B Application of Matrices and Determinants

  1. Mar 1, 2016 #1
    Hello, I was learning about determinants and matrices. I learned the generalization for getting the determinant of an n by n matrix. I then applied this to vectors in 3-D space (written in terms of i, j, and k) via the cross product and noticed that you put i, j, and k in their own columns in the first row of the matrix. What's the idea behind this? Can you give an example where this is done elsewhere? Does this happen only when applying matrices to a vector space? I haven't taken linear algebra yet, so if you could clear this up I'd really appreciate it. Thank you
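    For reference, the arrangement being described is the usual symbolic determinant for the cross product of ##\mathbf{u} = u_1\mathbf{i} + u_2\mathbf{j} + u_3\mathbf{k}## and ##\mathbf{v} = v_1\mathbf{i} + v_2\mathbf{j} + v_3\mathbf{k}## (generic component names, not taken from the post), expanded along its first row:

    $$\mathbf{u} \times \mathbf{v} = \begin{vmatrix} \mathbf{i} & \mathbf{j} & \mathbf{k} \\ u_1 & u_2 & u_3 \\ v_1 & v_2 & v_3 \end{vmatrix} = (u_2 v_3 - u_3 v_2)\,\mathbf{i} - (u_1 v_3 - u_3 v_1)\,\mathbf{j} + (u_1 v_2 - u_2 v_1)\,\mathbf{k}$$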
     
  3. Mar 1, 2016 #2

    Mark44

    Staff: Mentor

    When you calculate the cross product of two three-dimensional vectors, you use a pseudo-determinant, with the unit vectors i, j, and k in the first row, as you described. It's only a pseudo-determinant, because a real determinant evaluates to a scalar (a number), whereas the cross product evaluates to a vector.
    I'm not aware of any other determinant that is a mix of vectors and numbers. The cross product is useful, but limited mostly to vectors in three-dimensional spaces (although I've heard there is a counterpart for one higher-dimensional space). A more generally useful product is the dot product (or inner product), which can be applied to a much wider range of vector spaces.
    BTW, you don't "apply" matrices to a vector space. A transformation from one space to another can be represented by a matrix. A transformation is like a function, but more general.
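    As a quick illustration of the scalar-versus-vector distinction, here is a minimal NumPy sketch (the vectors are made-up examples, not taken from the thread): the dot product returns a scalar, the way a true determinant would, while the cross product returns a vector.

    import numpy as np

    # Two made-up 3-D vectors, purely for illustration.
    u = np.array([1.0, 2.0, 3.0])
    v = np.array([4.0, 5.0, 6.0])

    print(np.dot(u, v))    # 32.0, a scalar (like the value of a true determinant)
    print(np.cross(u, v))  # [-3.  6. -3.], a vector, which is why the cross-product
                           # "determinant" is only a mnemonic device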
     
  4. Mar 1, 2016 #3
    What is the matrix transforming the vector space into though? I figure that a matrix is like a grid of values that represents an expression or the value of some system. So I imagine using a matrix as a means of representing the value of a vector in 3-D vector space. Then, finding the determinant of the matrix of two vectors yields the normal vector, otherwise known as the cross product of the two vectors. (is this reasoning correct?)

    This seems to work really well and makes me think there is some correlation between the two ideas. It makes me wonder if matrices/determinants were created specifically for this kind of math in physics.

    I understand how to do determinants and cross-products but it's annoying me that I don't know what the connection is or from what ideas this math and application of math spawned from. I guess what I'm asking is, where did the idea of this pseudo-determinant come from?
     
  5. Mar 1, 2016 #4

    Mark44

    Staff: Mentor

    The matrix isn't "transforming a vector space into something," the transformation maps (or pairs) vectors in one space to vectors in the same or a different vector space. A transformation can have many different matrix representations, depending on the basis that is used in each of the two spaces.
    No, that reasoning isn't quite right. The value of a vector is determined by its components.
    I get what you're saying, but I think you're reading more importance into it than it deserves. The determinant that is used for calculating a cross product isn't really a determinant. It's just a useful device for doing the calculation.
    You got me; I don't know where the idea came from. Look up "cross product" in Wikipedia. There might be some info there.
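    To illustrate the point about different matrix representations, here is a minimal sketch (the transformation and the second basis are hypothetical, chosen only for illustration): the same linear map T(x, y) = (2x, x + y) is represented by one matrix in the standard basis and by a different matrix in the basis {(1, 0), (1, 1)}, related by the usual change-of-basis formula ##P^{-1}AP##.

    import numpy as np

    # Matrix of T(x, y) = (2x, x + y) in the standard basis of R^2.
    A = np.array([[2.0, 0.0],
                  [1.0, 1.0]])

    # Columns of P are the new basis vectors (1, 0) and (1, 1).
    P = np.array([[1.0, 1.0],
                  [0.0, 1.0]])

    # Matrix of the *same* transformation with respect to the new basis.
    A_new = np.linalg.inv(P) @ A @ P
    print(A_new)   # [[1. 0.]
                   #  [1. 2.]]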
     
  6. Mar 1, 2016 #5
    Cross products are a useful kludge that don't have much of anything to do with determinants.
    The meaning of cross products is not usually explained to lower-division students, so don't try to make sense of them yet. Just use them to solve problems and pass the test.
     
  7. Mar 1, 2016 #6
    Mark44, can you expand on this transformation idea? I'll search it up when I get home too.

    Actually, now that I analyze more of what matrices actually are, it seems like they're not as complicated as I made them out to be. They seem like just another way of writing mathematical expressions. I could write out matrices and take the determinant to find vector products, or I could just write it out algebraically and find the same vector products. I did not realize you could do this until I looked further into cross products. I had thought taking the determinant of the matrix was the only way of getting the product, so I wanted to know the intuition. It looks like I was asking the wrong question.
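    That observation is easy to check directly. Here is a small sketch (the example vectors are made up) comparing the first-row expansion of the i-j-k "determinant", written out algebraically, with NumPy's built-in cross product:

    import numpy as np

    u = np.array([2.0, 0.0, 1.0])   # made-up example vectors
    v = np.array([1.0, 3.0, 4.0])

    # Algebraic route: expand the i-j-k determinant along its first row.
    by_hand = np.array([
        u[1]*v[2] - u[2]*v[1],      # i-component
        u[2]*v[0] - u[0]*v[2],      # j-component (sign already folded in)
        u[0]*v[1] - u[1]*v[0],      # k-component
    ])

    print(by_hand)                  # [-3. -7.  6.]
    print(np.cross(u, v))           # identical result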
     
  8. Mar 1, 2016 #7

    Mark44

    Staff: Mentor

    Here's an example:
    ##T: \mathbb{R}^3 \to \mathbb{R}^2##, defined by ##T\left(\begin{bmatrix} a \\ b \\ c\end{bmatrix}\right) = \begin{bmatrix} a + b \\ b + c\end{bmatrix}##
    It turns out that a matrix representation of T is ##\begin{bmatrix} 1 & 1 & 0 \\ 0 & 1 & 1 \end{bmatrix}##
    It would take too much space to describe how I got the matrix, so I won't. The transformation T maps vectors in ##\mathbb{R}^3## (space) to vectors in ##\mathbb{R}^2## (the plane).

    You can play with this by making up a vector in space (3-D), using it as input to the transformation, and seeing what comes out.
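    For instance, here is a small sketch of that suggestion in NumPy (the input vector is just a made-up example):

    import numpy as np

    # Mark44's example: T(a, b, c) = (a + b, b + c) and its matrix representation.
    M = np.array([[1, 1, 0],
                  [0, 1, 1]])

    x = np.array([2, 3, 5])            # any vector in R^3 will do

    print(M @ x)                       # [5 8]
    print(x[0] + x[1], x[1] + x[2])    # 5 8, the same as applying T directly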
     