
- Thread starter zeta12ti

- #1


Upon starting to research determinants, all I've found are definitions that either cover only low-dimensional matrices or define the determinant as some strange expression with little motivation.

- #2

lavinia

Science Advisor

Gold Member


Try proving this - say by induction.

- #3

- 489

- 0

To clarify, Lavinia means that the top exterior power of a vector space is one-dimensional. That is, for an n-dimensional vector space V, the vector space of alternating, multilinear functions f: V^n -> R is one-dimensional. This is only true for the top dimension.

I think the determinant is the only multilinear alternating function in this vector space, up to a constant. If you require the map to take the value 1 on the identity matrix, then you get the determinant.
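The three defining properties (multilinear in the rows, alternating, and 1 on the identity) are easy to check numerically. Here is a quick sanity check using NumPy; the matrices are arbitrary test data, not anything from the thread:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4

def det_of_rows(*r):
    """Treat the determinant as a function of the n rows of a matrix."""
    return np.linalg.det(np.vstack(r))

r = list(rng.standard_normal((n, n)))          # n test rows
u, v = rng.standard_normal(n), rng.standard_normal(n)
a, b = 2.0, -3.0

# Multilinearity in the first row:
# det(a*u + b*v, r2, ...) = a*det(u, r2, ...) + b*det(v, r2, ...)
lhs = det_of_rows(a * u + b * v, *r[1:])
rhs = a * det_of_rows(u, *r[1:]) + b * det_of_rows(v, *r[1:])
assert np.isclose(lhs, rhs)

# Alternating: swapping two rows flips the sign
swapped = det_of_rows(r[1], r[0], *r[2:])
assert np.isclose(swapped, -det_of_rows(*r))

# Normalization: det(I) = 1
assert np.isclose(np.linalg.det(np.eye(n)), 1.0)
```

The normalization det(I) = 1 is exactly what pins down the constant in the one-dimensional space of alternating multilinear functions.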

- #4

lavinia

Science Advisor

Gold Member


To clarify, Lavinia means that the top exterior power of a vector space is one-dimensional. I.e. for an n-dimensional vector space V, the vector space of alternating, multilinear functions f:V^n->R is one-dimensional. This is only true for the top dimension.

Yes, this is the same statement.

The determinant is a polynomial in the matrix entries. I wonder how this all translates into the properties of this polynomial.

- #5

lavinia

Science Advisor

Gold Member


Yes, this is the same statement.

The determinant is a polynomial in the matrix entries. I wonder how this all translates into the properties of this polynomial.

How about this as another way to think of the determinant: consider all functions on square matrices that are invariant under conjugation,

[tex] f(X) = f(AXA^{-1}) [/tex]

for all square matrices X and invertible square matrices A.

For instance, the trace is such a function.

Define the determinant to be the conjugation-invariant function which equals the product of the diagonal entries on any diagonal matrix.
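Both claims in this post can be checked numerically. A short sketch with NumPy (the matrices are random test data; a generic random matrix is invertible with probability 1):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4
X = rng.standard_normal((n, n))
A = rng.standard_normal((n, n))  # generic, hence invertible (almost surely)

conj = A @ X @ np.linalg.inv(A)

# Both trace and determinant are constant on conjugacy classes
assert np.isclose(np.trace(conj), np.trace(X))
assert np.isclose(np.linalg.det(conj), np.linalg.det(X))

# On a diagonal matrix, the determinant is the product of the diagonal
D = np.diag([2.0, -1.0, 3.0, 0.5])
assert np.isclose(np.linalg.det(D), 2.0 * (-1.0) * 3.0 * 0.5)
```

One caveat worth noting: not every matrix is conjugate to a diagonal one, so as stated this pins the determinant down only on diagonalizable matrices; continuity (diagonalizable matrices are dense) extends it to the rest.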


- #6


Thanks for the responses. However, I have a question.

Is the determinant uniquely determined by its multilinearity and the properties I mentioned above?

I am hoping that the determinant can be made into a type of analogue of the complex absolute value with as few other conditions as possible (alternating multilinear is a bit too... esoteric for my tastes).


- #7

lavinia

Science Advisor

Gold Member


Thanks for the responses. However, I have a question.

Is the determinant uniquely determined by its multilinearity and the properties I mentioned above?

I am hoping that the determinant can be made into a type of analogue of the complex absolute value with as few other conditions as possible (alternating multilinear is a bit too... esoteric for my tastes).

Yes: being alternating and multilinear on n vectors uniquely determines the determinant up to a constant.
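The uniqueness claim is concrete even in two dimensions: any alternating bilinear form f on R^2 satisfies f(u, v) = f(e1, e2) · det([u; v]), so the form is a constant multiple of the determinant. A small sketch, with a hypothetical example form chosen to be alternating and bilinear by construction:

```python
import numpy as np

# Hypothetical alternating bilinear form on R^2 (5 is an arbitrary constant)
def f(u, v):
    return 5.0 * (u[0] * v[1] - u[1] * v[0])

rng = np.random.default_rng(3)
u, v = rng.standard_normal(2), rng.standard_normal(2)
e1, e2 = np.array([1.0, 0.0]), np.array([0.0, 1.0])

# The value on the standard basis is the constant of proportionality
c = f(e1, e2)
assert np.isclose(f(u, v), c * np.linalg.det(np.array([u, v])))
```

Expanding u and v in the standard basis and using bilinearity plus f(e1, e1) = f(e2, e2) = 0 gives exactly this identity; the same bookkeeping in n dimensions produces the usual sign-sum formula.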


- #9


Is the determinant uniquely determined by its multilinearity and the properties I mentioned above?

I am hoping that the determinant can be made into a type of analogue of the complex absolute value with as few other conditions as possible (alternating multilinear is a bit too... esoteric for my tastes).

An equivalent definition is that the determinant is the product of the eigenvalues of the linear transformation that corresponds to the matrix (i.e., it is the signed scale factor for n-parallelotopes in the target vector space versus the domain). This definition is a bit more intuitive and easier to work with geometrically. If you're looking for a formula that refers directly to the entries of an arbitrary matrix representation of the linear transformation, then you are left with the alternating multilinear map, embodied in the Levi-Civita symbol or Laplace's expansion by minors (equivalent, just re-ordered).

The relation between the two definitions is derived through basic geometric algebra (the wedge product).
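The eigenvalue characterization is also easy to verify numerically. A sketch with NumPy on a random test matrix (the eigenvalues of a real matrix may be complex, but they come in conjugate pairs, so their product is real):

```python
import numpy as np

rng = np.random.default_rng(2)
M = rng.standard_normal((4, 4))

# det(M) equals the product of the (possibly complex) eigenvalues
eigvals = np.linalg.eigvals(M)
prod = np.prod(eigvals)

assert np.isclose(prod.imag, 0.0)                 # conjugate pairs cancel
assert np.isclose(prod.real, np.linalg.det(M))
```

Geometrically, |det(M)| is then the volume-scaling factor M applies to the unit n-cube, and the sign records whether orientation is preserved.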

