- #1

fxdung

What are suitable books on linear algebra for a third course, for self-study after reading Linear Algebra Done Right by Axler and Algebra by Artin?

Last edited:


- Linear Algebra
- Thread starter fxdung
- Start date


- #2

S.G. Janssens

Science Advisor

Education Advisor


Maybe you would like Roman's Advanced Linear Algebra.

- #3

fxdung

But Roman omits some proofs. Is there a more detailed book?

- #4

Sam Gallagher


* Module theory

* Multilinear algebra

* Functional analysis

* Representation theory

* Numerical methods in linear algebra

If you've come this far, you'll know that the field of "linear algebra" is really a first step towards each of the above, and they all offer different extensions. If you can be more specific, people could recommend better books.

- #5

fxdung

I don't have that background yet, so I need a general book at a higher level than Axler's.

- #6

MidgetDwarf


- #7

fxdung


- #8


Are you sure that you want a third book on

- #9

fxdung

I need to go deeper in Linear Algebra

- #10

S.G. Janssens

Science Advisor

Education Advisor


> I need to go deeper in Linear Algebra

Need or want? Both are fine: linear algebra is alive as a research field in its own right, not only as background knowledge for other fields. (Although its connections with other fields arguably make it more interesting.)

Maybe try browsing and reading some articles. This is freely accessible:

https://journals.uwyo.edu/index.php/ela

zbMATH curates a searchable journal list that is free to consult.

- #11

StoneTemplePython

Science Advisor

Gold Member


> I don't have that background yet, so I need a general book at a higher level than Axler's.

You are asking for a book

Btw, if you want more comprehensive knowledge, an easy upgrade is to do the (especially starred) problems in Artin, 1st ed. He dumbed down the problems in the 2nd edition -- i.e., he cut down the raw number and eliminated a lot of high-insight but difficult problems.

- #12

Infrared

Science Advisor

Gold Member


> I need to go deeper in Linear Algebra

Why? Are there specific topics that you want to learn about?

- #13

martinbn

Science Advisor


Kostrikin and Manin "Linear algebra and geometry".

- #14


> Why? Are there specific topics that you want to learn about?

... in which case it shouldn't be a problem to answer ...

> I need to go deeper in Linear Algebra

... because making specific suggestions requires knowing which specific topics you call "deeper".

- #15

fxdung

Some online universities teach PhD-level linear algebra, so I need to go deeper into general linear algebra.

- #16


Let me see where you are. Maybe this helps us to figure out what you should do.

Let ##\psi## be a linear transformation of the inner product space ##E##. Define the linear automorphism ##\exp \psi## by

$$

\exp\psi =\varphi (1)

$$

where ##\varphi (t)## is the family of linear automorphisms defined by

$$

\dot\varphi (t)=\psi\circ \varphi (t)\, , \,\varphi (0)=\operatorname{id}.

$$

Prove that

$$

\varphi (t)=\exp(t\psi) \quad (-\infty <t<\infty ).

$$

https://www.amazon.com/dp/0387901108/?tag=pfamazon01-20

p. 258
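Not from the original exercise, but a minimal numerical sketch of the claim, assuming ##E=\mathbb{R}^2## and a hypothetical 2x2 matrix ##A## representing ##\psi##: the truncated power series for ##\exp(tA)## satisfies ##\varphi(0)=\operatorname{id}## and ##\dot\varphi(t)=A\,\varphi(t)##, checked with a finite difference.

```python
from math import factorial

# Hypothetical example: psi represented by a fixed 2x2 matrix A on E = R^2.
A = [[0.0, 1.0], [-2.0, 0.3]]

def mat_mul(X, Y):
    """2x2 matrix product."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def expm(A, t, terms=40):
    """phi(t) = exp(tA) via the truncated power series sum_k (tA)^k / k!."""
    tA = [[t * a for a in row] for row in A]
    result = [[1.0, 0.0], [0.0, 1.0]]  # k = 0 term: the identity
    power = [[1.0, 0.0], [0.0, 1.0]]
    for k in range(1, terms):
        power = mat_mul(power, tA)     # power = (tA)^k
        for i in range(2):
            for j in range(2):
                result[i][j] += power[i][j] / factorial(k)
    return result

# phi(0) = id
assert expm(A, 0.0) == [[1.0, 0.0], [0.0, 1.0]]

# Finite-difference check of phi'(t) = A . phi(t) at t = 0.7
t, h = 0.7, 1e-6
lhs = [[(expm(A, t + h)[i][j] - expm(A, t - h)[i][j]) / (2 * h)
        for j in range(2)] for i in range(2)]
rhs = mat_mul(A, expm(A, t))
assert all(abs(lhs[i][j] - rhs[i][j]) < 1e-4 for i in range(2) for j in range(2))
print("exp(tA) solves the ODE (numerically)")
```

Since ##sA## and ##tA## commute, the same series also satisfies the semigroup law ##\varphi(s+t)=\varphi(s)\varphi(t)##, which is the heart of the uniqueness argument in the exercise.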


Last edited:

- #17

fxdung


- #18

MidgetDwarf


By read, do you mean working through the exercises without looking at solutions?

- #19

fxdung


I intend to do the exercises after re-reading the books. I like to have a general view of mathematics.

- #20

MidgetDwarf


> I intend to do the exercises after re-reading the books. I like to have a general view of mathematics.

That's not how science or math books work. You have to do the exercises...

- #21

Vanadium 50

Staff Emeritus

Science Advisor

Education Advisor


> I intend to do the exercises after re-reading the books.

This explains why you have posted so many messages struggling with various things. @MidgetDwarf is right.

- #22


> So I haven't been able to solve the problems in Greub. Is it good to read Greub after finishing Artin?

Sure, Greub is a good book. But it will not solve your problem. You can

The key attitude when

E.g. a rotation is a linear transformation. Then you have to think that it is not, but the book

- #23

fxdung


Is that way OK?

- #24

berkeman

Mentor


> Is that way OK?

IMO, no. If you work through a chapter in a textbook and skip the exercises, you are wasting your time (and ours, BTW) if you want to be sure you are effectively learning the material. The exercises are a test of your understanding. "Self-learning" does not mean that you get to skip the learning part and just get a general idea of the material.

How about we pause this thread until you go back and re-read your first couple of textbooks and work through the exercises. When you can show us the solution to the straightforward math quiz question posed by @fresh_42 we can resume this discussion thread...

- #25


> Is that way OK?

I understand why others criticize you, but I have to tell you that I use a similar strategy when I

- #26

mathwonk

Science Advisor

Homework Helper


https://www.amazon.com/dp/3540642439/?tag=pfamazon01-20

also, IF legible! my free notes: 845-1, 845-2, 845-3:

https://www.math.uga.edu/directory/people/roy-smith

- #27

mathwonk

Science Advisor

Homework Helper


For what it's worth, here is my summary of the theory of normal forms of finite dimensional linear operators (more or less the content of a first or second course of linear algebra):

Given a linear operator T on a finite dimensional k-vector space V,

V has a decomposition into a product of subspaces Wj on each of which T is equivalent to the action of multiplication by X on a quotient space k[X]/(fj) of a polynomial ring. I.e. if Wj is a subspace corresponding to a polynomial fj, there is an isomorphism from Wj to k[X]/(fj) under which the action of T on Wj corresponds to multiplication by X on k[X]/(fj). In particular, fj is the minimal polynomial of T on Wj. Thus understanding the behavior of T on V is accomplished by finding these polynomials fj and the corresponding subspaces Wj.

The first clue to finding these polynomials is that their product is equal to the "characteristic polynomial" of T, so the problem is to find the appropriate factorization of that polynomial. E.g. if the characteristic polynomial is irreducible over k, there is only one subspace W=V, and f = the characteristic polynomial. In general, the distinguished subspaces can be chosen so that the corresponding sequence of monic polynomials f1,...,fr successively divide each other, and when this is done this special sequence of polynomials, called "invariant factors" of T, is uniquely determined by T. In this case the largest degree one, fr, is the minimal polynomial of T on V.

Thus if the characteristic polynomial is a product of distinct irreducible polynomials, there is again only one subspace W=V, r=1, and f equals the characteristic polynomial. In general, the "invariant factor decomposition" can be computed by hand from any matrix for T, by diagonalizing the associated "characteristic matrix", using the Euclidean algorithm in k[X].

Two operators S,T are “similar”, i.e. T = (U^-1)SU for some invertible operator U, if and only if S,T have the same invariant factors. If M is a matrix for S in some basis, another way to say T is similar to S, is that there is some basis in which T also has the matrix M.

A second standard decomposition exists where the polynomials fj in the model spaces k[X]/(fj) are all powers of irreducible polynomials. For this decomposition, the sequence of polynomials fj is almost uniquely determined by T, except for a chosen ordering of the irreducible polynomials.

This second decomposition, called the “generalized Jordan decomposition”, always exists in theory, but can be computed in practice only for those examples where the irreducible factors of the characteristic polynomial of T can actually be found, e.g. for a “triangular” matrix.

A special case of the Jordan decomposition occurs precisely when the minimal polynomial factors completely into distinct linear factors. Then the Jordan form, which may or may not be effectively computable, is a diagonal matrix. This is always the case when the matrix consists of real entries which are symmetric about the main diagonal, although even then one may not be able to perform the factorization in practice, nor to actually find the numerical entries on the diagonal. In that event one may turn to approximation techniques to estimate these "eigenvalues".

If a real matrix does not equal its "transpose" (= its reflection about the diagonal), but does commute with it, then the minimal polynomial is again a product of distinct irreducible factors, hence of degree ≤ 2, and the subspaces Wj all have dimension ≤ 2. If a complex matrix commutes with the complex conjugate of its transpose, its minimal polynomial is also a product of distinct irreducible factors, and since these must all be linear it is actually diagonalizable. In all cases where the matrix either equals or commutes with its transpose, the decomposing subspaces can all be chosen mutually orthogonal; indeed, subspaces corresponding to distinct irreducible polynomials are automatically orthogonal.
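The model spaces above can be made concrete: in the basis 1, X, X^2 of k[X]/(f) for a monic cubic f, multiplication by X is represented by the companion matrix C of f, and f(C) = 0, so f is both the characteristic and the minimal polynomial. A sketch in plain Python, with a hypothetical choice of f not taken from the post:

```python
# Model space k[X]/(f) for a hypothetical monic cubic
# f(X) = X^3 + c2*X^2 + c1*X + c0.  In the basis 1, X, X^2,
# multiplication by X has the companion matrix C below
# (columns: X*1 = X, X*X = X^2, X*X^2 = -c0 - c1*X - c2*X^2 mod f).
c0, c1, c2 = 5, -2, 0          # f(X) = X^3 - 2X + 5
C = [[0, 0, -c0],
     [1, 0, -c1],
     [0, 1, -c2]]

def mat_mul(X, Y):
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def mat_add(X, Y):
    return [[x + y for x, y in zip(rx, ry)] for rx, ry in zip(X, Y)]

def scal(a, X):
    return [[a * x for x in row] for row in X]

I = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
C2 = mat_mul(C, C)
C3 = mat_mul(C2, C)

# Cayley-Hamilton for the companion matrix:
# f(C) = C^3 + c2*C^2 + c1*C + c0*I = 0
fC = mat_add(mat_add(C3, scal(c2, C2)), mat_add(scal(c1, C), scal(c0, I)))
assert fC == [[0, 0, 0], [0, 0, 0], [0, 0, 0]]
print("f(C) = 0: f is the minimal polynomial of multiplication by X on k[X]/(f)")
```

Since f here has no lower-degree annihilating polynomial, V = k[X]/(f) is a single model space W1 with invariant factor f1 = f, the simplest instance of the decomposition described above.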


Last edited:
