Linear Algebra Texts Recommended for Rigorous Approach

  • Thread starter JasonRox
In summary, the text we are using sucks in my opinion. I would like a more rigorous approach to linear algebra. I recommend "Linear Algebra Done Right" by Sheldon Axler.
  • #1
JasonRox
I am doing Linear Algebra right now, but the text we are using sucks in my opinion.

Elementary Linear Algebra /with Applications - Anton and Roberts

If there exists a linear algebra text that has a Spivak (Calculus) style approach, that would be great. I would like a more rigorous approach to linear algebra.

So, what text do you recommend?

Thanks.
 
  • #2
You might want to check out "Linear Algebra Done Right" by Sheldon Axler.
 
  • #3
Sounds like a linear algebra textbook tag-along kind of thing.

Is it like Schaum's or what not?
 
  • #4
Not at all. I suggest you try to find a copy and not judge it by its title.

Axler's is the closest analogue in the linear algebra world I've seen to Spivak's Calculus.
 
  • #5
I used Shilov's. It's pretty decent.
 
  • #6
Thanks guys.
 
  • #7
Also, I found this rather fun:

http://ocw.mit.edu/OcwWeb/Mathematics/18-06Linear-AlgebraFall2002/VideoLectures/index.htm
 
  • #8
the best standard linear algebra text is probably hoffman and kunze. anything by serge lang tends to be well done, but some of his linear algebra texts are bent over backwards dumbed down.

"finite dimensional vector spaces" by paul halmos is probably closest to a spivak type text.

or just get a good algebra book that focuses on, linear algebra , like the excellent book Algebra, by michael artin.

there is also an excellent book by paul detman, in paperback for a few dollars, from dover. anything written in the 60's, as spivak's was, will have higher standards than today's c**ppy texts.
 
  • #9
Yes I too would also recommend Sheldon Axler's "Linear Algebra Done Right." It is much more rigorous than your standard intro to linear algebra text. You hardly use matrices and matrix manipulations at all. Determinants of matrices are explained in one of the very last chapters of the book. Axler explains the THEORY behind a lot of the linear algebra most students take for granted in their first course in LA.
 
  • #10
Is Schaum's good?
 
  • #11
schaum's books are always useful, utilitarian collections of facts and exercises. they are never, ever, anywhere near the category of book requested here, i.e. a serious theoretical text like spivak's calculus. They are pretty good at what they attempt, which is a minimal acquaintance with a subject, usually old fashioned, and often out of date.

even their value as a problem solving source to me seems compromised in recent years as they get thicker and less challenging, responding to the new penchant for dumbing down the material.

after years of recommending them, i found the schaum's calculus book almost useless this year, and regretted recommending it to my honors class.
 
  • #12
the algebra books 843,844,845, listed for free on the website http://www.math.uga.edu/~roy/

are fantastic, marvellous works of magnificent scholarship. Anyone who even holds them in his/her hand magically becomes algebraically literate.

just as the prey caught in the mouth of the tiger is doomed to never escape, so the student who once gazes into the pages of these works will be forever tied into the community of algebra scholars.
 
  • #13
I got a new text by Friedberg, Spinsel and some other guy.

Seems really good so far.
 
  • #14
Does anyone know of a good linear algebra text with an emphasis on applications and MATLAB? Thanks.
 
  • #15
quasi426 said:
Does anyone know of a good linear algebra text with an emphasis on applications and MATLAB? Thanks.

Linear Algebra by Anton and Roberts has lots of great applications. In fact, a whole chapter is devoted to applications.

He has some emphasis on the use of programs like MATLAB, but he doesn't show how to use the program.
 
  • #16
Is this Anton and Rorres? That's the book I used as an undergrad. It was decent--I still refer to it from time to time...
 
  • #17
JasonRox said:
I got a new text by Friedberg, Spinsel and some other guy.

Seems really good so far.
:smile: Friedberg, Insel, and Spence.
 
  • #18
gravenewworld said:
Yes I too would also recommend Sheldon Axler's "Linear Algebra Done Right." It is much more rigorous than your standard intro to linear algebra text. You hardly use matrices and matrix manipulations at all. Determinants of matrices are explained in one of the very last chapters of the book. Axler explains the THEORY behind a lot of the linear algebra most students take for granted in their first course in LA.
Can that be good? The book I used, "Linear Algebra with Applications" by Robert Lay, proved nearly everything but also had a good early focus on matrices and matrix manipulations. These manipulations I found very useful in understanding how proofs later in the book proceed. Doing a lot of them helps make it all more concrete and gives you a sense for what you're actually proving. Without determinants you probably can't even do eigenvalues well. The book seemed rigorous, not that rigor in basic linear algebra is so difficult. Once you've mastered the basic manipulation there are ample proofs for you to do in the problems. What did it leave out? Well, there was a theorem or two (I don't remember exactly which) later in the book that it said to consult more advanced texts on rather than proving it itself. Also most of the book focused on real-valued matrices, though I don't know if that would be different in any other introductory text. On the whole the book was a wonderful guide to a course I loved enough to do most of the rest of the book since the course ended.

This is the _only_ book that I have had for linear algebra, and also I want to stress that the teacher I had was a very good lecturer. But I think the book contributed a lot, and maybe this weekend I'll finish the part on quadratic forms.
 
  • #19
0rthodontist said:
Can that be good?

Yes, very. Linear algebra should not equal solving systems of equations, and I think a good motivated student will be able to start with the interesting parts of linear algebra from the start. Axler's might be difficult for most as a first course (he suggests it for a second), but I think it is a good book to look at for someone who asked for a Spivak analogue (see OP).

0rthodontist said:
Without determinants you probably can't even do eigenvalues well.

See http://www.axler.net/DwD.html; it's a paper published by Axler on a determinant-free approach to linear algebra that his text is partly based on.
 
  • #20
Well, at that link he says it is intended for a _second_ course in linear algebra.

I don't see why you don't like the manipulative approach. After all in applied math that's what the computer is doing. How can you understand LUP decomposition without doing row reduction yourself? I can't tell you how many times I've successfully thought through a proof in Lay's book by thinking about the nitty gritty rows and elements. Probably half of Lay's own proofs take that route. Also I especially like determinants because of their concrete area/volume/etc. interpretation.

It's not as if you're spending the whole course just manipulating matrices, you just spend a few weeks doing manipulation and that gets you an intuition you can use.

I think that all math should be grounded in some kind of practical, almost physical manipulation skill, something that lets you visualize what is going on when it gets abstract. I doubt we would even have any calculus or geometry without the manipulation in two and three dimensions that we do every day going about our lives.
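For anyone who wants to see the connection between hand row reduction and what the computer actually does, here is a minimal sketch using SciPy's `lu` (this assumes NumPy and SciPy are installed; it is an editorial illustration, not something from any of the texts discussed):

```python
import numpy as np
from scipy.linalg import lu

# A small invertible matrix; lu() returns P, L, U with A = P @ L @ U,
# where L records the row-elimination multipliers and P the row swaps.
A = np.array([[2., 1., 1.],
              [4., 3., 3.],
              [8., 7., 9.]])
P, L, U = lu(A)

# The factorization reassembles the original matrix.
assert np.allclose(P @ L @ U, A)

# L is unit lower triangular and U is upper triangular -- exactly the
# bookkeeping of Gaussian elimination done by hand.
assert np.allclose(np.tril(L, -1) + np.eye(3), L)
assert np.allclose(np.triu(U), U)
```

The point of the exercise is that `L` literally stores the multipliers you would write in the margin while row reducing by hand.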
 
  • #21
0rthodontist said:
Well, at that link he says it is intended for a _second_ course in linear algebra.

Yes, I mentioned this above. The text is self-contained, though, and does not rely on any prior linear algebra or matrix knowledge.

0rthodontist said:
I don't see why you don't like the manipulative approach. After all in applied math that's what the computer is doing. How can you understand LUP decomposition without doing row reduction yourself?

What you're calling "applied math" here is what I'd call "trivial", or "exceedingly dull computations". I'm not saying that these dull procedures shouldn't be taught (mathematicians will have to teach them to science students one day at any rate), but I do feel your typical intro course spends way too much time on computations that really aren't needed for the more interesting topics.

0rthodontist said:
I can't tell you how many times I've successfully thought through a proof in Lay's book by thinking about the nitty gritty rows and elements. Probably half of Lay's own proofs take that route. Also I especially like determinants because of their concrete area/volume/etc. interpretation.

After pushing all these little indices and sums around, how many of these proofs do you actually understand beyond the technical "line n follows from line n-1 follows from line n-2..."? These kinds of proofs are rarely enlightening.

Take a look at the "Down with Determinants" paper I linked to for a different approach to things. He introduces determinants near the end, for the very purpose of volumes, but to me it's much more motivated than the usual cofactor expansion that leaves students befuddled. Read it and let me know if it doesn't increase your understanding of linear algebra.

Do keep in mind the context of my recommendation of Axler's text. I know full well the moaning that your typical intro linear algebra class produces when you do anything at all that isn't in the form of a 'concrete' matrix. However, Jason isn't the typical student- he's complained many times about the low level of his courses and he seems like he's genuinely interested in learning math, not just some number crunching, so I have no qualms about suggesting this text to him if he is interested in learning some linear algebra. He'll learn enough of the other dull junk in whatever classes his university makes him take.
 
  • #22
shmoe said:
After pushing all these little indicies and sums around, how many of these proofs do you actually understand beyond the technical "line n follows from line n-1 follows from line n-2..."? These kinds of proofs are rarely enlightening.

This is so true. I'm glad someone else said it!

shmoe said:
He'll learn enough of the other dull junk in whatever classes his university makes him take.

OTOH, a lot of these classes are what you make of them. Many people are bored in linear algebra class, but it is maybe the most useful math class they ever take. I've heard some people refer to dynamical systems as boring too. This kind of stuff amazes me...

Yeah, solving 3x3 systems gets old. But that isn't the point of these courses (or it shouldn't be anyway)...
 
  • #23
shmoe said:
After pushing all these little indicies and sums around, how many of these proofs do you actually understand beyond the technical "line n follows from line n-1 follows from line n-2..."? These kinds of proofs are rarely enlightening.
I understand almost all of it. The "line n follows from ..." type stuff is how I currently perceive purely abstract algebra: meaningless fiddling with symbols. My opinion may change when I take a course in abstract algebra, or it may not, but at any rate I can picture how the operations are actually performed, and this leads easily to proof.

This reminds me of something I posted a while ago, https://www.physicsforums.com/showthread.php?t=106101. It's an example where I didn't actually manage to get the proof, but it happened to be a very simple idea that you could never get without thinking in terms of row operations.

shmoe said:
Do keep in mind the context of my recommendation of Axler's text. I know full well the moaning that your typical intro linear algebra class produces when you do anything at all that isn't in the form of a 'concrete' matrix. However, Jason isn't the typical student- he's complained many times about the low level of his courses and he seems like he's genuinely interested in learning math, not just some number crunching, so I have no qualms about suggesting this text to him if he is interested in learning some linear algebra. He'll learn enough of the other dull junk in whatever classes his university makes him take.
Oh, so interest in actual manipulative skill makes you somehow "not genuinely interested in math"? And I suppose holding a driver's license disqualifies you as an automotive engineer.
I said I loved my course on linear algebra and I do. My average in that class was above 100%, and a few weeks ago I looked at a few problems in linear algebra from the graduate qualifying exam at my school and found that I could do them.
 
  • #24
0rthodontist said:
I understand almost all of it. The "line n follows from ..." type stuff is how at this time I perceive the purely abstract algebra, meaningless fiddling with symbols. My opinion may change when I take a course in abstract algebra, or it may not, but at any rate I can picture how the operations are actually performed and this leads easily to proof.

"abstract algebra" is really nothing like what they call algebra in high school.

Of course it's sometimes unavoidable to look at specific entries of matrices or do tedious calculations. If this adds little to understanding and can be avoided, I think it should be.

(aside: for your stochastic matrix, you were happy showing you had an eigenvalue of 1. This is obvious if you look at the transpose)

0rthodontist said:
Oh, so interest in actual manipulative skill makes you somehow "not genuinely interested in math"?

I said nothing like this implication, please read again. Interest in computations and interest in what I consider maths do not necessarily exclude one another.
 
  • #25
0rthodontist said:
I understand almost all of it. The "line n follows from ..." type stuff is how at this time I perceive the purely abstract algebra, meaningless fiddling with symbols. My opinion may change when I take a course in abstract algebra, or it may not, but at any rate I can picture how the operations are actually performed and this leads easily to proof.

Don't let the fact that you've no experience of what "abstract algebra" is stop you from making sweeping and dismissive statements about it then.

This reminds me of something I posted a while ago, https://www.physicsforums.com/showthread.php?t=106101. It's an example where I didn't actually manage to get the proof, but it happened to be a very simple idea that you could never get without thinking in terms of row operations.

that has an elementary proof in 'purely abstract' terms, as shmoe points out. If the columns sum to one then (1,1,...,1) is a left eigenvector with eigenvalue 1 (i.e. an eigenvector of A^t), hence the answer.
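The transpose argument is also easy to illustrate numerically; here is a quick sketch (assuming NumPy, with a made-up column-stochastic matrix as the example):

```python
import numpy as np

# A column-stochastic matrix: each column sums to 1.
A = np.array([[0.5, 0.2, 0.3],
              [0.3, 0.7, 0.3],
              [0.2, 0.1, 0.4]])

# (1,1,...,1) is a left eigenvector with eigenvalue 1, precisely because
# each column sums to 1; equivalently it is an eigenvector of A^T.
ones = np.ones(3)
assert np.allclose(ones @ A, ones)

# A and A^T have the same eigenvalues, so 1 is an eigenvalue of A itself.
eigvals = np.linalg.eigvals(A)
assert np.any(np.isclose(eigvals, 1.0))
```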
Oh, so interest in actual manipulative skill makes you somehow "not genuinely interested in math"?

sometimes it is necessary to get one's hands dirty to figure out why something is true, perhaps by verifying a proposition for a few cases to see how a proof might run in general; however, an 'interest' in manipulation could be taken to mean something entirely different.

And I suppose holding a driver's license disqualifies you as an automotive engineer.

I don't see how that is remotely justifiable as an analogy. Perhaps (and this is genuinely tongue in cheek) a better analogy would be 'the ability to change a tyre doesn't make you an automotive engineer'. In any case, I don't think you've understood shmoe's position.
I said I loved my course on linear algebra and I do. My average in that class was above 100%,

and who says there's no such thing as grade inflation these days?

and a few weeks ago I looked at a few problems in linear algebra from the graduate qualifying exam at my school and found that I could do them.

So you presumably understand what a quotient space is then?

There are exactly, what, two things (ie proofs) one needs to be taught in linear algebra:

dim(U+V) = dim(U) + dim(V) - dim(U∩V)

and for a linear map f:M-->M, then

M=Im(f) +Ker(f)

where the sum is direct.

Ok, probably throw in Sylvester's Law of replacement, call it 3 results.
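For reference, the first of these identities has a short standard proof by extending a basis of the intersection; a sketch (editorial addition, in the usual notation):

```latex
% Sketch: pick a basis w_1,...,w_r of U \cap V, extend it to a basis
% {w_i, u_1,...,u_s} of U and to a basis {w_i, v_1,...,v_t} of V.
% Then {w_i, u_j, v_k} is a basis of U+V (spanning is clear; linear
% independence is a short check), so
\[
  \dim(U+V) = r + s + t = (r+s) + (r+t) - r
            = \dim U + \dim V - \dim(U \cap V).
\]
```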

Finally, to give you some idea of why proper theorem/proof stuff is admirable and necessary in linear algebra, try to show that det(AB) = det(A)det(B) for arbitrary nxn matrices. This is remarkably straightforward if we use the fact that det(M) (M a linear map from V to V) is the unique number d such that the induced map

M:Lambda^n(V)--> Lambda^n(V)

is multiplication by d on the top power of the exterior algebra.
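Written out, multiplicativity then follows in one line from functoriality of the top exterior power (an editorial expansion of the definition just given):

```latex
% \Lambda^n is functorial, \Lambda^n(V) is one-dimensional, and
% \Lambda^n(M) acts on it as multiplication by det(M).  Hence for any
% \omega \in \Lambda^n(V):
\[
  \det(AB)\,\omega = \Lambda^n(AB)\,\omega
                   = \Lambda^n(A)\,\Lambda^n(B)\,\omega
                   = \det(A)\det(B)\,\omega ,
\]
% and therefore \det(AB) = \det(A)\det(B).
```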

How about this, then:

let S_n, the permutation group, act on some vector space V of dimension n by permuting some basis. Show that the subspace L spanned by e_1+e_2+...+e_n is invariant (e_i the basis vectors) and hence that the quotient V/L is also S_n invariant. (these are called representations of S_n). Try to write out a basis for V/L too.
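The first part of the exercise is easy to sanity-check numerically; a sketch with NumPy (an editorial addition, taking L to be the line spanned by e_1 + ... + e_n):

```python
import numpy as np
from itertools import permutations

n = 4
I = np.eye(n)

# The vector e_1 + ... + e_n, which spans the line L.
s = np.ones(n)

# Every permutation matrix fixes s (it only reorders equal entries),
# so the line L = span(s) is invariant under the whole S_n action.
for perm in permutations(range(n)):
    P = I[list(perm)]   # permutation matrix for this element of S_n
    assert np.allclose(P @ s, s)
```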
 
  • #26
Suddenly I'm in an argument with everyone... Well, I'd like to remove myself from this thread.

Shmoe, of course I know that abstract algebra is "really nothing like what they call algebra in high school." I would like you to stop insulting me. I am taking an honors course in formal language theory.

Matt, I qualified my statement about abstract algebra. I have done a very small amount of it in beginning discrete math type settings, and found it very boring and pedantic, but I already said that when I know more I may change my opinion. Maybe I should have withheld my opinion.

Perhaps there is also an easy proof in abstract terms (terms I have never heard of) but the fact is there also exists an easy proof in concrete terms.

Proof that det(AB) = det(A)det(B):
(with A and B nxn)
Assume that A has rank <n. Then AB has rank <n, and det(AB) = 0 = det(A)det(B).
So I only need to prove the case when A has rank n. In this case, write A as a product of elementary matrices and the identity, like this:
A = E_k E_{k-1} ... E_2 E_1 I
So AB = E_k E_{k-1} ... E_2 E_1 B

det(E_1 B) = det(E_1)det(B), because E_1 is the matrix for a row operation, by the determinant rules for row operations.
Assume det(E_j ... E_1 B) = det(E_j ... E_1)det(B). Then
det(E_{j+1} E_j ... E_1 B) = det(E_{j+1})det(E_j ... E_1)det(B).
So by induction, det(AB) = det(E_k)det(E_{k-1}) ... det(E_1)det(B).
By the same principle proved in the induction, det(E_k)det(E_{k-1}) ... det(E_1) = det(E_k E_{k-1} ... E_1) = det(A).
So det(AB) = det(A)det(B).

Was this proof longer than yours? Yes, but the idea was easy to grasp and the rest was just details.
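For what it's worth, the identity itself is easy to spot-check numerically (an editorial NumPy sanity check, not a proof):

```python
import numpy as np

rng = np.random.default_rng(0)
for _ in range(100):
    A = rng.standard_normal((4, 4))
    B = rng.standard_normal((4, 4))
    # det is multiplicative, whichever proof one prefers.
    assert np.isclose(np.linalg.det(A @ B),
                      np.linalg.det(A) * np.linalg.det(B))
```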

I cannot do your second problem because it includes terminology I have never heard of.

I really understand the value of proof and its centrality to math, I just think that it helps a lot in proof to have a good "feel" for the subject matter through manipulation of it. It's the spirit of experimental mathematics.

Anyway I'm out of this thread.
 
  • #27
You need to prove to me that every linear map is the product of elementary ones, ie you need to pick a basis. Moreover, you have not demonstrated that the answer you got was independent of the choice of decomposition into elementary matrices. Ie you have not proven that det(A) = prod(det(E_i)) (this would not be too hard, actually).

This 'basis dependent' proof also sheds no light on what a determinant (volume) is.

I think the point for me is that 'doing row operations' bears absolutely no relation to what 'interesting linear algebra' is, nor does it give any feel for why decent theorems are true. I'd also think it bears no relation to the type of maths that Jason will care about. Anything that fails to mention quotient spaces ought not to call itself a linear maths course (for mathematicians, which is Jason's viewpoint).

I can certainly not imagine any mathematician I know saying that this kind of stuff (row operations) was what made them want to do maths for a career. There are definitely things that people think of as being elegant in maths.

And finally, you asserted that your book (Lay) proved everything. If you're not happy with quotient spaces, then it certainly hasn't proved everything that it is reasonable to expect to be taught in a course that Jason would benefit from.
 
  • #28
0rthodontist said:
Shmoe, of course I know that abstract algebra is "really nothing like what they call algebra in high school." I would like you to stop insulting me. I am taking an honors course in formal language theory.

I did not mean it as an insult. You haven't taken a course in abstract algebra, and your perception of it is way off ("meaningless fiddling with symbols"), so it looked to me like you had no clue yet what abstract algebra entails. Don't take this as an insult either; I obviously don't have much more to judge your abstract algebra knowledge by than a few sentences, and there would really be nothing wrong with knowing next to nothing about a subject you haven't studied in depth.

0rthodontist said:
Perhaps there is also an easy proof in abstract terms (terms I have never heard of) but the fact is there also exists an easy proof in concrete terms.

Proof that det(AB) = det(A)det(B):

Again, I urge you to take a look at the approach in Axler's paper towards determinants. It's much more motivated in my opinion.

matt grime said:
I think the point for me is that 'doing row operations' bear absolutely no relation to what 'interesting linear algebra' is, nor does it give any feel for why decent theorems are true. I'd also think it bears no relation to the type of maths that Jason will care about. Anything that fails to mention quotient spaces ought not to call itself a linear maths course (for mathematicians, which is Jason's viewpoint).

Agreed. I found a copy of the text Jason picked up, Friedberg, et al., sitting on my shelf (I didn't even know I had it). At first glance it looks decent, and you'll be at least somewhat pleased to know that quotient spaces at least make an appearance in the exercises (starting in the first chapter), which he will certainly do now if he's paying attention here.
 
  • #29
shmoe said:
I did not mean it as an insult. You haven't taken a course in abstract algebra, and your perception of it is way off ("meaningless fiddling with symbols"), so it looked to me like you had no clue yet what abstract algebra entails. Don't take this as an insult either; I obviously don't have much more to judge your abstract algebra knowledge by than a few sentences, and there would really be nothing wrong with knowing next to nothing about a subject you haven't studied in depth.



Again, I urge you to take a look at the approach in Axler's paper towards determinants. It's much more motivated in my opinion.



Agreed. I found a copy of the text Jason picked up, Friedberg, et al., sitting on my shelf (I didn't even know I had it). At first glance it looks decent, and you'll be at least somewhat pleased to know that quotient spaces at least make an appearance in the exercises (starting in the first chapter), which he will certainly do now if he's paying attention here.

The book is fine if you don't go through each chapter.

We are going through determinants right now. WHY? I don't know. This is my third linear algebra course, and I still haven't got much further than the first course.

They keep reviewing everything. It's god damn annoying. Sure, I won't remember everything, but I know it's my responsibility to go over last year's work. For god's sake, we went over the definition of a basis. If a student doesn't know this, they should be forced to drop the course.
 
  • #30
Well, I said I'm out of this thread, and I am, but I don't think there's any harm in finishing up the purely mathematical discussion.
matt grime said:
You need to prove to me that every linear map is the product of elementary ones, ie you need to pick a basis. Moreover, you have not demonstrated that the answer you got was independent of the choice of decomposition into elementary matrices. Ie you have not proven that det(A)=prod(det(Ei) (this would not be too hard, actually).
I'm not absolutely sure what your objection is, but if you're asking how I know that A can be decomposed into E_k ... E_1 I: well, we know A can be row reduced to I since it has rank n, and we know row reductions are reversible, so I can be changed through row operations back to A, which means A can be written E_k ... E_1 I. The original proof shows that det(A) = prod(det(E_i)), because prod(det(E_i)) = det(prod(E_i)) (by the principle shown in the induction) = det(A).
 
  • #31
That was what you needed to do, now all you need to do is prove that determinants behave as you claim they do under elementary row operations. Why does adding row 1 to row 2 leave the determinant unchanged? I don't think it is obvious that it does.
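Numerically, at least, the claim checks out: the elementary matrix for "add row 1 to row 2" is unit lower triangular, hence has determinant 1. A quick NumPy spot check (an editorial aside, not the proof being asked for):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))

# E implements "add row 1 to row 2"; it is unit lower triangular,
# so det(E) = 1.
E = np.eye(3)
E[1, 0] = 1.0
assert np.isclose(np.linalg.det(E), 1.0)

# Hence this row operation leaves the determinant unchanged.
assert np.isclose(np.linalg.det(E @ A), np.linalg.det(A))
```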
 
  • #32
Proving the row operation properties of determinants is an entirely separate question.
 
  • #33
It is not entirely separate, and is in fact absolutely equivalent to the stated problem (in the sense that if det is a multiplicative homomorphism, then this implies the row operation result, and the row operation result you have shown implies the det result for all matrices).

So your elementary method of using purely row and column operations requires us to believe this result. I have no problem with appealing to a simpler result, but this result is equivalent to what we want to prove. Can you at least indicate how row and column operations behave properly with respect to determinants in your method?

I presume that you're using the definition of det as expansion by cofactors, although you haven't said, so it would also rely on us assuming that this definition actually gives zero if the rank of A is less than n too, wouldn't it? And how do you justify that this expression of det is the volume change factor?

You probably feel I'm unduly playing devil's advocate. Perhaps I am. Perhaps I am being completely unfair to you on this one; I certainly can't discount that possibility.
 
  • #34
Well, you know that Rank A < n implies that det A = 0 because if Rank A < n, A is row reducible to an upper triangular matrix with a 0 on the diagonal, whose determinant is 0, and row operations do not change nonzero determinants to zero determinants.

Proving the row operation properties is more difficult, and I can't figure it out myself. Lay proves them, though, by an induction on the size of A.
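The rank argument above is also easy to illustrate numerically (an editorial sketch assuming NumPy; the matrix is a made-up rank-2 example):

```python
import numpy as np

# A rank-deficient matrix: the third row is the sum of the first two.
A = np.array([[1., 2., 3.],
              [4., 5., 6.],
              [5., 7., 9.]])

assert np.linalg.matrix_rank(A) < 3
# Rank < n forces a zero row after reduction, so det(A) = 0.
assert np.isclose(np.linalg.det(A), 0.0)
```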
 
  • #35
It is terribly gross, but not conceptually difficult, to fill in all the questions matt is asking.

A question to matt, though: have you never seen the details of this sort of thing worked out? If not, stay that way. I had them forced on me in my first linear algebra course, and I've since had to force them on students; it's not enjoyable from either end.
 
