Help me pick the better textbook

1. Dec 12, 2004

daster

I've been looking at https://www.amazon.com/exec/obidos/.../102-8371051-2688163?_encoding=UTF8&v=glance by G. Shilov, but I don't know which of the two books below to pick.

Here are their TOCs:

An Introduction to Linear Algebra - L. Mirsky
Part 1 - Determinants, Vectors, Matrices, and Linear Equations
I. Determinants
II. Vector Spaces and Linear Manifolds
III. The Algebra of Matrices
IV. Linear Operators
V. Systems of Linear Equations and Rank of Matrices
VI. Elementary Operations and the Concept of Equivalence
Part 2 - Further Development of Matrix Theory
VII. The Characteristic Equation
VIII. Orthogonal and Unitary Matrices
IX. Groups
X. Canonical Forms
XI. Matrix Analysis
XII. Bilinear, Quadratic, and Hermitian Forms
XIII. Definite and Indefinite Forms

Linear Algebra - G. Shilov
I. Determinants
II. Linear Spaces
III. Systems of Linear Equations
IV. Linear Functions of a Vector Argument
V. Coordinate Transformations
VI. The Canonical Form of the Matrix of a Linear Operator
VII. Bilinear and Quadratic Forms
VIII. Euclidean Spaces
IX. Unitary Spaces
X. Quadratic Forms in Euclidean and Unitary Spaces
XI. Finite-Dimensional Algebras and their Representations

I'd appreciate any help. And if you pick one, could you please explain your choice? I'm also open to other suggestions, but please keep in mind the following:
1. I'm on a very limited budget
2. I'm only a high school senior, so nothing too advanced!

Thank you.

2. Dec 12, 2004

Atom-Go-Boom!

I think you should use the Saxon Math Algebra series. It's very good and explains each area very nicely.

3. Dec 18, 2004

mathwonk

please disregard the suggestion to use any book by saxon. forgive me atomgoboom, but saxon's books have a very limited objective: just to give you enough technical manipulative skill to pass trivial standardized tests. they offer no insight and no motivation. over my objections, my son's fine private school used them for years, until they finally concluded from experience that "students who studied from saxon didn't understand anything" and dropped them, but only after ruining the mathematics preparation of hundreds of their excellent (and average) students.

the only good thing about them is their emphasis on drill and repetition. unfortunately what is being repeated is unimportant. there is no use of the imagination, no insight into discovery, reasoning, or meaning. no one ever wants to become a mathematician after using them, as math is portrayed as ugly and boring.

these books exploited a weakness in the whole concept of standardized testing. indeed it seems true that they increased scores, which was merely a testament to the fact that standardized tests are trivial and nearly useless for measuring understanding of a subject.

never learn math from someone who basically dislikes math. If you saw the Karate Kid, learning math from saxon is like learning karate at the "Y" instead of from Mr. Miyagi, assuming you don't want to get your head handed to you in college.

I am not familiar with the books you mention, but shilov is a famous mathematician and probably wrote a fine book. there are many excellent cheap books available used and in paperback, such as the one by dettman. if you can find a copy of the old high school text written in the 60's by the school mathematics study group, out of yale, try that.

there is no need to own the books. you can visit a neighboring college library, which is most likely to have such old smsg classics as i am mentioning. but mirsky and shilov are probably comparable, i.e. well written but somewhat expert oriented books from past decades, when math learning was taken seriously and intended for serious students who would work at it, and who knew something about geometry and proof.

another excellent book, written by john kelley for his tv class continental classroom, low level and yet sophisticated in its ambition, is introduction to modern algebra.

with books as cheap as these, you cannot go wrong. these listings are from abebooks.com:

An Introduction to Linear Algebra (ISBN 0486615472)
Mirsky, L. Dover, 1982. Soft cover, good condition. US$2.78. Bookseller: Booklibrary (Long Beach, CA, U.S.A.)

Linear Algebra (ISBN 048663518X)
Shilov, Georgi E.; Silverman, Richard A. (translator). Dover Publications, Mineola, NY, 1977. Trade paperback, very good. US$6.95. Bookseller: Bookwagon (Ashland, OR, U.S.A.)

Introduction to Modern Algebra
Kelley, John L. D. Van Nostrand, 1960. Trade paperback; official textbook for Continental Classroom. Moderate shelf wear, inside clean with good binding. US$8.50. Bookseller: Lawson Books (Bartlesville, OK, U.S.A.)

Mathematics for High School - Introduction to Matrix Algebra (rev. ed.)
School Mathematics Study Group. Yale University Press, 1960. Paperback, near very good, 231 pp. US$14.95. Bookseller: BookBuyers OnLine (San Jose, CA, U.S.A.)

4. Dec 19, 2004

mathwonk

here is a primer of linear algebra:

the idea is to study phenomena where the output varies in the same way as the input. e.g. derivatives. the derivative of a sum is the sum of the derivatives, and the derivative of a scalar multiple is that scalar multiple of the derivative.

thus if you know that sin' = cos and cos' = -sin, then you also know that

(a sin + b cos)' = a cos - b sin. this gives you an infinite 2 dimensional family of solutions, generated by just two "independent" solutions of the same problem.

as a result you can systematize this process of taking derivatives for this family of functions into a "matrix" of numbers. i.e. we can represent a sin + b cos just by (a,b), and the fact that a sin + b cos goes to -b sin + a cos by the 2x2 matrix with rows

[0 -1] and [1 0]. i.e. to get the two coefficients (-b,a) of the output -b sin + a cos, you just dot the coefficient vector (a,b) of the input with the two row vectors [0 -1] and [1 0]. many other operations have this same matrix. e.g. rotation of the plane about the origin also takes vector sums to vector sums, and scalar multiples to scalar multiples. the matrix above is also the matrix for a 90 degree counterclockwise rotation.
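this 2x2 example is easy to check directly. here is a small python sketch (the helper `apply_matrix` is just for illustration, not from any text):

```python
# matrix of d/dx on the 2-dimensional space spanned by sin and cos,
# representing a*sin + b*cos by its coefficient vector (a, b).

def apply_matrix(rows, v):
    """dot each row of the matrix with v to get the output coefficients."""
    return tuple(sum(r * x for r, x in zip(row, v)) for row in rows)

D = [(0, -1), (1, 0)]  # rows [0 -1] and [1 0]

# d/dx (a*sin + b*cos) = -b*sin + a*cos, i.e. (a, b) -> (-b, a)
print(apply_matrix(D, (3, 5)))   # (-5, 3)

# the same matrix rotates plane vectors 90 degrees counterclockwise
print(apply_matrix(D, (1, 0)))   # (0, 1)
```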

so linear algebra and matrix theory throw out what the operations are and remember only their properties, i.e. simply that L is linear if L(x+y) = L(x) + L(y) and L(cx) = cL(x).

Then we ask how much we can say about such operations. can we classify them?

all two dimensional spaces of inputs can be regarded as essentially the same as the plane, and so all linear operations on two dimensional spaces can be regarded as essentially operations on vectors in the plane. if there is any way to assign a notion of "length" to the objects of a 2 dimensional space such that the given linear operation preserves that length, then in fact the linear operation is essentially equivalent to a rotation, a reflection, or some combination of these. this is essentially the same as saying the matrix of the operation has its inverse equal to its transpose.
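the "inverse equals transpose" condition is easy to verify concretely. a pure-python sketch (the helpers `transpose` and `matmul` are illustrative):

```python
import math

def transpose(M):
    return [list(col) for col in zip(*M)]

def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

t = math.pi / 6
rotation   = [[math.cos(t), -math.sin(t)],
              [math.sin(t),  math.cos(t)]]
reflection = [[1.0, 0.0],
              [0.0, -1.0]]          # reflect across the x-axis

# both preserve length, and for both Q^T Q = I, i.e. inverse = transpose
for Q in (rotation, reflection):
    QtQ = matmul(transpose(Q), Q)
    ok = all(abs(QtQ[i][j] - (1.0 if i == j else 0.0)) < 1e-12
             for i in range(2) for j in range(2))
    print(ok)   # True
```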

so there are two ideas in linear algebra. one is dimension, and the study of whether an operation preserves dimension or diminishes it, with the consequent theory of adding a general solution of the homogeneous problem Lx = 0 to any particular solution of the special problem Lx = b to get the general solution of that problem. the second is the study of the structure of operations that preserve dimension, i.e. isomorphisms.

since the simplest linear operations are scalar multiplications, the first task of classification is to understand all those linear operations that are composed of sums of these simplest operations: the operations whose matrices are diagonalizable. in this situation each basis vector is acted on by the operation via scalar multiplication, and such basis vectors are called eigenvectors. that's all they are: objects on which the operation acts via scalar multiplication. if there are enough of them to express all other vectors, then the operation is diagonalizable.

the usual condition guaranteeing this is some kind of notion of angle and then the assumption that the operation is angle preserving. over the reals this does not quite do it and you have to allow also rotations, but over the complexes rotations also are scalar multiples.

so there are two basic theorems in linear algebra. the most basic is the dimension theorem: if V is a space of dimension n, and L:V-->V is a linear operator, then the dimension of the space of objects y such that Lx = y has a solution equals n minus the dimension of the space of those x such that Lx = 0. then for any such y, if Lu = y and Lx = 0, we also have L(u+x) = y, and every solution of Lw = y has the form u + x for some x with Lx = 0.
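both halves of the dimension theorem can be checked numerically. a numpy sketch, using an assumed rank-2 example matrix:

```python
import numpy as np

L = np.array([[1., 2., 3.],
              [2., 4., 6.],     # twice the first row, so the rank drops
              [1., 0., 1.]])

n = L.shape[1]
rank = np.linalg.matrix_rank(L)   # dimension of the image
nullity = n - rank                # dimension of the kernel
print(rank, nullity)              # 2 1

b = L @ np.array([1., 1., 1.])    # a b for which Lx = b is solvable
u = np.linalg.lstsq(L, b, rcond=None)[0]   # one particular solution

x = np.array([1., 1., -1.])       # a kernel vector: L @ x == 0
assert np.allclose(L @ x, 0)
assert np.allclose(L @ (u + x), b)   # u + (kernel vector) also solves it
```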

the second basic theorem is that if the matrix of an operation on some n dimensional space is symmetric about the main diagonal, then there is a basis of eigenvectors.
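a numpy sketch of this symmetric case (the example matrix is assumed for illustration):

```python
import numpy as np

S = np.array([[2., 1.],
              [1., 2.]])         # symmetric about the main diagonal

vals, vecs = np.linalg.eigh(S)   # eigh is for symmetric/hermitian matrices
print(vals)                       # [1. 3.]

# the columns of vecs are eigenvectors forming an orthonormal basis
assert np.allclose(vecs.T @ vecs, np.eye(2))

# S acts on each eigenvector by scalar multiplication
for c, v in zip(vals, vecs.T):
    assert np.allclose(S @ v, c * v)
```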

in the tables of contents above, for mirsky and shilov, the sections called "canonical forms" are about finding the best way to decompose an operation into simpler ones, and the sections on "quadratic" or "hermitian" forms, are about real and complex notions of length and angles, so you can state and prove the results telling when that operation is diagonalizable.

the determinant is a very useful device for finding the eigenvalues of L in a finite dimensional space. it yields a polynomial, called the characteristic polynomial, whose roots are those scalars that occur as the scalar multiplier for some eigenvector. i.e. there is a nonzero vector v such that Lv = cv if and only if c is a root of the characteristic polynomial.
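a numpy sketch for an assumed 2x2 example; the trace/determinant formula below is the 2x2 special case of det(L - cI):

```python
import numpy as np

L = np.array([[4., 1.],
              [2., 3.]])

# characteristic polynomial (2x2 case): c^2 - trace(L) c + det(L)
coeffs = [1.0, -np.trace(L), np.linalg.det(L)]   # c^2 - 7c + 10
roots = np.roots(coeffs)                          # approximately 5 and 2

# the roots are exactly the eigenvalues, i.e. the scalars c with Lv = cv
assert np.allclose(sorted(roots), sorted(np.linalg.eigvals(L)))
v = np.array([1., 1.])            # L @ v = (5, 5) = 5 * v
assert np.allclose(L @ v, 5.0 * v)
```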

so there are three levels of linear algebra. first, they tell you about dimension and rank of matrices. second, they define eigenvalues and tell you about operations whose matrices can be chosen to be diagonal. finally, the third level tells you what the matrices can be made to look like even when they cannot be made diagonal. there are two answers to this last question.
1) over any field there exists a basis made up of blocks v1,...,vn, such that the operator acts "cyclically" on each block: i.e. Lv1 = v2, Lv2 = v3, ..., Lvn-1 = vn, but then it breaks down and we can only say that
Lvn = some linear combination of v1,...,vn.

the associated "companion matrix" has all the first n-1 columns looking just like all zeroes except for one 1 below the main diagonal, and the last column is the coefficient vector of that arbitrary linear combination.

this is called "rational canonical form".
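a small numpy sketch of such a companion matrix, with an assumed combination Lv3 = 2v1 - 3v2 + v3:

```python
import numpy as np

# first n-1 columns: all zeros except a 1 below the main diagonal;
# last column: the coefficients of the arbitrary linear combination
C = np.array([[0., 0.,  2.],
              [1., 0., -3.],
              [0., 1.,  1.]])

v1, v2, v3 = np.eye(3)
assert np.allclose(C @ v1, v2)                  # L v1 = v2
assert np.allclose(C @ v2, v3)                  # L v2 = v3
assert np.allclose(C @ v3, 2*v1 - 3*v2 + v3)    # the "breakdown" step
```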

2) over an algebraically closed field, there is a basis made up of blocks of form v1,....,vn such that these are almost eigenvectors. i.e. for each such block there is a number c such that we do not quite have Lvi = cvi, but we do have Lv1 = cv1, then Lv2 = v1 + cv2, Lv3 = v2 + cv3, ...,Lvn = vn-1 + cvn.

then the corresponding matrix block has all c's on the diagonal, but also has 1's just above the main diagonal.

this form is called jordan canonical form.
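a numpy sketch of a 3x3 jordan block with an assumed eigenvalue c = 5:

```python
import numpy as np

c = 5.0                      # the (repeated) eigenvalue
J = np.array([[c, 1., 0.],   # c's on the diagonal,
              [0., c, 1.],   # 1's just above it
              [0., 0., c]])

v1, v2, v3 = np.eye(3)
assert np.allclose(J @ v1, c * v1)        # v1 is a genuine eigenvector
assert np.allclose(J @ v2, v1 + c * v2)   # v2 is only "almost" one
assert np.allclose(J @ v3, v2 + c * v3)
```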

that pretty much covers the entire linear algebra curriculum. the first hurdle is to understand the concept of dimension, or equivalently to understand linear independence.

the next chapter is multilinear algebra. i.e. quadratic and hermitian forms are actually bilinear forms, in that you multiply two vectors together and get a number, in a somewhat symmetric way. the determinant is an n-linear alternating form, in that you multiply n vectors in an n dimensional space together and get a number, in an antisymmetric way. the general concept of multiplying together any finite number of vectors and getting a vector or number, without assuming symmetry or antisymmetry, is called multilinear algebra, or tensor algebra. there is a thread devoted to families of these tensors, which come up in differential geometry and hence modern (i.e. post-einstein) physics.
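the multilinearity and antisymmetry of the determinant in its rows can be checked numerically; a numpy sketch on a random matrix:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))

# multilinear in each row: scaling one row scales the determinant
B = A.copy()
B[0] *= 2.0
assert np.isclose(np.linalg.det(B), 2.0 * np.linalg.det(A))

# alternating: swapping two rows flips the sign
C = A.copy()
C[[0, 1]] = C[[1, 0]]
assert np.isclose(np.linalg.det(C), -np.linalg.det(A))
```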

then in algebraic geometry and complex manifolds, these families of tensors, or tensor fields, are themselves considered as objects in a sheaf, and there are cohomology groups defined with coefficients in these sheaves; this tool is called global linear algebra. cohomology groups give a way of mapping certain problems linearly into a space of obstructions, such that the solvable problems are the ones that map to zero. then dimension theory can sometimes give easy criteria for solvability of the problem, for instance if the space of obstructions has lower dimension than the space of problems. for instance, on a compact riemann surface of genus g the space of obstructions to the mittag-leffler problem has dimension g, so any system of polar data of degree greater than g admits a meromorphic function having poles in that system.

5. Dec 19, 2004

houserichichi

I'll add my two cents... my introductory book on linear algebra was by Friedberg (ISBN 0135370191), which I found to be very helpful and easy to read. It's pretty theoretical (I hate applied math, so I lucked out), but take a look and see if you can find a copy in the library, since the price was a tad high (at least at my university bookstore... buggers).

6. Dec 19, 2004

ComputerGeek

I liked the textbook I had for my linear algebra class.

It dealt with all the foundational understanding that one will need for all aspects of linear algebra, and then talked about the different applications and manipulations in linear algebra.

the book is called: Linear Algebra and Its Applications by David C. Lay.

7. Dec 20, 2004

gazzo

If you're interested...

Gilbert Strang (who wrote a book on Linear Algebra) has his lectures on linear algebra at MIT on the net, and you can watch them for free:

http://ocw.mit.edu/OcwWeb/Mathematics/18-06Linear-AlgebraFall2002/VideoLectures/index.htm [Broken]

His book, and Anton's book (Elementary Linear Algebra), are quite good (the university here uses Anton's as a standard text).

Is Shilov's book good? He seems like a good author; I like his book on analysis.

8. Dec 20, 2004

Cyrus

I had Lay too; it's a good book, but basic at best.

9. Dec 20, 2004

I recommend you check out those Strang lectures; they're good.

My textbook was "Elementary Linear Algebra" by Anton, and I didn't like it.

10. Dec 21, 2004

daster

Thanks for the replies!

I've already seen Strang's lectures. I've also read through his book and I wish I could get it, but it's too expensive right now. Same applies to Anton.

After putting much thought into this, I decided to get Mirsky's book in addition to Shilov's elementary analysis book. I hope I don't regret it!
