Cleaning up vocabulary and concepts

JRPB
I'm a physics student (undergrad) studying Linear Algebra for the first time. I'm writing down my thought process, hoping that someone with more experience can verify my conclusions. I feel that the narration is more clear than my original attempt to present this as a series of questions.

"A vectorial quantity is one that has direction and magnitude". This is something I dragged along with me for some time. And it's not entirely correct.

Now I [think I] know that a vector is an object that lives in a linear space. The linear space itself is generated by a special set of objects (a basis), and it is defined over a broader structure (a field; say, the real numbers). The elements that make up the basis belong to the space itself, while the scalars come from that field. Some properties are particular to certain linear spaces and don't necessarily translate or exist in other spaces.

In physics, what we call a vector (the guy with a hat) usually refers to an element of the linear space R^n, which happens to have a geometric interpretation with properties of its own: the distance between two points, the magnitude and direction of a vector. These properties become meaningless (or at least need extra structure) in spaces built from different objects, for instance the linear space of polynomials of degree at most n, the linear space of n×m matrices, etc.
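To make that concrete for myself, here is a rough sketch (my own toy example, using numpy for convenience, so the names p, q, A, B are all made up) treating polynomials and matrices as vectors: they can be added and scaled, but there is no built-in arrow-style magnitude or direction unless you also pick a norm.

```python
import numpy as np

# Polynomials of degree <= 2, stored as coefficient arrays (c0 + c1*x + c2*x^2).
p = np.array([1.0, 0.0, 2.0])   # 1 + 2x^2
q = np.array([0.0, 3.0, -1.0])  # 3x - x^2

# Vector-space operations: addition and scalar multiplication.
print(p + q)        # [1. 3. 1.]  -> 1 + 3x + x^2
print(2.5 * p)      # [2.5 0.  5. ]

# 2x2 matrices form a vector space too.
A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.eye(2)
print(A + B)
print(-1.0 * A)

# "Magnitude" is not part of the vector-space structure itself;
# it only appears once we choose a norm, e.g. the Frobenius norm for matrices.
print(np.linalg.norm(A))
```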

Is this a linear space?

Space: Robots
Basis: the different robot parts
Zero robot: a special "partless" robot, added to satisfy the requirements of a linear space.

All the vectors living in that space are made up of linear combinations of parts, each one defined by its own unique combination. I could arbitrarily define some property called "type" as a function of the count of certain kinds of parts in a given robot (vector). With my newly defined property I can generate a subspace, call it "domestic robots", which by definition includes the zero robot. And so on...
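If I force the robot idea into this picture by encoding each robot as a vector of real part counts (a big assumption, and a purely hypothetical encoding of my own), the "domestic robots" subspace idea would look something like this sketch:

```python
import numpy as np

# Hypothetical encoding: a "robot" is a vector of part counts [wheels, arms, sensors],
# with real coefficients so there is an actual field of scalars to work with.
zero_robot = np.zeros(3)

def is_domestic(robot):
    # My made-up linear condition defining "domestic": wheels == 2 * arms.
    return np.isclose(robot[0] - 2 * robot[1], 0.0)

r1 = np.array([4.0, 2.0, 1.0])
r2 = np.array([2.0, 1.0, 5.0])

print(is_domestic(zero_robot))            # True: the zero robot is in the subspace
print(is_domestic(r1), is_domestic(r2))   # True True
print(is_domestic(r1 + r2))               # True: closed under addition
print(is_domestic(3.0 * r1))              # True: closed under scaling
print(is_domestic(np.array([1.0, 1.0, 0.0])))  # False: not a domestic robot
```

The point being that a subspace cut out by a homogeneous linear condition automatically contains the zero robot and is closed under addition and scaling.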

And last but not least, one about notation. The "hat thingy" or boldface in vector notation is not mandatory as long as your variables are properly defined, i.e., as long as you specify which linear space they belong to. I understand why it makes life easier in physics to make that distinction very explicit; I guess you have to place yourself in the proper context. In some cases it doesn't make much sense to "hat" a vector (like one living in a polynomial space). So, in principle, I could drop the vector hat without being sloppy or informal; it's valid when the situation calls for it.

Also, from what I know, the concept of a vector predates that of a linear space, and linear algebra in general borrows from lots of places (at least on the surface; I'm far from an expert). This is what makes me doubt my conclusions.

Unlearning things that you thought you knew is fun. The same thing happened to me when I started reading more serious calculus books (Spivak, Hasser, Courant, etc). I no longer "know" what a number is :smile:.

Thanks in advance.
 
It seems like you get the idea of what an algebraic structure is, but I don't think your example fully holds up. For a vector space to be a vector space, you need an underlying field (of scalars). Vector spaces may be harder to generalize into everyday ideas; the underlying field requirement kind of forces you to be working with some sort of "number system". (The idea of an algebraic structure in general, though, is not hard to illustrate with everyday ideas at all! Look up a group: http://en.wikipedia.org/wiki/Group_(mathematics)#Definition ). The fact that you need an underlying field may make vector spaces a troublesome structure for modeling robots.
But look at the definition of a group; it is very general. You have a set, and you need an operation * between your elements such that (a*b)*c = a*(b*c), an identity "unit element" (call it e) such that a*e = a, and an inverse for each element, so that doing an action can be undone, i.e. a*a^-1 = e. And of course, everything must be "closed" under this operation (so that applying it to two things gives something that is still in your set).
The more general your algebraic structure is, the easier it will be to find examples of it. What is an example of a group? Well, think about the set of all permutations of a collection (permutations are rearrangements of things; for example, adcb is a permutation of abcd). You have an operation (specifically, composing permutations, i.e. applying one rearrangement after another), you have an identity element (the "do nothing" permutation/arrangement), and you can check that composition satisfies (a*b)*c = a*(b*c); that is, how you bracket the permutations doesn't matter (this is called associativity).
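If it helps to see it with your hands, here is a rough Python sketch (my own toy encoding: a permutation of n symbols is a tuple p, read as the function i -> p[i]) that checks the group axioms on a few examples:

```python
from itertools import permutations

def compose(p, q):
    # Apply q first, then p; a permutation is a tuple mapping i -> p[i].
    return tuple(p[q[i]] for i in range(len(p)))

def inverse(p):
    # Build the inverse permutation: if p sends i to v, the inverse sends v to i.
    inv = [0] * len(p)
    for i, v in enumerate(p):
        inv[v] = i
    return tuple(inv)

identity = (0, 1, 2, 3)   # the "do nothing" permutation on 4 symbols
a = (1, 0, 2, 3)          # swap the first two symbols
b = (0, 2, 3, 1)          # cycle the last three symbols
c = (3, 2, 1, 0)          # reverse everything

# Group axioms, checked on these examples:
print(compose(compose(a, b), c) == compose(a, compose(b, c)))  # associativity
print(compose(a, identity) == a)                               # identity element
print(compose(a, inverse(a)) == identity)                      # inverses
# Closure: composing any two of the 4! permutations gives another permutation.
all_perms = set(permutations(range(4)))
print(all(compose(p, q) in all_perms for p in all_perms for q in all_perms))
```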
We can get even more general than this. There is a thing called a monoid, which is like a group, but all you need is associativity, an identity element, and closure; inverses are not required.

You can take words as a monoid: words are the elements, and the operation is sticking words together (that is, "bob" + "town" = "bobtown"). You can see that this forms a monoid, with the identity element being "" (the empty word).
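A quick illustration (just a toy sketch, nothing deep): Python strings under concatenation behave exactly like this word monoid, with "" as the identity and no inverses, which is what keeps it from being a group.

```python
# Strings under concatenation form a monoid.
a, b, c = "bob", "town", "ship"

print((a + b) + c == a + (b + c))   # associativity: "bobtownship" either way
print(a + "" == a and "" + a == a)  # "" (the empty word) is the identity element
print(isinstance(a + b, str))       # closure: concatenation yields another string
# No inverses: there is no string w with "bob" + w == "", so this is not a group.
```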

Welcome to algebra :) This is where you may come across the idea that mathematics is kind of like the science of everything; well, it's pretty much true!
 
Thanks a lot. Group theory is definitely something I'll start studying soon.

"For a vector space to be a vector space, you need to have an underlying field ( of scalars )." I didn't know that and it makes a big difference.

"[...]like the science of everything; well, it's pretty much true!" I was surprised to see how closely related computer science and linear algebra are.

Thanks again, wisvuze.
 