# Dimension of a vector space

1. Aug 3, 2011

### 83956

1. The problem statement, all variables and given/known data
Let F be a field. Prove that the set of polynomials having coefficients from F and degree less than n is a vector space over F of dimension n.

2. Relevant equations

3. The attempt at a solution
Since the coefficients are from the field F, they are nonzero. So, if the polynomial has degree less than n, its degree is at most n-1. Thus, the basis vectors can be written as 1, u, ..., u^(n-1). Right? So then I should show that this set is indeed a basis by showing that it spans and is linearly independent, in order to conclude that the vector space has dimension n? I'm not sure how to go about that part.

2. Aug 3, 2011

### stringy

Are you familiar with the concept of a vector space isomorphism?

3. Aug 3, 2011

### 83956

The only definition in my book states:

A linear map from a vector space V to a vector space W is said to be an isomorphism if it is one-to-one and onto. We say that V is isomorphic to W if there is an isomorphism from V to W. If v1, v2, ..., vn is a basis for a vector space V over F, then the correspondence (a1, a2, ..., an) <-> a1v1 + a2v2 + ... + anvn is an isomorphism between F^n and V.

4. Aug 3, 2011

### stringy

OK, do you have any experience in working with them?

Don't fret if you don't, we can answer this question the "long way." It's just that using an isomorphism would answer this question quickly. And actually I should quit using the words 'vector space isomorphism' since we haven't proved the set of polynomials is a vector space. Yet.

5. Aug 3, 2011

### 83956

Isomorphisms are still kind of a difficult topic for me to pick up; however, I won't pick them up if I don't use them. So let's go that route, as I assume that is what my instructor may be looking for.

6. Aug 3, 2011

### stringy

Hmmmm... as I started typing I changed my mind. We can talk about isomorphisms afterwards if you would like. We'll do it the longer way first just to get used to working with vector spaces and the associated axioms.

First, our definition: A vector space is a set V over a field F. The elements of V are called 'vectors.' The elements of the field are called 'scalars.' The set V also comes equipped with two operations: vector addition and scalar multiplication. We require these two operations to satisfy several properties (they are certainly in your book). Associativity, inverses, etc.

Let P denote our set of polynomials. If we're going to show that this is a vector space over F, we need to define our operations. So, how are you going to define the sum of two vectors (polynomials)? How will you define scalar multiplication? (These are easy questions, don't think too deeply. )

Once you have those operations defined, we need to show that those properties are satisfied...

7. Aug 3, 2011

### 83956

Ok, so take two polynomials:

$$a_n x^n + a_{n-1} x^{n-1} + \cdots + a_1 x + a_0$$

$$b_n x^n + b_{n-1} x^{n-1} + \cdots + b_1 x + b_0$$

and sum:

$$(a_n + b_n) x^n + (a_{n-1} + b_{n-1}) x^{n-1} + \cdots + (a_1 + b_1) x + (a_0 + b_0),$$

where each $a_i + b_i$ is a unit.

Take "s" to be an integer, so we can multiply:

$$s(a_n x^n + a_{n-1} x^{n-1} + \cdots + a_1 x + a_0) = s a_n x^n + s a_{n-1} x^{n-1} + \cdots + s a_1 x + s a_0$$
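As an editorial aside, the two operations just written down can be sketched in code. Here I take F to be the integers mod 5 (a concrete field chosen purely for illustration; the thread's F is an arbitrary field) and store a polynomial of degree less than n as its list of n coefficients:

```python
# Hypothetical sketch: polynomials of degree < n over F = GF(5), stored as
# coefficient lists [a_0, a_1, ..., a_{n-1}].
p = 5  # prime modulus, so the integers mod p form a field

def poly_add(a, b):
    # Vector addition: add coefficients pairwise, reducing mod p.
    return [(x + y) % p for x, y in zip(a, b)]

def scalar_mul(s, a):
    # Scalar multiplication: multiply every coefficient by the scalar s.
    return [(s * x) % p for x in a]

# (2 + 3t + t^2) + (4 + 4t + 2t^2) = 1 + 2t + 3t^2 over GF(5)
print(poly_add([2, 3, 1], [4, 4, 2]))  # [1, 2, 3]
# 3 * (2 + 3t + t^2) = 1 + 4t + 3t^2 over GF(5)
print(scalar_mul(3, [2, 3, 1]))        # [1, 4, 3]
```

Note that both results are again length-3 coefficient lists with entries from the field, which is exactly the closure question that comes up with the axioms.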

8. Aug 3, 2011

### stringy

Perfect. You do need to be careful with language though. EDIT: I mean, don't assume that $s$ is an integer (the integers are not a field), it's a scalar. And the word "unit" has a very specific meaning in algebra, so try to avoid using that word unless you are actually talking about units.

Anyways, now to the vector space axioms: we need to show that vector addition makes the set into an abelian group. In other words, you need to show that the set is closed under addition, that addition is both associative and commutative, that an additive identity exists, and that additive inverses exist. Remember, these properties will be listed in your book. And note also that you're going to be using the properties of the underlying field. We need to be conscious of these.

And then we have four more to check...

Last edited: Aug 3, 2011
9. Aug 3, 2011

### 83956

This is closed under addition because each $a_i$ and $b_i$ are coefficients (scalars) from the field, so their sum will be a coefficient from the field.

Likewise, $b_i + a_i = a_i + b_i$, so addition is commutative, since they are both coefficients from the field.

Addition is associative:

$$(a_n x^n + \cdots + a_1 x + a_0) + \left[(b_n x^n + \cdots + b_1 x + b_0) + (c_n x^n + \cdots + c_1 x + c_0)\right] = \left[(a_n x^n + \cdots + a_1 x + a_0) + (b_n x^n + \cdots + b_1 x + b_0)\right] + (c_n x^n + \cdots + c_1 x + c_0)$$

... in more expanded form, which is difficult to do on here, but, anyway, since $a_i$, $b_i$, and $c_i$ are coefficients of the field, their sums are coefficients of the field.

Additive identity exists:

$$0 + (a_n x^n + \cdots + a_1 x + a_0) = a_n x^n + \cdots + a_1 x + a_0$$

And then for multiplication:

$$\left[(a_n x^n + \cdots + a_1 x + a_0) + (b_n x^n + \cdots + b_1 x + b_0)\right] u = (a_n x^n + \cdots + a_1 x + a_0)u + (b_n x^n + \cdots + b_1 x + b_0)u$$

... and expand, where essentially $a_i$, $b_i$, and $u$ are coefficients of the field, so their product is a coefficient of the field.

... check for $a(bu) = (ab)u$, where coefficient multiples are coefficients of the field.

$$1 \cdot (a_n x^n + \cdots + a_1 x + a_0) = a_n x^n + \cdots + a_1 x + a_0$$

$$0 \cdot (a_n x^n + \cdots + a_1 x + a_0) = 0$$
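As another editorial aside, the axiom checks above can be spot-checked numerically. This sketch again assumes F = GF(5) and n = 3, neither of which is fixed by the thread:

```python
# Hypothetical spot-check of the vector space axioms over F = GF(5), n = 3.
import random

p, n = 5, 3
random.seed(0)

def add(a, b):
    return [(x + y) % p for x, y in zip(a, b)]

def smul(s, a):
    return [(s * x) % p for x in a]

zero = [0] * n  # the zero polynomial: all coefficients zero
for _ in range(100):
    a, b, c = ([random.randrange(p) for _ in range(n)] for _ in range(3))
    s, t = random.randrange(p), random.randrange(p)
    assert add(a, b) == add(b, a)                             # commutative
    assert add(a, add(b, c)) == add(add(a, b), c)             # associative
    assert add(zero, a) == a                                  # additive identity
    assert add(a, smul(p - 1, a)) == zero                     # additive inverse: -a = (p-1)a
    assert smul(s, add(a, b)) == add(smul(s, a), smul(s, b))  # s(a+b) = sa + sb
    assert smul(s, smul(t, a)) == smul((s * t) % p, a)        # s(ta) = (st)a
    assert smul(1, a) == a                                    # 1 * a = a
print("all axiom spot-checks passed")
```

A passing run is of course no proof, but it is a quick sanity check that the coefficient-wise definitions behave like a vector space.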

10. Aug 4, 2011

### stringy

Yup. Since the sum of two scalars (two field elements) is still a scalar in F, then the sum of two polynomials with coefficients from F is another polynomial with coefficients from F. Good, good...

We should also probably mention closure under scalar multiplication.

Exactly.

Since the scalars are field elements and addition is associative in F, vector addition will be associative. You could probably condense everything a bit if you used summation notation. But you have the right idea.

The additive identity in the vector space is defined to be the polynomial with all coefficients equal to the additive identity in F (which is commonly denoted by zero). Add the two polynomials and note that 0 is the additive identity in F.

Are you letting u be a scalar from F? It's usually the other way around: use bold face for vectors and normal type for scalars. In any case, here you are showing that scalar multiplication is distributive over vector addition.

Expand the left-hand side by first adding the polynomials and then multiplying by the scalar. Show that this equals the right-hand side. This makes use of the distributive property of F.

Is u now a vector (polynomial)? This one is pretty straightforward to show.

This is showing that the multiplicative identity in F (usually denoted by 1) plays the part of a scalar multiplicative identity in the vector space. Multiply the polynomial by 1 and then note that 1 is the multiplicative identity for the coefficients.

This is the property that multiplying a vector by the additive identity in F gives the zero vector (the additive identity in the vector space). Note that the zero on the left is a scalar and the zero on the right is a vector! They are not the "same" zero.

And does your book have this listed as an axiom? Hmmm.. This is actually a property that can be derived from the other axioms.

But we are missing an axiom. We need the other distributive property:

$$(c_1 + c_2) p(t) = c_1 p(t) + c_2 p(t),$$

where p(t) is a vector and the c's are scalars. After this, we will have shown that this set is indeed a vector space. All that is left after that is computing its dimension.
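Written out coefficient-wise (an editorial expansion, using the same $p(t)$ notation), this last axiom reduces to distributivity in F:

$$(c_1 + c_2)\sum_{i=0}^{n-1} a_i t^i = \sum_{i=0}^{n-1} (c_1 + c_2)\,a_i\, t^i = \sum_{i=0}^{n-1} (c_1 a_i + c_2 a_i)\, t^i = c_1 \sum_{i=0}^{n-1} a_i t^i + c_2 \sum_{i=0}^{n-1} a_i t^i.$$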

I'm going to check out for the night, but remember how dimension is defined: it is the number of vectors in any basis. The key word is "any." All bases for a finite-dimensional vector space have the same number of vectors. To show that it has dimension n, find a spanning set of n vectors that is also linearly independent. There is a very natural choice...think of the standard basis for coordinate space.
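Spelling out that "very natural choice" (an editorial sketch, not part of the original exchange): take the monomials

$$\{1,\, t,\, t^2,\, \ldots,\, t^{n-1}\}.$$

They span, since every $p(t) = a_0 + a_1 t + \cdots + a_{n-1} t^{n-1}$ of degree less than $n$ is by definition a linear combination of them. They are linearly independent, since $c_0 + c_1 t + \cdots + c_{n-1} t^{n-1} = 0$ (the zero polynomial) forces every $c_i = 0$, because a polynomial is the zero polynomial exactly when all of its coefficients are zero. A basis of $n$ vectors gives dimension $n$.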

Last edited: Aug 4, 2011