Redundancy in definition of vector space?

In summary, a vector space V is a set endowed with two operations, addition and scalar multiplication, that satisfy eight axioms. One of these, the additive inverse axiom, states that for every vector v in V there exists a vector -v in V such that v+(-v)=0. It is tempting to call this axiom redundant, since closure under scalar multiplication suggests -v = (-1)*v; but that identity itself requires proof, and the discussion below exhibits a two-element structure satisfying every other axiom while lacking additive inverses, so the axiom is genuinely independent. It can, however, be replaced by the simpler condition 0v = 0, and, perhaps surprisingly, the commutativity axiom for vector addition turns out to be the one that is redundant.
  • #1
Aziza
According to my book, a vector space V is a set endowed with two properties:
-closure under addition
-closure under scalar multiplication

and these two properties satisfy eight axioms, one of which is:
"for all f in V there exists -f in V such that f+(-f)=0"

But then isn't this axiom redundant in describing a vector space, since we already specified that V is closed under scalar multiplication? I mean, just by closure under multiplication, we know that if f is in V, -f must be in V since -f = (-1)*f, and (-1) is a scalar..
 
  • #2
The statement:
-f =(-1)*f
is not trivial and requires a proof. The axiom is not redundant; it provides the basis for proving the above statement.
 
  • #3
I guess that's true. In more technical terms: if you assume the set is only an abelian monoid, rather than an abelian group, it follows automatically that it is a group once you have the scalar field acting on it in this way (a multiplicative group action that is distributive over vector addition). I use these technical terms because they summarize all the axioms in my head in a more organized way, so that I can wrap my head around them more easily. If this doesn't help you now, it will once you study abstract algebra. In my mind, a vector space is an abelian group with a field acting on it in a distributive way. That's a lot easier to remember than 8 axioms, but it doesn't really make sense until you know the basic definitions of abstract algebra. Then again, I don't think the abstract concept of a vector space makes that much sense until then anyway. Until then, don't worry too much about the exact definition. Better to keep the examples in mind, particularly ℝ^n and ℂ^n, and think of a vector space intuitively as a place where you can add vectors and multiply them by scalars.

Incidentally, it's weird to me that the word "closure" would be used at all in the definition. Closure presupposes there is an addition operation and a multiplication operation. Usually, the term closure is reserved for subspaces, subgroups, subrings, etc. Sub-stuff. Closure is the condition for a subset to be a vector space or some other algebraic gadget.
 
  • #4
Useful nucleus said:
The statement:
-f =(-1)*f
is not trivial and requires a proof. The axiom is not redundant; it provides the basis for proving the above statement.

But "closure under scalar multiplication" means you can multiply by negative numbers as well...
so are you saying that without this axiom, multiplication by negative numbers is not defined? So we know that (-1)*(f) exists and is in V, but we do not know it is -f without this axiom?
 
  • #5
Oh, and I forgot to say, few people really care about having the fewest possible axioms. Some of the usual axioms for a group are redundant. No one cares. Well, more or less no one.
 
  • #6
The statement:
-f =(-1)*f
is not trivial and requires a proof. The axiom is not redundant; it provides the basis for proving the above statement.

No, it is redundant. I worked it out. If you don't assume additive inverses for the vector addition, you get it for free, assuming all the other axioms. The OP is right. But again, no one cares.
 
  • #7
Okay, let's work it out.

Assume the following:

(1) (kl)v = k(lv)
(2) 1 v = v
(3) (k+l)v = kv + lv

where k and l are understood to be any scalars and v is any vector.

Assuming these, we have

v+ (-1)v = 1v + (-1)v by (2)

= (1 + (-1))v by (3)

= 0 v

It remains to show that 0 v is the 0 vector.

0 v + v = 0 v + 1 v = (0+1)v = 1v = v

But identities are unique in a monoid, so 0 v is the zero vector.

Thus, -1 v is an additive inverse for v with respect to vector addition.

QED

No need to assume additive inverses exist. But you don't gain a lot by going through this argument, just to show that it's redundant, and that's why no one bothers with it.
 
  • #8
homeomorphic said:
Okay, let's work it out.

Assume the following:

(1) (kl)v = k(lv)
(2) 1 v = v
(3) (k+l)v = kv + lv

where k and l are understood to be any scalars and v is any vector.

Assuming these, we have

v+ (-1)v = 1v + (-1)v by (2)

= (1 + (-1))v by (3)

= 0 v

It remains to show that 0 v is the 0 vector.

0 v + v = 0 v + 1 v = (0+1)v = 1v = v

But identities are unique in a monoid, so 0 v is the zero vector.

Thus, -1 v is an additive inverse for v with respect to vector addition.

QED

No need to assume additive inverses exist. But you don't gain a lot by going through this argument, just to show that it's redundant, and that's why no one bothers with it.

haha well my teacher is saying it is not redundant, so I wanted to see who is right! :D
 
  • #9
Very nice! However, I still believe that the axiom is not redundant. To clarify things, let me state the axiom again:
"For each vector v in the vector space V there is a "unique" vector -v in V such that
v+ (-v) = 0."

homeomorphic showed that:
-v = -1* (v)
but this does not show the uniqueness of the additive inverse. I will give it a try and see if I can prove the uniqueness of the additive inverse without the axiom.
 
  • #10
Probably it is better to say that homeomorphic showed that there exists an additive inverse but did not show it is unique.
 
  • #11
Aziza said:
and these two properties satisfy eight axioms

It isn't the "properties" that satisfy axioms; it's the "operations" of multiplication and addition that may satisfy axioms. The technical details depend on what the axioms say.
 
  • #12
Useful nucleus said:
Very nice! However, I still believe that the axiom is not redundant. To clarify things, let me state the axiom again:
"For each vector v in the vector space V there is a "unique" vector -v in V such that
v+ (-v) = 0."

homeomorphic showed that:
-v = -1* (v)
but this does not show the uniqueness of the additive inverse. I will give it a try and see if I can prove the uniqueness of the additive inverse without the axiom.

Suppose ##w_1## and ##w_2## are additive inverses of v. Then
##v+w_1=0##
and adding ##w_2## to both sides gives
##(v+w_1)+w_2=0+w_2=w_2.##
By commutativity and associativity, the left-hand side equals ##(v+w_2)+w_1##, and since ##w_2## is an additive inverse of v this is ##0+w_1=w_1##. Hence
##w_1=w_2##
QED
 
  • #13
The way I learned it.

For all v in V, there exists w in V such that v+w=0.

Then you prove that w=-v.
 
  • #14
homeomorphic said:
It remains to show that 0 v is the 0 vector.

0 v + v = 0 v + 1 v = (0+1)v = 1v = v

But identities are unique in a monoid, so 0 v is the zero vector.

If you want to prove that [itex]0v[/itex] is [itex]0[/itex], you should prove that [itex]0v+x=x[/itex] for all vectors [itex]x[/itex]. How do you accomplish that?
 
  • #15
homeomorphic said:
0 v + v = 0 v + 1 v = (0+1)v = 1v = v

But identities are unique in a monoid, so 0 v is the zero vector.
That inference doesn't go through: 0 v + v = v only shows that 0 v acts as an identity on the particular vector v, not on every vector, so the uniqueness of the identity in a monoid cannot be applied. Indeed, there are (nontrivial) monoids in which a + a = a for all a. What you need is a cancellation law for vector addition ( a + b = a + c --> b = c ). This condition is weaker, for monoids, than the existence of inverses, but it is not stated among the vector space axioms.

This still doesn't prove that the axiom of inverses is not redundant, but my guess is that it is, and I will try to construct an example of structure not satisfying this axiom but all the other vector space axioms...

I'll be back...
 
  • #16
Erland said:
This still doesn't prove that the axiom of inverses is not redundant, but my guess is that it is, and I will try to construct an example of structure not satisfying this axiom but all the other vector space axioms...

I'll be back...
It turns out to be easy to find such examples. Here is perhaps the simplest of them all:

Put V={0,1} and let F be an arbitrary field (for example, the field of real numbers, R). F will be our field of scalars.

We define an addition operation on V by 0+0=0 and 0+1=1+0=1+1=1. We use 0 as "zero vector" here, which makes sense since 0+0=0 and 1+0=1. Also, notice that x+x=x, for all x in V.

We define "multiplication of a scalar and a vector" simply by rx=x, for all r in F and x in V. Notice that in particular, 0·1 = 1 (the scalar 0 times the vector 1 is the vector 1).

Now, it is easy to verify that all the vector space axioms are satisfied by this structure, except the existence of inverse vectors, which is false since there is no x in V such that 1+x=0.

The axiom of existence of inverse vectors is therefore NOT redundant.

We could replace it with a cancellation law, which is apparently (but in this case not really) weaker, but this would be rather pointless.

I am not sure whether or not any of the other vector space axioms are redundant, though.
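The two-element structure above can be checked mechanically. Here is a small Python sketch (not from the thread; the names vadd and smul are invented for illustration) that verifies every vector space axiom except the existence of additive inverses, using the rationals as the scalar field:

```python
# Erland's two-element structure: V = {0, 1}, vector addition x + y = max(x, y),
# scalar action r*x = x, over the rationals as the scalar field.
from fractions import Fraction
from itertools import product

V = [0, 1]
scalars = [Fraction(n, d) for n in range(-2, 3) for d in (1, 2)]

def vadd(x, y):          # 0+0=0, 0+1=1+0=1+1=1
    return max(x, y)

def smul(r, x):          # r*x = x for every scalar r
    return x

# Addition axioms: associativity, commutativity, zero vector 0.
assert all(vadd(vadd(x, y), z) == vadd(x, vadd(y, z)) for x, y, z in product(V, repeat=3))
assert all(vadd(x, y) == vadd(y, x) for x, y in product(V, repeat=2))
assert all(vadd(0, x) == x for x in V)

# Scalar axioms: (rs)x = r(sx), 1x = x, (r+s)x = rx + sx, r(x+y) = rx + ry.
assert all(smul(r * s, x) == smul(r, smul(s, x)) for r, s in product(scalars, repeat=2) for x in V)
assert all(smul(Fraction(1), x) == x for x in V)
assert all(smul(r + s, x) == vadd(smul(r, x), smul(s, x)) for r, s in product(scalars, repeat=2) for x in V)
assert all(smul(r, vadd(x, y)) == vadd(smul(r, x), smul(r, y)) for r in scalars for x, y in product(V, repeat=2))

# The additive-inverse axiom fails: no x in V with 1 + x = 0.
assert not any(vadd(1, x) == 0 for x in V)
print("all other axioms hold; additive inverses do not exist")
```

An exhaustive check like this works because both V and the set of scalar coefficients that appear in the axioms are finite here.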
 
  • #17
Erland said:
I am not sure whether or not any of the other vector space axioms are redundant, though.
It turns out that the axiom of commutativity of vector addition (x+y=y+x) is redundant.

The axioms involving multiplication with scalars are essential in proving this commutativity from the other axioms. It would not be redundant if we had the additive structure only, since there are many nonabelian groups.
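For the record, the standard argument (a sketch, not spelled out in the thread) expands (1+1)(v+w) in two ways:

[tex](1+1)(v+w) = 1(v+w) + 1(v+w) = v + w + v + w[/tex]

[tex](1+1)(v+w) = (1+1)v + (1+1)w = v + v + w + w[/tex]

Equating the two right-hand sides and cancelling v on the left and w on the right (which uses additive inverses and associativity) gives w + v = v + w. Note that both distributive laws and 1v = v are needed, which is why the scalar axioms are essential here.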
 
  • #18
Erland said:
The axiom of existence of inverse vectors is therefore NOT redundant.

We could replace it with a cancellation law, which is apparently (but in this case not really) weaker, but this would be rather pointless.
We could also replace it with the condition

0v=0, for all v in V,

because then, homeomorphic's attempted proof will work. This could perhaps be considered as a simpler condition than the existence of additive inverses.
On the other hand, this proof depends upon the axioms involving multiplication with scalars, while if we keep the additive inverses axiom, the axioms involving vector addition only are the axioms for an abelian group. Thus the addition axioms can be summarized by just stating that V is an abelian group under addition.
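Spelled out, the completed chain under the replacement axiom 0v = 0 reads (a sketch combining the axioms already quoted):

[tex]v + (-1)v = 1v + (-1)v = (1 + (-1))v = 0v = 0,[/tex]

so (-1)v is an additive inverse of v, with no separate inverse axiom needed.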
 
  • #19
Demanding

[tex]0v=0[/tex]

and

[tex]1v=v[/tex]

as axioms would look pretty natural to me. :!)
 
  • #20
jostpuur said:
Demanding

[tex]0v=0[/tex]

and

[tex]1v=v[/tex]

as axioms would look pretty natural to me. :!)
...and then we wouldn't need -v as a basic notion, but we can define it as -v=(-1)v, and then easily prove that v+(-v)=0, as Aziza did.

Since students usually learn linear algebra before abstract algebra, and therefore don't know what an Abelian group is, this axiomatization could actually be better for them than the standard one.
 

1. What is redundancy in the definition of a vector space?

An axiom is redundant if it can be derived from the remaining axioms; removing it from the definition then changes nothing, because exactly the same structures satisfy the shorter list.

2. Why is redundancy in the definition of a vector space important?

Mostly as a matter of understanding which axioms do real work. As the thread shows, commutativity of vector addition is derivable from the other axioms, while the additive-inverse axiom is not.

3. How can redundancy in the definition of a vector space be detected?

By testing each axiom against models: if some structure satisfies every axiom except one, that axiom is independent of the others. The two-element example above does exactly this for the additive-inverse axiom.

4. Can redundancy in the definition of a vector space lead to contradictions?

No. A redundant axiom is a consequence of the others, so it cannot introduce an inconsistency; at worst it makes the definition longer than necessary.

5. Are there any benefits to keeping redundant axioms?

Yes. A slightly redundant list can be easier to state, remember, and teach, and, as noted in the thread, few people care about having the fewest possible axioms.
