# Basis of a Set of Functions

In summary: Kraz asks whether the 0 here is a constant (the constant function) and whether the functions that sum to it must be zero functions. Yes, that is correct: the 0 in this context is the zero function, which outputs 0 regardless of the input, and a combination of functions equals it only if the combination outputs 0 for all inputs.

#### Kraz

Hello,

I am just doing my homework and I believe that there is a fault in the problem set.

Consider the set of functions defined by
V = { f : R → R : f(x) = a + bx for some a, b ∈ R }

It is given that V is a vector space under the standard operations of pointwise
addition and scalar multiplication; that is, under the operations

(f + g)(x) := f(x) + g(x)
(λf)(x) := λf(x).

Consider the set S = {f1, f2} ⊆ V consisting of the vectors
f1 : R → R defined by f1(x) = 1
and
f2 : R → R defined by f2(x) = x.

(a) Show that the set S is a basis of V ; that is, show that S is a linearly independent
set which spans V . Also state the dimension of V .
(b) State the coordinates (f1)S and (f2)S of the vectors f1 and f2 with respect to
the basis S

What is wrong is this: if a set of functions is linearly independent, then Af1 + Bf2 = 0 has only the trivial solution A = B = 0. But that is obviously not right here, since f1(x) = 1 and f2(x) = x, so A + Bx = 0 has solutions other than the trivial one; thus the set is not independent and S is not a basis of V.

Kraz said:
A + Bx = 0 has solutions other than the trivial one

Give an example of a non-trivial solution.

You'd have to find an A and B such that g(x) = A + Bx is the constant function g(x) = 0 because the constant function g(x) = 0 is the zero vector in this space.

Stephen Tashi said:
Give an example of a non-trivial solution.

You'd have to find an A and B such that g(x) = A + Bx is the constant function g(x) = 0 because the constant function g(x) = 0 is the zero vector in this space.

What do you mean? I am not a native English speaker, so I do not really understand. But since x is an element of R, it can take any such value, and so can the scalars: as g(x) = 0, A + Bx = g(x) for example when A = 2, B = 2 and x = -1 ... and many more.

Kraz said:
What do you mean? I am not a native English speaker, so I do not really understand. But since x is an element of R, it can take any such value, and so can the scalars: as g(x) = 0, A + Bx = g(x) for example when A = 2, B = 2 and x = -1 ... and many more.

Suppose A = 2, B = 2. The function h(x) = 2 + 2x is not the function g(x) = 0. It is true that h(-1) = 0 , but h(x) is not zero for all values of x. The function h(x) is not the zero function.
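The point above can be checked numerically (a minimal Python sketch; the function name mirrors the post): h(x) = 2 + 2x has a root at x = -1, but it is not the zero function, because it is nonzero at other inputs.

```python
# h(x) = 2 + 2x vanishes at x = -1, but it is not the zero function:
# the zero function must output 0 for EVERY input.
def h(x):
    return 2 + 2 * x

print(h(-1))                      # 0 at the single root x = -1
print([h(x) for x in (0, 1, 2)])  # [2, 4, 6]: nonzero elsewhere
```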

Stephen Tashi said:
Suppose A = 2, B = 2. The function h(x) = 2 + 2x is not the function g(x) = 0. It is true that h(-1) = 0 , but h(x) is not zero for all values of x. The function h(x) is not the zero function.

Yes, but if I put in a different x, there will always be different values of A and B such that the equation equals 0, and therefore the set is dependent.

Could you maybe elaborate on your logic? Thanks.

Do you mean that in order for the set to be not independent, I would need to find a function h(x) with specific, unchangeable nonzero values of A and B such that it is 0 for all x?

Kraz said:
Do you mean that in order for the set to be not independent, I would need to find a function h(x) with specific, unchangeable nonzero values of A and B such that it is 0 for all x?

Yes.

Stephen Tashi said:
Yes.

Thanks, but let's say we have a set of vectors (v1, v2, v3, ..., vn).

Do you agree that if this set is dependent, so that Av1 + Bv2 + Cv3 + ... = 0, then A could have different values that lead to a solution? Obviously each time A changes, the other scalars change too.

Kraz said:
Thanks, but let's say we have a set of vectors (v1, v2, v3, ..., vn).

Do you agree that if this set is dependent, so that Av1 + Bv2 + Cv3 + ... = 0, then A could have different values that lead to a solution? Obviously each time A changes, the other scalars change too.

Yes. I agree.

For example, (kA)v1 + (kB)v2 + (kC)v3 ... = 0, for any scalar k.

To use an example from this problem, let
p(x) = x + 1
q(x) = 2x + 2
r(x) = 3x + 3

The set of vectors {p(x), q(x), r(x)} is not linearly independent.

A(p(x)) + B(q(x)) + C(r(x)) = 0 (for all x) has solutions such as A = 5, B = -1, C = -1 and A = -2, B = 1, C = 0.
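The two non-trivial solutions above can be verified numerically (a small Python sketch; the function names mirror the post): each coefficient triple makes the combination vanish for every x, not just for one value of x.

```python
# p, q, r are linearly dependent: both coefficient triples below give
# A*p(x) + B*q(x) + C*r(x) = 0 for every x.
def p(x): return x + 1
def q(x): return 2 * x + 2
def r(x): return 3 * x + 3

for A, B, C in [(5, -1, -1), (-2, 1, 0)]:
    for x in range(-3, 4):
        assert A * p(x) + B * q(x) + C * r(x) == 0
print("both combinations vanish for all sampled x")
```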

Kraz said:
that would mean that Af1 + Bf2 = 0 only has the trivial solution A = B = 0, but this is obviously not right as f1(x) = 1 and f2(x) = x, so A + Bx = 0 has solutions other than the trivial one; thus the set is not independent and S is not a basis of V.
To say that ##Af_1+Bf_2=0## is to say that ##(Af_1+Bf_2)(x)=0## for all real numbers ##x##. (Note that the 0 in the first equation is a function, and the 0 in the second equation is a number). So if ##Af_1+Bf_2=0##, then for all real numbers ##x##, we have
$$0=(Af_1+Bf_2)(x)=Af_1(x)+Bf_2(x)=A+Bx.$$ Do you understand why this implies that ##A=B=0##? If you don't, then you should think about it until you do. (Hint: Think about what "for all" means).
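The "for all x" hint can be made concrete: if A + Bx = 0 for every real x, then evaluating at just x = 0 and x = 1 already forces A = B = 0. A minimal Python sketch (the helper name is hypothetical):

```python
# If A + B*x = 0 for ALL x, then in particular:
#   at x = 0:  A = 0
#   at x = 1:  A + B = 0, hence B = 0
def recover_A_B(value_at_0, value_at_1):
    A = value_at_0       # value of A + B*x at x = 0
    B = value_at_1 - A   # value at x = 1, minus A
    return A, B

print(recover_A_B(0, 0))  # (0, 0): only the trivial solution survives
```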

Fredrik said:
To say that ##Af_1+Bf_2=0## is to say that ##(Af_1+Bf_2)(x)=0## for all real numbers ##x##. (Note that the 0 in the first equation is a function, and the 0 in the second equation is a number). So if ##Af_1+Bf_2=0##, then for all real numbers ##x##, we have
$$0=(Af_1+Bf_2)(x)=Af_1(x)+Bf_2(x)=A+Bx.$$ Do you understand why this implies that ##A=B=0##? If you don't, then you should think about it until you do. (Hint: Think about what "for all" means).

Yes, I understood. I did not really know that it must hold for all x, which was a dumb mistake, as I should have realized that by looking at this equation: (Af1+Bf2)(x) = 0.

Is it then correct to say that the coordinate vector (f1)S with respect to the basis S is the column matrix ##\begin{pmatrix}1-Bx\\ B\end{pmatrix}##,

and the coordinate vector (f2)S with respect to the basis S is the column matrix ##\begin{pmatrix}0\\ 1\end{pmatrix}##?

The matrix of components of a vector ##v## with respect to the ordered basis ##(f_1,f_2)## is the matrix ##\begin{pmatrix}a\\ b\end{pmatrix}## such that ##v=af_1+bf_2##. When ##v=f_1##, this is ##\begin{pmatrix}1\\ 0\end{pmatrix}##, not ##\begin{pmatrix}1-Bx\\ B\end{pmatrix}##. (What are ##B## and ##x## here?)
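Since every vector in V has the form f(x) = a + bx, its component matrix with respect to the ordered basis (f1, f2) is just (a, b), and a and b can be recovered by evaluating f. A Python sketch (the helper name is hypothetical, and it relies on V containing only functions of this affine form):

```python
# Coordinates of f(x) = a + b*x with respect to (f1, f2), where
# f1(x) = 1 and f2(x) = x:  f(0) = a and f(1) - f(0) = b.
def coordinates(f):
    a = f(0)
    b = f(1) - f(0)
    return (a, b)

print(coordinates(lambda x: 1))          # f1 -> (1, 0)
print(coordinates(lambda x: x))          # f2 -> (0, 1)
print(coordinates(lambda x: 3 + 2 * x))  # (3, 2)
```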

Fredrik said:
The matrix of components of a vector ##v## with respect to the ordered basis ##(f_1,f_2)## is the matrix ##\begin{pmatrix}a\\ b\end{pmatrix}## such that ##v=af_1+bf_2##. When ##v=f_1##, this is ##\begin{pmatrix}1\\ 0\end{pmatrix}##, not ##\begin{pmatrix}1-Bx\\ B\end{pmatrix}##. (What are ##B## and ##x## here?)

Well, because f1(x) = 1 and f2(x) = x, so v = af1 + bf2, where v = f1 gives 1 = a + bx, so for every x and every value of a and b the coordinates are (a = 1 - bx, b = b).

Kraz said:
f1(x) = 1 and f2(x) = x,
OK, this is true for all x, because ##f_1## is the constant function that takes everything to 1 (the output is always 1), and ##f_2## is the identity function (the output is always equal to the input).

Kraz said:
so v = af1 + bf2, where v = f1 gives 1 = a + bx
You seem to be saying that if ##f_1=af_1+bf_2##, then ##1=a+bx##. But why would it be, and what is this ##x## supposed to be?

Kraz said:
so for every x and every value of a and b the coordinates are (a = 1 - bx, b = b)
I don't follow the argument at all now. What is A and B, and how is there an x involved in the coordinates?

Fredrik said:
OK, this is true for all x, because ##f_1## is the constant function that takes everything to 1 (the output is always 1), and ##f_2## is the identity function (the output is always equal to the input).

You seem to be saying that if ##f_1=af_1+bf_2##, then ##1=a+bx##. But why would it be, and what is this ##x## supposed to be?

I don't follow the argument at all now. What is A and B, and how is there an x involved in the coordinates?

I figured it out.

I have one last question, if I may ask:

Let's say you have two other vectors, g1 and g2, which form a set T, and these two vectors can be expressed in terms of the vectors of a basis S. By considering the 2 × 2 matrix A = ((g1)S (g2)S), whose columns are the coordinates of g1 and g2 with respect to the basis S, how can you deduce that the set T is also a basis of V?
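One standard way to approach this last question (the thread does not answer it, so treat this as a sketch with a hypothetical pair g1, g2): T = {g1, g2} is a basis of V exactly when the coordinate matrix A is invertible, i.e. when det A ≠ 0, because then the columns (g1)S and (g2)S are themselves linearly independent and span.

```python
# Hypothetical example: g1(x) = 1 + x and g2(x) = 1 - x have coordinate
# columns (1, 1) and (1, -1) with respect to S = (f1, f2).
# T = {g1, g2} is a basis of V exactly when det A != 0.
def det2(m):
    # determinant of a 2x2 matrix given as [[a, b], [c, d]]
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

A = [[1, 1],    # constant coefficients of g1 and g2
     [1, -1]]   # x-coefficients of g1 and g2
print(det2(A))  # -2: nonzero, so this particular T would be a basis
```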