Linear Independence and Linear Functions

  • #1
catscradle
I need some help with examples. Especially number 2.

1) Name a subset of R squared which is closed under vector addition and additive inverses but is not a subspace.

I think I got this one. {(x,y) st x,y are elements of integers} because this isn't closed under scalar multiplication

2) Give an example of a function f: R squared --> R that satisfies f(av) = af(v) for all a in R and all v in R squared, but f is not linear.

I don't really know what a linear function is. I tried f=0V, but my teacher said that's wrong. Any advice?

3) Find a set of 4 linearly independent vectors in the vector space L(R squared, R squared)

I think {3x+1, 4x, 8x^2 - x, 12x^3 - x^2} is an answer.
 
  • #2
catscradle said:
I need some help with examples. Especially number 2.

1) Name a subset of R squared which is closed under vector addition and additive inverses but is not a subspace.

I think I got this one. {(x,y) st x,y are elements of integers} because this isn't closed under scalar multiplication
Yes, that's correct. Very good!
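For instance, (1, 0) is in this set, but (1/2)(1, 0) = (1/2, 0) is not, while sums and negatives of integer pairs are again integer pairs.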

2) Give an example of a function f: R squared --> R that satisfies f(av) = af(v) for all a in R and all v in R squared, but f is not linear.

I don't really know what a linear function is. I tried f=0V, but my teacher said that's wrong. Any advice?
A linear function (and if you were given this problem you were certainly expected to know what it is; perhaps you know them as "linear operators" or "linear transformations") is a function f defined on a vector space such that f(u + v) = f(u) + f(v) and f(av) = af(v), where u and v are vectors and a is a scalar. Since those are exactly the operations on a vector space, linear functions "play nicely" with vectors and are the most important functions on vector spaces.

f = 0_V (that is, f(v) = 0 for all vectors v) is linear, which is why your teacher said that answer is wrong. Since f(av) = af(v) is one of the requirements for a linear function, it must be the other requirement, f(u + v) = f(u) + f(v), that is violated. I'm not sure I can give a good "hint" here, so I'll just say: consider the function [itex]f((x,y)) = \sqrt[3]{x^2 y}[/itex].
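If you want to convince yourself numerically, here is a minimal Python sketch (the vectors u, v and the scalar a are arbitrary choices of mine) showing that this f scales correctly but is not additive:

[code]
import numpy as np

def f(v):
    # Hinted function: f(x, y) = cube root of x^2 * y
    # (np.cbrt gives the real cube root, even for negative arguments)
    x, y = v
    return np.cbrt(x**2 * y)

u = np.array([1.0, -4.0])
v = np.array([2.0, 3.0])
a = 5.0

print(f(a * v), a * f(v))      # equal: f(av) = a f(v) holds
print(f(u + v), f(u) + f(v))   # not equal: f(u + v) != f(u) + f(v), so f is not linear
[/code]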

3) Find a set of 4 linearly independent vectors in the vector space L(R squared, R squared)

I think {3x+1, 4x, 8x^2 - x, 12x^3 - x^2} is an answer.
What do you think L(R², R²) means? It is the set of linear functions from R² to R². (Which makes me very concerned about your statement "I don't really know what a linear function is"! Obviously your teacher thinks you should!)

Your examples are functions from R to R, not R² to R², and three of the four are not even linear. You need functions that take a vector <x, y> to another vector, something like f(<x, y>) = <f₁(x, y), f₂(x, y)>, and, again, the functions must be linear. Are you at all familiar with representing linear functions as matrices? L(R², R²) can be represented as the set of all 2 by 2 matrices. For example, the function f(<x, y>) = <2x + 3y, 4x + 5y> would be represented by the matrix $\begin{bmatrix} 2 & 3 \\ 4 & 5 \end{bmatrix}$. So you need to find 4 such matrices, each corresponding to a linear function, that are linearly independent.
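If it helps, here is a small NumPy sketch (my own example; the four matrices are just one possible choice) that checks that four 2 by 2 matrices are linearly independent by flattening each one into a vector in R⁴ and computing the rank:

[code]
import numpy as np

# Four elements of L(R^2, R^2), written as 2x2 matrices
# (these are the standard "matrix units"; other choices work too).
E = [np.array([[1, 0], [0, 0]]),
     np.array([[0, 1], [0, 0]]),
     np.array([[0, 0], [1, 0]]),
     np.array([[0, 0], [0, 1]])]

# Flatten each matrix to a vector in R^4 and stack the results as rows.
M = np.vstack([A.flatten() for A in E])

# Rank 4 means c1*E1 + c2*E2 + c3*E3 + c4*E4 = 0 forces c1 = c2 = c3 = c4 = 0,
# so the four matrices are linearly independent.
print(np.linalg.matrix_rank(M))  # prints 4
[/code]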
 

1. What is the definition of linear independence?

A set of vectors in a vector space is linearly independent if no vector in the set can be written as a linear combination of the other vectors. Equivalently, the only way to combine the vectors to produce the zero vector is to take every coefficient equal to zero.

2. How do you determine if a set of vectors is linearly independent?

To determine whether a set of vectors is linearly independent, set a linear combination of the vectors equal to the zero vector and treat the coefficients as unknowns. This gives a homogeneous system of linear equations; if its only solution is the trivial one (all coefficients equal to 0), the vectors are linearly independent, and otherwise they are linearly dependent.
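As a concrete illustration, here is a short NumPy sketch with made-up vectors; the rank of the matrix whose columns are the vectors tells you whether the homogeneous system has only the trivial solution:

[code]
import numpy as np

# Example vectors in R^3 (chosen for illustration).
v1 = np.array([1, 0, 2])
v2 = np.array([0, 1, 1])
v3 = np.array([2, 1, 5])   # v3 = 2*v1 + v2, so this set is dependent

A = np.column_stack([v1, v2, v3])

# A c = 0 has only the trivial solution exactly when rank(A) equals
# the number of vectors, i.e. when the columns are linearly independent.
print(np.linalg.matrix_rank(A) == A.shape[1])  # prints False: dependent
[/code]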

3. What is the relationship between linear independence and linear functions?

Linear independence is important in the study of linear functions because it helps determine whether a set of vectors can form a basis for a vector space. A basis is a set of linearly independent vectors that span the entire vector space, and it is necessary for representing any vector in the space as a linear combination of the basis vectors.
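For instance, in $\mathbb{R}^2$ the vectors $(1, 0)$ and $(0, 1)$ form a basis: they are linearly independent, and every vector can be written as $(x, y) = x(1, 0) + y(0, 1)$.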

4. Can a set of linearly dependent vectors form a basis for a vector space?

No, a set of linearly dependent vectors cannot form a basis for a vector space. This is because a basis must be a set of linearly independent vectors that span the entire vector space. If a set of vectors is linearly dependent, it means that at least one vector in the set can be written as a linear combination of the other vectors, making it redundant in the basis.

5. How does linear independence relate to the dimension of a vector space?

The dimension of a vector space equals the number of vectors in any basis for that space, which is the maximum number of linearly independent vectors the space contains. A space that contains many linearly independent vectors therefore has a correspondingly high dimension. Conversely, if a set of vectors is linearly dependent, some of its vectors are redundant and can be expressed as linear combinations of the others, so the set spans a subspace of dimension smaller than the number of vectors in the set.
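To make this concrete, here is a brief NumPy sketch (with arbitrary example vectors): in R², which has dimension 2, any three vectors are dependent, while two independent vectors form a basis that expresses every vector uniquely:

[code]
import numpy as np

# R^2 has dimension 2, so any three vectors in it must be linearly dependent.
vectors = [np.array([1.0, 2.0]),
           np.array([3.0, -1.0]),
           np.array([4.0, 1.0])]

A = np.column_stack(vectors)
print(np.linalg.matrix_rank(A))   # prints 2: the three vectors are dependent

# Two independent vectors form a basis: every vector of R^2 is a unique
# linear combination of them.
B = np.column_stack(vectors[:2])
target = np.array([5.0, 0.0])
coords = np.linalg.solve(B, target)       # coordinates of target in this basis
print(np.allclose(B @ coords, target))    # prints True
[/code]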
