Are the Functions Linearly Independent Based on the Matrix of Their Outputs?

In summary, the conversation discusses a question from a linear algebra book about proving that three functions are linearly independent whenever the matrix of their values at three points has linearly independent rows. The key step is that a linear relation among the functions, evaluated at those points, gives the same relation among the rows of the matrix. There is also some confusion in the thread about what is meant by "basis vectors" and by a "matrix" being independent.
  • #1
Alban1806
There's a question in Charles Curtis's linear algebra book which states:
Let ##f_1, f_2, f_3## be functions in ##\mathscr{F}(\mathbb{R})##.
a. For a set of real numbers ##x_1, x_2, x_3##, let ##(f_i(x_j))## be the ##3 \times 3## matrix
whose ##(i,j)## entry is ##f_i(x_j)##, for ##1 \leq i, j \leq 3##. Prove that ##f_1, f_2, f_3## are linearly independent if the rows of the matrix ##(f_i(x_j))## are linearly independent.

Obviously, if ##f_1, f_2, f_3## are in terms of basis vectors, then they are linearly independent.
But can I say that if the matrix ##A = (f_i(x_j))## is linearly independent, then it can be put in echelon form?
Therefore, I can row reduce the matrix to a diagonal matrix such that ##a_{i,i} \neq 0##.
Since the rows are linearly independent, ##f_i \neq 0## for ##1 \leq i \leq 3##, and therefore ##\alpha_i f_i = 0## only if ##\alpha_i = 0##.

Is this a good proof for that question?
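For concreteness, here is a minimal numerical sketch of the setup (the choice ##f_1 = 1##, ##f_2 = x##, ##f_3 = x^2## and the points ##0, 1, 2## are my own, not from the book), showing how the matrix ##(f_i(x_j))## is built and how its rows can be tested for independence:

```python
import numpy as np

# Illustrative choices, not from the book: f1(x) = 1, f2(x) = x, f3(x) = x^2
fs = [lambda x: 1.0, lambda x: x, lambda x: x**2]
xs = [0.0, 1.0, 2.0]   # x1, x2, x3

# A[i][j] = f_i(x_j)
A = np.array([[f(x) for x in xs] for f in fs])
print(A)

# The rows are linearly independent iff the square matrix has full rank
# (equivalently, a nonzero determinant).
print("rank =", np.linalg.matrix_rank(A))   # 3   -> rows independent
print("det  =", np.linalg.det(A))           # 2.0 -> nonzero, rows independent
```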
 
  • #2
Alban1806 said:
There's a question in Charles Curtis's linear algebra book which states:
Let ##f_1, f_2, f_3## be functions in ##\mathscr{F}(\mathbb{R})##.
a. For a set of real numbers ##x_1, x_2, x_3##, let ##(f_i(x_j))## be the ##3 \times 3## matrix
whose ##(i,j)## entry is ##f_i(x_j)##, for ##1 \leq i, j \leq 3##. Prove that ##f_1, f_2, f_3## are linearly independent if the rows of the matrix ##(f_i(x_j))## are linearly independent.

Obviously, if ##f_1, f_2, f_3## are in terms of basis vectors, then they are linearly independent.
I'm not sure what you mean by that. If three vectors are basis vectors, then they are independent by definition of "basis". On the other hand, again by definition of "basis", any vector can be written as a linear combination of basis vectors.

But can I say that if the matrix ##A = (f_i(x_j))## is linearly independent, then it can be put in echelon form?
What do you mean by the "matrix A" being independent? That its rows are independent vectors?

Therefore, I can row reduce the matrix to a diagonal matrix such that ##a_{i,i} \neq 0##.
Since the rows are linearly independent, ##f_i \neq 0## for ##1 \leq i \leq 3##, and therefore ##\alpha_i f_i = 0## only if ##\alpha_i = 0##.

Is this a good proof for that question?
Saying that ##f_1, f_2, f_3## are linearly independent means that ##\alpha f_1 + \beta f_2 + \gamma f_3 = 0## only when ##\alpha = \beta = \gamma = 0##. So suppose ##\alpha f_1 + \beta f_2 + \gamma f_3 = 0## as a function; this means ##\alpha f_1(x) + \beta f_2(x) + \gamma f_3(x) = 0## for all ##x##. Now take ##x## equal to ##x_1, x_2, x_3## in turn: the three resulting equations say exactly that ##\alpha, \beta, \gamma## give a linear relation among the rows of the matrix ##(f_i(x_j))##, and since those rows are linearly independent, ##\alpha = \beta = \gamma = 0##.
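A small numerical illustration of that last step (the functions and sample points are my own illustrative choices, the same ones as in the sketch above):

```python
import numpy as np

# Same illustrative choices as above: f1 = 1, f2 = x, f3 = x^2 at x = 0, 1, 2.
A = np.array([[1.0, 1.0, 1.0],
              [0.0, 1.0, 2.0],
              [0.0, 1.0, 4.0]])

# A relation alpha*f1 + beta*f2 + gamma*f3 = 0, evaluated at x1, x2, x3,
# reads (alpha, beta, gamma) @ A = (0, 0, 0).  Because the rows of A are
# independent, A is invertible and the only solution is the zero vector.
assert np.linalg.matrix_rank(A) == 3
coeffs = np.linalg.solve(A.T, np.zeros(3))   # solve A^T c = 0
print(coeffs)   # [0. 0. 0.]  ->  alpha = beta = gamma = 0
```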
 
  • #3
From what I understand of the question, the matrix ##(f_i(x_j))## is a ##3 \times 3## matrix whose rows are linearly independent.
That means we are given that ##\alpha (f_1(x_1), f_1(x_2), f_1(x_3)) + \beta (f_2(x_1), f_2(x_2), f_2(x_3)) + \gamma (f_3(x_1), f_3(x_2), f_3(x_3)) = (0, 0, 0)## implies ##\alpha = \beta = \gamma = 0##.
I believe I have to show that ##f_1, f_2, f_3## are linearly independent as functions.

To clarify: saying the rows are linearly independent is therefore not the same thing as saying the ##f_i##'s are linearly independent.

But I hope my interpretation of the question is correct.
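For what it's worth, the implication only goes one way: independent rows force independent functions, but independent functions can still give dependent rows for an unlucky choice of points. A small sketch (the functions and points are my own example, not from the thread):

```python
import numpy as np

# sin, cos, and the constant function 1 are linearly independent in F(R),
# yet an unlucky choice of sample points makes the rows dependent.
fs = [np.sin, np.cos, lambda x: 1.0]

bad_points  = [0.0, np.pi, 2 * np.pi]   # sin vanishes at all three points
good_points = [0.0, 1.0, 2.0]

for xs in (bad_points, good_points):
    A = np.array([[f(x) for x in xs] for f in fs])
    print(xs, "-> rank", np.linalg.matrix_rank(A))
# rank 2 for the bad points (rows dependent), rank 3 for the good points.
# Dependent rows tell us nothing about the functions; independent rows
# force the functions to be independent (the content of the exercise).
```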
 

1. What is linear independence?

Linear independence refers to the property of a set of vectors where none of the vectors can be expressed as a linear combination of the others. In other words, no vector in the set is redundant and each vector contributes a unique direction or span to the set.
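For instance (a quick example of my own, not from the thread): in the plane, ##(2, 4)## is just ##2 \cdot (1, 2)##, so the pair ##\{(1,2), (2,4)\}## is dependent, while ##(1, 0)## and ##(0, 1)## each contribute a genuinely new direction.

```python
import numpy as np

# (2, 4) = 2 * (1, 2): one vector is a linear combination of the other,
# so the pair is linearly dependent.
dependent = np.array([[1.0, 2.0],
                      [2.0, 4.0]])

# Neither of (1, 0), (0, 1) can be written in terms of the other.
independent = np.array([[1.0, 0.0],
                        [0.0, 1.0]])

print(np.linalg.matrix_rank(dependent))    # 1 -> redundancy, dependent
print(np.linalg.matrix_rank(independent))  # 2 -> no redundancy, independent
```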

2. Why is linear independence important?

Linear independence is important because it allows us to define a unique basis for a vector space, which is crucial for solving systems of linear equations and performing other operations in linear algebra. It also helps us identify which vectors are essential in a set and which ones can be removed without changing the overall span.

3. How do you determine linear independence?

To determine linear independence, we can use the definition and check if any vector in the set can be expressed as a linear combination of the others. Alternatively, we can use the determinant method for square matrices, where we create a matrix with the vectors as columns and calculate the determinant. If the determinant is non-zero, the vectors are linearly independent.
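A minimal sketch of the determinant check just described (the vectors are my own example):

```python
import numpy as np

# Put the vectors being tested in the columns of a square matrix.
independent_cols = np.array([[1.0, 0.0, 1.0],
                             [0.0, 1.0, 1.0],
                             [0.0, 0.0, 1.0]])

dependent_cols = np.array([[1.0, 2.0, 3.0],   # third column = first + second
                           [0.0, 1.0, 1.0],
                           [1.0, 0.0, 1.0]])

print(np.linalg.det(independent_cols))  # 1.0 -> nonzero, columns independent
print(np.linalg.det(dependent_cols))    # 0.0 (up to rounding) -> dependent
```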

4. What is the difference between linear independence and linear dependence?

Linear independence and linear dependence are opposite concepts. As mentioned earlier, linear independence means that no vector in a set can be expressed as a linear combination of the others. Linear dependence, on the other hand, means that at least one vector in a set can be expressed as a linear combination of the others. In other words, linear dependence indicates that there is redundancy in the set.

5. How does linear independence relate to the rank of a matrix?

The rank of a matrix is the number of linearly independent columns or rows in that matrix. Therefore, the rank of a matrix is closely related to the concept of linear independence. If the rank of a matrix is equal to the number of columns (or rows), then the columns (or rows) are linearly independent. If the rank is less than the number of columns (or rows), then there is linear dependence in the matrix.
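For example (again my own illustration): with three rows in ##\mathbb{R}^3## where the third row is the sum of the first two, the rank drops below the number of rows, signalling dependence.

```python
import numpy as np

# The third row equals the sum of the first two, so the rows are dependent
# and the rank (2) is less than the number of rows (3).
M = np.array([[1.0, 0.0, 2.0],
              [0.0, 1.0, 1.0],
              [1.0, 1.0, 3.0]])

print(np.linalg.matrix_rank(M))      # 2 < 3 -> rows linearly dependent
print(np.linalg.matrix_rank(M[:2]))  # 2 == number of rows -> independent
```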
