Are the Functions Linearly Independent Based on the Matrix of Their Outputs?

SUMMARY

The discussion centers on the linear independence of functions \( f_1, f_2, f_3 \) in \( \mathscr{F}(\mathbb{R}) \), studied via the \( 3 \times 3 \) matrix \( (f_{i}(x_{j})) \) of their values at real numbers \( x_1, x_2, x_3 \). It is established that if the rows of this matrix are linearly independent, then the functions themselves are linearly independent: evaluating a relation \( \alpha_1 f_1 + \alpha_2 f_2 + \alpha_3 f_3 = 0 \) at \( x_1, x_2, x_3 \) produces a linear dependence among the rows, so row independence forces \( \alpha_1 = \alpha_2 = \alpha_3 = 0 \).

PREREQUISITES
  • Understanding of linear algebra concepts, specifically linear independence.
  • Familiarity with matrix operations, including row reduction and echelon form.
  • Knowledge of function spaces, particularly \( \mathscr{F}(\mathbb{R}) \), the space of real-valued functions on \( \mathbb{R} \).
  • Ability to interpret and manipulate linear combinations of functions.
NEXT STEPS
  • Study the properties of linear independence in function spaces.
  • Learn about matrix row reduction techniques and their implications for linear systems.
  • Explore the concept of basis vectors and their role in defining linear independence.
  • Investigate applications of linear independence in differential equations and functional analysis.
USEFUL FOR

Students and professionals in mathematics, particularly those studying linear algebra, functional analysis, or anyone involved in theoretical aspects of vector spaces and their properties.

Alban1806
There's a question in Charles Curtis's linear algebra book which states:
Let ##f_1, f_2, f_3## be functions in ##\mathscr{F}(\mathbb{R})##.
a. For a set of real numbers ##x_{1}, x_{2}, x_{3}##, let ##(f_{i}(x_{j}))## be the ##3 \times 3## matrix
whose ##(i,j)## entry is ##f_{i}(x_{j})##, for ##1 \leq i,j \leq 3##. Prove that ##f_{1}, f_{2}, f_{3}## are linearly independent if the rows of the matrix ##(f_{i}(x_{j}))## are linearly independent.

Obviously if ##f_1, f_2, f_3## are in terms of basis vectors then they are linearly independent.
But can I say that if the matrix ##A = (f_{i}(x_{j}))## is linearly independent, then it is in echelon form?
Therefore, I can row reduce the matrix to a diagonal matrix such that ##a_{i,i} \neq 0##.
Since the rows are linearly independent, ##f_{i} \neq 0## for ##1 \leq i \leq 3##, and therefore
##\alpha_{i} f_{i} = 0## only if ##\alpha_{i} = 0##.

Is this a good proof for that question?
 
Alban1806 said:
There's a question in Charles Curtis's linear algebra book which states:
Let ##f_1, f_2, f_3## be functions in ##\mathscr{F}(\mathbb{R})##.
a. For a set of real numbers ##x_{1}, x_{2}, x_{3}##, let ##(f_{i}(x_{j}))## be the ##3 \times 3## matrix
whose ##(i,j)## entry is ##f_{i}(x_{j})##, for ##1 \leq i,j \leq 3##. Prove that ##f_{1}, f_{2}, f_{3}## are linearly independent if the rows of the matrix ##(f_{i}(x_{j}))## are linearly independent.

Obviously if ##f_1, f_2, f_3## are in terms of basis vectors then they are linearly independent.
I'm not sure what you mean by that. If three vectors are basis vectors then they are independent by definition of "basis". On the other hand, again by definition of "basis", any vectors can be written as a linear combination of basis vectors.

But can I say that if the matrix ##A = (f_{i}(x_{j}))## is linearly independent, then it is in echelon form?
What do you mean by the "matrix A" being independent? That its rows are independent vectors?

Therefore, I can row reduce the matrix to a diagonal matrix such that ##a_{i,i} \neq 0##.
Since the rows are linearly independent, ##f_{i} \neq 0## for ##1 \leq i \leq 3##, and therefore
##\alpha_{i} f_{i} = 0## only if ##\alpha_{i} = 0##.

Is this a good proof for that question?
To show ##f_1, f_2, f_3## are linearly independent, suppose ##\alpha f_1 + \beta f_2 + \gamma f_3 = 0##. As an equation between functions, this means ##\alpha f_1(x) + \beta f_2(x) + \gamma f_3(x) = 0## for all ##x##. So take ##x## equal to ##x_1, x_2, x_3## in turn.
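This evaluation step can be illustrated numerically: evaluating any relation ##\alpha f_1 + \beta f_2 + \gamma f_3 = 0## at the three points says exactly that ##(\alpha, \beta, \gamma)## is a dependence among the rows of ##(f_{i}(x_{j}))##, so full row rank forces the trivial relation. A minimal sketch in plain Python (the sample functions ##\sin, \cos, \exp## and points are my choice, not from the thread):

```python
import math

def matrix_rank(rows, tol=1e-12):
    """Numerical rank via Gaussian elimination with partial pivoting."""
    m = [list(r) for r in rows]
    n_rows, n_cols = len(m), len(m[0])
    rank = 0
    for col in range(n_cols):
        if rank == n_rows:
            break
        # Pick the largest remaining entry in this column as the pivot.
        pivot = max(range(rank, n_rows), key=lambda r: abs(m[r][col]))
        if abs(m[pivot][col]) < tol:
            continue  # column is (numerically) zero below the pivot row
        m[rank], m[pivot] = m[pivot], m[rank]
        # Eliminate the column entries below the pivot.
        for r in range(rank + 1, n_rows):
            factor = m[r][col] / m[rank][col]
            for c in range(col, n_cols):
                m[r][c] -= factor * m[rank][c]
        rank += 1
    return rank

fs = [math.sin, math.cos, math.exp]   # example functions (my choice)
xs = [0.1, 0.5, 0.9]                  # example sample points (my choice)
A = [[f(x) for x in xs] for f in fs]  # (i, j) entry is f_i(x_j)
print(matrix_rank(A))  # 3: rows independent, so sin, cos, exp are independent
```

Since the rank is 3, the rows are linearly independent, and by the argument in the thread the three functions are linearly independent as elements of ##\mathscr{F}(\mathbb{R})##.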
 
From what I understand from the question, the matrix ##(f_{i}(x_{j}))## is a ##3 \times 3## matrix whose rows are linearly independent.
That means we are given that ##\alpha f_{1} + \beta f_{2} + \gamma f_{3} = 0## implies ##\alpha = \beta = \gamma = 0##, where the combination is of the rows of values.
I believe I have to show that the ##f_{i}## themselves are linearly independent.

To clarify: saying the rows are linearly independent is not the same thing as saying the ##f_{i}##'s are linearly independent.

But I hope my interpretation of the question is correct.
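The distinction drawn above matters because the implication only goes one way: independent rows force independent functions, but independent functions can still produce dependent rows at an unlucky choice of points. A quick sketch (the example functions and points are my choice, not from the thread):

```python
# x, x^3, x^5 are linearly independent in F(R) (distinct-degree
# polynomials), yet at the sample points -1, 0, 1 every row of
# (f_i(x_j)) is identical, so the rows are linearly dependent.
fs = [lambda x: x, lambda x: x**3, lambda x: x**5]
xs = [-1, 0, 1]
A = [[f(x) for x in xs] for f in fs]
print(A)  # [[-1, 0, 1], [-1, 0, 1], [-1, 0, 1]]
assert A[0] == A[1] == A[2]  # rank 1: rows dependent, functions not
```

So the hypothesis in part (a) is genuinely about the rows, and the converse statement would be false.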
 