Linear Independence of Subsets: Necessary and Sufficient Conditions

radou
Let ##V## be a vector space over a field ##F##, ##v_{1}, \ldots, v_{n} \in V## and ##\alpha_{1}, \ldots, \alpha_{n} \in F##. Further, let the set ##\left\{v_{1}, \ldots, v_{n}\right\}## be linearly independent, and let ##b## be the vector defined by ##b=\sum_{i=1}^n \alpha_{i}v_{i}##. One has to find necessary and sufficient conditions on the scalars ##\alpha_{1}, \ldots, \alpha_{n}## such that the set ##S=\left\{b, v_{2}, \ldots, v_{n}\right\}## is linearly independent, too.
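To fix ideas, here is a small concrete instance (my own illustration, not part of the original problem): take ##V = F^{2}##, ##n = 2##, ##v_{1} = (1,0)## and ##v_{2} = (0,1)##, so that ##b = (\alpha_{1}, \alpha_{2})##. If ##\alpha_{1} = 0##, then ##b = \alpha_{2} v_{2}## and ##S = \{b, v_{2}\}## is dependent; if ##\alpha_{1} \neq 0##, then ##\beta_{1} b + \beta_{2} v_{2} = (\beta_{1}\alpha_{1},\ \beta_{1}\alpha_{2} + \beta_{2}) = 0## forces ##\beta_{1} = \beta_{2} = 0##, so ##S## is independent.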

Well, I just used a simple proposition which states that a set is linearly dependent if at least one vector from that set can be written as a linear combination of the remaining vectors of the same set. So, obviously, for ##\alpha_{1} = 0## the set ##S## is dependent, which makes ##\alpha_{1} \neq 0## a necessary condition for ##S## to be independent. Further, a sufficient condition would be ##\alpha_{1} = \cdots = \alpha_{n} = 0##, which leaves us with the set ##S \setminus \{b\} = \{v_{2}, \ldots, v_{n}\}##. This set must be linearly independent, since it is a subset of ##\{v_{1}, \ldots, v_{n}\}##, which we know is linearly independent.

I may be boring, but I'm just checking whether my reasoning is all right. :smile:

Edit: I just realized that for ##\alpha_{1} = \cdots = \alpha_{n} = 0## we have ##b = 0##, which makes the set dependent! So ##\alpha_{1} \neq 0## is a necessary condition, but what is the sufficient condition? Is it that at least one of the scalars ##\alpha_{2}, \ldots, \alpha_{n}## must be nonzero?
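A sketch of the necessity direction, written out in full (this just expands the observation above): suppose ##\alpha_{1} = 0##. Then
$$b = \sum_{i=2}^{n} \alpha_{i} v_{i},$$
so ##b - \alpha_{2}v_{2} - \cdots - \alpha_{n}v_{n} = 0## is a nontrivial linear relation among the vectors ##b, v_{2}, \ldots, v_{n}## (the coefficient of ##b## is ##1 \neq 0##). Hence ##S## is linearly dependent whenever ##\alpha_{1} = 0##, and ##\alpha_{1} \neq 0## is indeed necessary.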
 
The sufficient condition certainly includes ##\alpha_{1} \neq 0##.

Basically, if a condition is necessary and sufficient, it means that the statement is true iff the condition is true. So search for a set of conditions that hold iff the set ##\{b, v_{2}, \ldots, v_{n}\}## is linearly independent.

Note that if all the alphas except ##\alpha_1## are zero (with ##\alpha_1 \neq 0##), then ##b## is just a nonzero multiple of ##v_1##, and the new set is certainly linearly independent.
 
Yes, I realized that. So it seems ##\alpha_{1} \neq 0## is both a necessary and a sufficient condition, since the rest of the scalars can be any elements of ##F##: all zero, all nonzero, or a mix.
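For completeness, here is a sketch of the sufficiency direction (a standard argument, written in my own words): suppose ##\alpha_{1} \neq 0## and
$$\beta_{1} b + \beta_{2} v_{2} + \cdots + \beta_{n} v_{n} = 0$$
for some ##\beta_{1}, \ldots, \beta_{n} \in F##. Substituting ##b = \sum_{i=1}^{n} \alpha_{i} v_{i}## gives
$$\beta_{1}\alpha_{1} v_{1} + \sum_{i=2}^{n} \left(\beta_{1}\alpha_{i} + \beta_{i}\right) v_{i} = 0.$$
Since ##\{v_{1}, \ldots, v_{n}\}## is linearly independent, every coefficient vanishes: ##\beta_{1}\alpha_{1} = 0##, hence ##\beta_{1} = 0## because ##\alpha_{1} \neq 0##, and then ##\beta_{i} = 0## for ##i \geq 2##. So only the trivial relation exists and ##S## is linearly independent, confirming that ##\alpha_{1} \neq 0## is both necessary and sufficient.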
 