B Vector Space Prob: Show Linear Dependence

Raghav Gupta
Show that a set of vectors is linearly dependent if and only if any one of the vectors can be represented as a linear combination of the remaining vectors.

I don't know these terms. I know what vectors are, but not the other terms. Can someone provide some information, in any form, to help solve this question?
 
Any vector A can be written as a sum of basis vectors weighted by components, as in ##A = \sum_{i=1}^n a_i \vec e _i##. In common practice, you will see this written as ##A = a_x \hat x + a_y \hat y + a_z \hat z## for 3D space.
A linearly dependent set of vectors I only know by the definition you posted. However, a linearly independent set of k vectors can be defined by the fact that it spans a k-dimensional space.
A linear combination of vectors ##\{ \vec v_i \} _{i=1}^N## is any sum of the form ## \vec L = \sum_{i=1}^N x_i \vec v _i## for some choice of coefficients ##x_i##, ##i = 1, \dots, N##.

Linear independence is defined by there being only one set of ##x_i##'s that makes ##\vec L = 0##, namely all of the ##x_i##'s equal to zero.
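As a concrete illustration (my own example, not from the original problem): take ##\vec v_1 = (1, 0)##, ##\vec v_2 = (0, 1)##, and ##\vec v_3 = (1, 1)## in 2D. Then
$$1 \cdot \vec v_1 + 1 \cdot \vec v_2 - 1 \cdot \vec v_3 = \vec 0,$$
a combination with coefficients that are not all zero, so the set is linearly dependent. And indeed ##\vec v_3 = \vec v_1 + \vec v_2## is a linear combination of the other two.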

For this problem, you are to show that if a set of vectors is linearly dependent, then at least one of the vectors can be written as a linear combination of the others; and conversely, that if one of the vectors can be written as a linear combination of the others, then the set of vectors is linearly dependent.

You will need to know what the question is assuming you know as the definition of linearly dependent vectors.

If a set of N vectors is linearly dependent, then it spans at most N-1 dimensions...you could use this to show that one of them must be a linear combination of the other vectors.
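One way to sketch the forward direction, assuming the ##\vec L = 0## definition above: if ##\sum_{i=1}^N x_i \vec v_i = \vec 0## with not all ##x_i## zero, pick an index ##j## with ##x_j \neq 0## and solve for that vector:
$$\vec v_j = -\sum_{i \neq j} \frac{x_i}{x_j} \, \vec v_i,$$
so ##\vec v_j## is a linear combination of the rest. For the converse, if ##\vec v_j = \sum_{i \neq j} c_i \vec v_i##, then moving everything to one side gives ##\sum_{i \neq j} c_i \vec v_i - \vec v_j = \vec 0##, which is a nontrivial combination because the coefficient of ##\vec v_j## is ##-1 \neq 0##.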
 
Raghav Gupta said:
Show that a set of vectors is linearly dependent if and only if any one of the vectors can be represented as a linear combination of the remaining vectors.

I don't know these terms. I know what vectors are, but not the other terms. Can someone provide some information, in any form, to help solve this question?

Start by looking up the terms.
 
micromass said:
Start by looking up the terms.
Yes. Every linear algebra textbook defines the terms "linearly dependent," "linearly independent," and "linear combination."
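Not part of the original question, but once you have the definitions, a quick numerical sanity check of the equivalence can be done with NumPy (the vectors here are my own illustration; dependence is equivalent to the matrix of column vectors having rank smaller than the number of vectors):

```python
import numpy as np

# Three vectors in 2D space -- any three vectors in a 2-D space must be dependent.
v1 = np.array([1.0, 0.0])
v2 = np.array([0.0, 1.0])
v3 = np.array([1.0, 1.0])
V = np.column_stack([v1, v2, v3])  # 2x3 matrix whose columns are the vectors

# Linear dependence <=> rank of the matrix is less than the number of vectors.
rank = np.linalg.matrix_rank(V)
print(rank < V.shape[1])  # True: the set is linearly dependent

# Express v3 as a combination of v1 and v2 by solving the linear system.
coeffs, _, _, _ = np.linalg.lstsq(np.column_stack([v1, v2]), v3, rcond=None)
print(np.allclose(coeffs @ np.column_stack([v1, v2]).T, v3))  # True: v3 = 1*v1 + 1*v2
```

This mirrors the hint above: the three vectors span only a 2-dimensional space, so one of them must be a combination of the others.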
 