Any vector ##\vec A## can be written as a sum of basis vectors scaled by components, as in ##\vec A = \sum_{i=1}^n a_i \vec e_i##. In common practice, you will see it written as ##\vec A = a_x \hat x + a_y \hat y + a_z \hat z## for 3D space.
As for a linearly dependent set of vectors, I only know it by the definition you posted. However, a linearly independent set of ##k## vectors can also be characterized by the fact that it spans a ##k##-dimensional space.
A linear combination of the vectors ##\{ \vec v_i \}_{i=1}^N## is any expression of the form ##\vec L = \sum_{i=1}^N x_i \vec v_i## for some choice of coefficients ##x_i,\ i = 1, \dots, N##.
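For instance (a made-up example with arbitrary coefficients), for three vectors ##\vec v_1, \vec v_2, \vec v_3## one linear combination is
$$\vec L = 2\vec v_1 - \vec v_2 + \tfrac{1}{3}\vec v_3,$$
i.e. the coefficients here are ##x_1 = 2##, ##x_2 = -1##, ##x_3 = \tfrac{1}{3}##.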
Linear independence is defined by there being only one set of ##x_i##'s that makes ##\vec L = \vec 0##, namely all of the ##x_i##'s equal to zero.
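Written out (matching the summation form above), that definition says
$$\sum_{i=1}^N x_i \vec v_i = \vec 0 \quad\Longrightarrow\quad x_1 = x_2 = \dots = x_N = 0.$$
A linearly dependent set is just the negation of this: some choice of ##x_i##'s, not all zero, gives the zero vector.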
For this problem, you are to show that if a set of vectors is linearly dependent, then at least one of the vectors can be written as a linear combination of the others; and conversely, that if one of the vectors can be written as a linear combination of the others, then the set of vectors is linearly dependent.
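Here is a sketch of the algebraic step for one direction, assuming the summation definition above: if the set is dependent, there are coefficients ##x_i##, not all zero, with ##\sum_{i=1}^N x_i \vec v_i = \vec 0##. Pick an index ##k## with ##x_k \neq 0## and solve for ##\vec v_k##:
$$\vec v_k = -\frac{1}{x_k}\sum_{i \neq k} x_i \vec v_i.$$
The other direction works the same way in reverse: move the combined vector to one side and note that its coefficient is ##-1 \neq 0##.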
You will need to know which definition of linearly dependent vectors the question assumes you are working from.
If a set of ##N## vectors is linearly dependent, then it can span at most ##N-1## dimensions...you could use this to show that at least one of the vectors must be a linear combination of the others.
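As a concrete (made-up) example in 3D: the set ##\{\hat x,\ \hat y,\ \hat x + \hat y\}## has ##N = 3## vectors but spans only the 2-dimensional ##xy##-plane, and indeed
$$1\cdot\hat x + 1\cdot\hat y + (-1)\cdot(\hat x + \hat y) = \vec 0,$$
so the third vector is a linear combination of the first two.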