Linear Dependence Check: sin(pi x) on [-1, 1]

AngeloG
Check for linear dependence: ##\sin \pi x## on ##[-1, 1]##.

I'm thinking it's linearly dependent, since it says that any linear combination must be 0:

a*x + b*y = 0, with a = b = 0.

So for any integer x, ##\sin \pi x## is 0, so [-1, 1] works.
 
"Linear dependence" (or independence) applies to a set of vectors. Certainly we can think of the collection of functions on [-1, 1] as a vector space, but ##\sin \pi x## is still a single function!

Also, linear dependence of a set of vectors does NOT mean "any linear combination must be 0". It means only that there exists at least one linear combination equal to 0 other than the one where all coefficients are 0. In order for two functions, f and g, to be dependent, there must exist a and b, not both 0, such that af(x) + bg(x) = 0 for all x.

But still, what set of functions are you talking about? A single non-zero function (vector) is always independent.
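A quick numerical sanity check (plain Python, not part of the original thread) of the point above: ##\sin \pi x## does vanish at every integer, but dependence would require a combination vanishing at every x in [-1, 1], and ##\sin \pi x## is plainly not the zero function there:

```python
import math

# sin(pi * x) at integer x: always (numerically) 0.
f = lambda x: math.sin(math.pi * x)
print(f(-1.0), f(0.0), f(1.0))   # all ~0 (up to floating-point rounding)

# But dependence needs a*f(x) = 0 for ALL x in [-1, 1] with a != 0,
# and sin(pi * x) is not the zero function on that interval:
print(f(0.5))                    # 1.0, so {sin(pi x)} is an independent set
```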
 
The question is:

Check the linear dependency of the functions ##\sin(\pi x)##.
 
You said "functions" but there's only one non-zero function. What kind of values can x take?
 
Err, it was part of:

##1, \cos(\pi x), \sin(\pi x)##.

Those are the functions. 1 is linearly independent; I'm not sure about ##\cos(\pi x)## and ##\sin(\pi x)##.
 
If you don't even know enough to quote the problem correctly, then I strongly recommend you review what "dependent" and "independent" mean! Once again, a single non-zero function (vector) is always "independent"! It makes no sense at all to say "1 is linearly independent", and, again, the problem is NOT asking about the "dependence" or "independence" of each of those three functions separately. It is asking, as I suggested before, about the dependence or independence of the set of those three functions.

Now, how does your textbook define "dependent functions" or "dependent vectors"?
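For what it's worth, here is one way (a numpy sketch, not taken from the thread) to probe the set ##\{1, \cos(\pi x), \sin(\pi x)\}##: sample the three functions at a few distinct points of [-1, 1] and look at the rank of the sample matrix. Full rank proves independence, since any dependence relation would force the rank below 3; a deficient rank would only be suggestive and would still need an algebraic argument.

```python
import numpy as np

# Sample 1, cos(pi x), sin(pi x) at seven distinct points of [-1, 1].
x = np.linspace(-1.0, 1.0, 7)
A = np.column_stack([np.ones_like(x),
                     np.cos(np.pi * x),
                     np.sin(np.pi * x)])

# If a*1 + b*cos(pi x) + c*sin(pi x) = 0 held for all x, the columns of A
# would satisfy the same relation at the sample points, and the rank
# would drop below 3.
print(np.linalg.matrix_rank(A))   # 3, so the set is linearly independent
```

The same conclusion follows by hand: evaluating a + b cos(πx) + c sin(πx) = 0 at x = 0, 1/2, 1 gives a + b = 0, a + c = 0, a - b = 0, which forces a = b = c = 0.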
 