I would agree with you that part (ii) of the proposition is incorrect (unless the matrices act on vectors from the right rather than the left).
If you look at the bijection section of
http://en.wikipedia.org/wiki/Bijection,_injection_and_surjection
it reads: "If g o f is a bijection, then it can only be concluded...
I think the easiest way to do it is to consider the two E's as a single letter, so your words are built from D I S C R EE T - thus you have 7 distinct "letters".
As for the ordering between the first and last letter, you can ignore this, calculate all possible words, and then divide by 2 to...
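A quick sanity check of the gluing argument (a sketch in Python, assuming the words in question must keep the two E's together):

```python
from itertools import permutations
from math import factorial

# Distinct words formed from DISCREET in which the two E's are adjacent:
# glue them into a single block "EE", leaving 7 distinct "letters", hence 7! words.
words = {w for w in (''.join(p) for p in permutations("DISCREET")) if "EE" in w}
print(len(words) == factorial(7))  # True
```

Since DISCREET has only two E's, a word contains "EE" exactly when those two E's are adjacent, which is why the filter above matches the gluing count.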
I know this is slightly off topic, but I remember an interesting talk in which it was stated that there are infinitely many primes, p, such that there is a prime, q, with the property
p < q < p + 16.
They proved this in an attempt to prove the twin prime conjecture.
If na = 0 (mod m) then na = km for some k. To find all a, consider d = gcd(n,m): if we let a = m/d * l for l any positive integer, then a is guaranteed to be an integer as d divides m, and as d divides n we must have na = m * (n*l/d), which implies na is an integer multiple of m.
I assume that a =...
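The claim above can be checked numerically; here is a small sketch (the function name and the sample values n = 6, m = 15 are my own choices):

```python
from math import gcd

def zero_multipliers(n, m):
    """All a in [0, m) with n*a = 0 (mod m)."""
    return [a for a in range(m) if (n * a) % m == 0]

# With d = gcd(n, m), the solutions are exactly the multiples of m // d.
n, m = 6, 15
d = gcd(n, m)
sols = zero_multipliers(n, m)
print(sols)                                      # [0, 5, 10]
print(sols == [(m // d) * l for l in range(d)])  # True
```

Here d = gcd(6, 15) = 3, so the solutions modulo 15 are precisely the multiples of 15/3 = 5.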
I was under the impression that the sum
S_{p} = \sum_{n=1}^{\infty} \frac{1}{n^{p}}
was only convergent for p greater than one. But if you take the analytic continuation of the function then you obtain the result
[tex] S_{-1} = \frac{-1}{12}. [/tex]
So I think it reduces to a question...
Yes it is true.
Say F maps from A to B then:
1) As F is onto every element in B corresponds to at least one element in A i.e. for all b \in B there is at least one a \in A such that f(a) = b .
2) As F is 1-1, every element in B can correspond to at most one element in A, i.e. f(a) = f(a')...
The only thing you will need is
f \circ g = g \circ f \hspace{0.3cm} \Rightarrow f(g(v)) = g(f(v)).
If the linear maps are matrices, A and B, then \circ is just the usual matrix multiplication which gives
AB = BA.
This is exactly what you had. I don't think that there is anything else...
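A quick numerical sanity check of this correspondence (a NumPy sketch with a hypothetical pair of matrices; B is chosen scalar so the two maps commute):

```python
import numpy as np

# Two linear maps on R^2 represented as matrices (hypothetical example).
A = np.array([[1.0, 2.0],
              [0.0, 1.0]])
B = np.array([[3.0, 0.0],
              [0.0, 3.0]])   # a scalar matrix commutes with everything

v = np.array([1.0, -1.0])

# f(g(v)) and g(f(v)) agree exactly when AB = BA.
print(np.allclose(A @ (B @ v), B @ (A @ v)))  # True
print(np.allclose(A @ B, B @ A))              # True
```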
I hope this answers in part your questions:
If you have two linear maps
f: V \rightarrow W \mbox{ and } g: V' \rightarrow W'
and you want the two maps to commute, then what you are saying is that you want the composition of maps to satisfy
f \circ g = g \circ f \hspace{0.3cm} \Rightarrow...
Ok. I interpreted what you said as: multiplication by a group element is injective and surjective (hence invertible), thus proving 1 must lie in the set you defined, rather than you proving it is injective and hence invertible.
Matt provides a nice statement which is conceptually useful in terms of groups, but it doesn't answer the actual question. It assumes that the binary operation (along with the associated set) satisfies the group axioms, in which case it must be that every element x has an inverse.
It appeared to...
The point I was making was that although, through the rearranging, you can get the right answer in the end, in the middle you made statements which were not entirely true. This does not matter significantly if you are just looking for a sketch of the proof or aren't required to be completely...
If you are referring to a system of coupled ODEs, then 'strong' eigenvalues correspond to the dominant eigenvectors. When I say dominant I mean the ones the system follows as time tends to infinity: the trajectory approaches a straight line along the dominant eigenvector.
Suppose you have two coupled 1st order ODE's...
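A minimal sketch of this behaviour (the matrix A and initial condition below are my own hypothetical example, a symmetric system with eigenvalues 3 and 1):

```python
import numpy as np

# Hypothetical coupled system x' = A x with a dominant eigenvalue.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])   # eigenvalues 3 and 1, eigenvectors (1,1), (1,-1)

vals, vecs = np.linalg.eig(A)
dominant = vecs[:, np.argmax(vals)]

# Propagate an arbitrary initial condition: x(t) = V e^{Dt} V^{-1} x(0).
x0 = np.array([1.0, 0.0])
t = 10.0
xt = vecs @ np.diag(np.exp(vals * t)) @ np.linalg.inv(vecs) @ x0

# For large t the direction of x(t) aligns with the dominant eigenvector.
direction = xt / np.linalg.norm(xt)
print(np.allclose(abs(direction @ dominant), 1.0, atol=1e-6))  # True
```

The component along the weaker eigenvector decays relative to the dominant one by a factor e^{(1-3)t}, which is already negligible at t = 10.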
To go from parametric form
x = \sin(t) \mbox{ and } y = \cos(t),
to the form
x^{2} + y^{2} = 1
you do not use the rearrangement as you have given. What you do is consider
x^{2} + y^{2} = \sin^{2}(t) + \cos^{2}(t) = 1.
The last step uses a trigonometric identity.
The...
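The identity step can be confirmed symbolically; a quick SymPy sketch:

```python
import sympy as sp

t = sp.symbols('t')
x = sp.sin(t)
y = sp.cos(t)

# Eliminate the parameter via the Pythagorean identity, not by solving for t.
print(sp.simplify(x**2 + y**2))  # 1
```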
I think the reasoning goes something like this:
Consider bases for some R-module, say A,
\{x_{i}\}_{i=1}^{n} \mbox{ and } \{y_{i}\}_{i=1}^{\infty}
Everything in A can be written as a FINITE linear combination of elements from either basis. That means there is some natural number m with the...
I don't think that your last bit is true in general... I think that
\int_a^b{ f'(x) e^{f(x)}}dx=\left [e^{f(x)} \right ]^b_a
is what you mean, and you have divided through. This is only correct when f'(x) is constant... you might have known this, I just found it confusing the way you...
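The correct definite form can be checked symbolically; here is a SymPy sketch (the choice f = x^2 is mine, but any differentiable f works):

```python
import sympy as sp

x = sp.symbols('x')
f = x**2   # hypothetical choice of f

# By the substitution u = f(x):  int_a^b f'(x) e^{f(x)} dx = e^{f(b)} - e^{f(a)}.
lhs = sp.integrate(sp.diff(f, x) * sp.exp(f), (x, 0, 1))
rhs = sp.exp(f.subs(x, 1)) - sp.exp(f.subs(x, 0))
print(sp.simplify(lhs - rhs))  # 0
```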
I am trying to follow your approach and the f's confuse me: the problem I have is that
f_{k}(1) = \sum_{r=1}^{\infty} a_{k,r} \geq 1,
but by your recursion relation we have
\frac{f_{k}(1)}{f_{k-1}(1)} = \frac{n-k+1}{1-k}.
The right hand side must be positive while the left must be...
I will have a guess and I think it goes something like this....
We start with
\int_{- \infty}^{\infty}{e^{-x^2}} dx = \sqrt{\pi}
We consider
\int_{- \infty}^{\infty}{e^{-tx^2}} dx
= \int_{- \infty}^{\infty}{e^{-u^2}} \, \frac{d(t^{-1/2}u)}{du} \, du
= \int_{- \infty}^{\infty}{e^{-u^2}}...
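The substitution u = \sqrt{t}\,x pulls out a factor t^{-1/2}, giving \int e^{-tx^2} dx = \sqrt{\pi/t}. A numerical check of this scaling (a SciPy sketch; the value t = 3 is an arbitrary choice of mine):

```python
import numpy as np
from scipy.integrate import quad

# int_{-inf}^{inf} e^{-t x^2} dx should equal sqrt(pi / t).
t = 3.0
numeric, _ = quad(lambda x: np.exp(-t * x**2), -np.inf, np.inf)
print(np.isclose(numeric, np.sqrt(np.pi / t)))  # True
```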
I guess it depends on what your area is... I am not that deeply involved in the theory of modules and representations, so I find it useful to deal with 'nice' representations to get something concrete. For example, generating representations of the braid group.
Can you tell me of any simple...
The reason I consider irreducible representations preferable is that, in the papers I have read, it is generally easier to classify the irreducible reps than the indecomposable ones. The preference is merely given to those which are easier to deal with. This is me coming from a more constructive...
3. Generally I have found that the main preference is given to irreducible representations over those which are not (this includes indecomposable ones).
To add a little more for the construction of reps, there are those where the algebra is considered its own module. Also if you have a coproduct then it is...
Consider
C_{4} = <c|c^{4}=e> \mbox{ and } C_{2}\times C_{2} = <a,b|a^{2}=b^{2}=e,ab=ba>
These are clearly not isomorphic as the first has an element of order 4 while the latter does not.
We can choose the subgroups
H = <c^{2}> \mbox{ and } H' = <a>
Thus H and H' are isomorphic...
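The order argument can be verified computationally; here is a sketch modelling C_4 as Z/4 under addition and C_2 x C_2 as pairs of Z/2 entries (both representations are my own encoding choices):

```python
from itertools import product

def order(g, op, e):
    """Order of g under binary operation op with identity e."""
    k, h = 1, g
    while h != e:
        h, k = op(h, g), k + 1
    return k

# C4 as Z/4 under addition; C2 x C2 as componentwise Z/2 addition.
c4_orders = sorted(order(g, lambda a, b: (a + b) % 4, 0) for g in range(4))
klein = list(product(range(2), repeat=2))
klein_op = lambda a, b: ((a[0] + b[0]) % 2, (a[1] + b[1]) % 2)
klein_orders = sorted(order(g, klein_op, (0, 0)) for g in klein)

print(c4_orders)     # [1, 2, 4, 4] -> has an element of order 4
print(klein_orders)  # [1, 2, 2, 2] -> no element of order 4
```

Since element orders are preserved by isomorphisms, the differing order profiles confirm the two groups are not isomorphic.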
Hey Matt,
I forgot to say Abelian... I am guessing you realised this by your next comment.
I was under the impression that any finite Abelian group is expressible as the direct product of a finite number of cyclic groups whose orders are prime powers.
I think that if p and q are...
I don't think the groups G and G' are required to be isomorphic. You have already defined G so consider
G' = <a,b|a^{p}=b^{q}=e>
where e is the identity. This should be a suitable counterexample - try working through it.
If I am wrong about it being a counter example then this should...
I know this thread is pretty much dead but...
The reason that <c>, where c has order mn/d (where d = gcd(m,n)), is the smallest group containing isomorphic copies of <a> and <b> (where |a|=n, |b|=m) is that the order of c is precisely the lowest common multiple of n and m.
Is the question what is the smallest group containing a and b (in which case the Cartesian product) or the smallest group that contains isomorphic copies of A and B (in which case I think it is what I wrote before)?
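For the cyclic case, the lcm claim is easy to sanity-check (a small sketch; the function name and sample orders n = 4, m = 6 are mine):

```python
from math import gcd

# |c| = lcm(n, m) = n*m // gcd(n, m) is the smallest order of a cyclic group
# containing elements of orders n and m (isomorphic copies of <a> and <b>),
# since a cyclic group of order N has an element of order k iff k divides N.
def smallest_cyclic_order(n, m):
    return n * m // gcd(n, m)

n, m = 4, 6
N = smallest_cyclic_order(n, m)
print(N, N % n == 0, N % m == 0)  # 12 True True
```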
If the dimension of the space is less than two then the only subspaces are V and {0}, as yyat pointed out. Hence your question is answered in this case.
If the dimension of the space is greater or equal to two then consider spaces X and Y generated by linearly independent vectors x and y. x+y...