
#1
Dec16-09, 02:43 AM

P: 439

This is not homework; I got this question from the internet.
Prove: http://physicsforums.com/attachment....1&d=1260952985 



#2
Dec16-09, 07:05 AM

P: 418

We have:
[tex]\sum_{d \mid n} \frac{n}{d}\tau(d) = \sum_{d \mid n} d\,\tau(n/d)[/tex]
There is a bijective correspondence between divisors of n/d and multiples of d that divide n. Thus [itex]\tau(n/d)[/itex] equals the number of integers k such that [itex]d \mid k[/itex] and [itex]k \mid n[/itex]. Now [itex]\sigma(k)[/itex] equals the sum of all divisors of k, so expanding the left-hand side [itex]\sum_{k \mid n} \sigma(k)[/itex], a fixed divisor d of n contributes one term of d for each k with [itex]d \mid k[/itex] and [itex]k \mid n[/itex], i.e. [itex]\tau(n/d)[/itex] terms in total, which shows the equality.

To establish this through a somewhat more formal argument, define:
[tex]S = \{(d,k) \,:\, d \mid k,\ k \mid n\}[/tex]
For a finite subset T of [itex]\mathbb{Z}^2[/itex] define:
[tex]w(T) = \sum_{(d,k) \in T} d[/tex]
This has the nice property that if [itex]A,B \subseteq \mathbb{Z}^2[/itex] are finite, then:
[tex]w(A \cup B) = w(A) + w(B) - w(A \cap B)[/tex]
We wish to show that both sides of the equality equal w(S). To do this we decompose S in two ways. First define:
[tex]S_d = \{(d,k) \,:\, d \mid k,\ k \mid n\}[/tex]
Then [itex]|S_d| = \tau(n/d)[/itex], so [itex]w(S_d) = d\,\tau(n/d)[/itex]. Since the sets [itex]S_d[/itex] for [itex]d \mid n[/itex] are disjoint and their union is S, we can then show:
[tex]w(S) = \sum_{d \mid n} w(S_d) = \sum_{d \mid n} d\,\tau(n/d)[/tex]
This deals with the right-hand side. On the other hand, let:
[tex]T_k = \{(d,k) \,:\, d \mid k,\ k \mid n\}[/tex]
Then [itex]w(T_k) = \sum_{d \mid k} d = \sigma(k)[/itex], which shows:
[tex]w(S) = \sum_{d \mid n} w(T_d) = \sum_{d \mid n} \sigma(d)[/tex] 
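Not part of the proof, but here is a quick brute-force check of the identity in Python for small n (the helper names `divisors`, `sigma`, `tau` are just for illustration):

```python
def divisors(n):
    # all positive divisors of n
    return [d for d in range(1, n + 1) if n % d == 0]

def sigma(n):
    # sum of the divisors of n
    return sum(divisors(n))

def tau(n):
    # number of divisors of n
    return len(divisors(n))

# verify sum_{d|n} sigma(d) == sum_{d|n} (n/d) * tau(d) for n = 1..200
for n in range(1, 201):
    lhs = sum(sigma(d) for d in divisors(n))
    rhs = sum((n // d) * tau(d) for d in divisors(n))
    assert lhs == rhs, n
print("identity verified for n = 1..200")
```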



#3
Dec16-09, 08:15 AM

P: 439

Hi rasmhop!
I am new to number-theoretic functions, and I don't understand what you mean by a bijective correspondence, or your set notation. I was given a hint to start by proving the case of [tex]p^{k}[/tex] instead of n, where p is a prime number. Could you explain at a slightly more elementary level? Thanks in advance (= 



#4
Dec16-09, 09:02 AM

P: 418

number theory function
Ok. I'm not sure at exactly what level you want the explanation, but let me try a somewhat more elementary approach using your hint. I use some manipulations of sums, but I don't know how familiar you are with this stuff; if there is something you're not clear on, just ask. Anyway, the general idea is to first prove the case of prime powers, and then prove that both the left-hand side and the right-hand side are multiplicative.
Let us consider the case [itex]n=p^k[/itex] for some prime p and positive integer k. We have:
[tex]\sigma(p^c) = 1+p+p^2+\cdots+p^c[/tex]
so for our left-hand side we have:
[tex]\begin{align*} \sum_{d \mid p^k} \sigma(d) &= \sum_{i=0}^k \sigma(p^i) \\ &= \sum_{i=0}^k \sum_{j=0}^i p^j \end{align*}[/tex]
It's clear that [itex]p^j[/itex] occurs in this sum once for every [itex]i \geq j[/itex], i.e. once for each element of [itex]\{j,j+1,\ldots,k\}[/itex], which has [itex]k-j+1[/itex] elements, so:
[tex]\sum_{d \mid p^k} \sigma(d) = \sum_{j=0}^k (k-j+1)p^j[/tex]
On the other hand, we have [itex]\tau(p^i) = i+1[/itex], which gives:
[tex]\begin{align*} \sum_{d \mid n} \frac{n}{d}\tau(d) &= \sum_{i=0}^k \frac{p^k}{p^i} \tau(p^i) \\ &= \sum_{i=0}^k p^{k-i}(i+1)\\ &= \sum_{i=0}^k p^{i}(k-i+1) \\ &= \sum_{d \mid p^k} \sigma(d) \end{align*}[/tex]
which settles the case when n is a prime power.

Now we need to use this knowledge to handle the general case. Assume a and b are coprime and that the result has been proven for both a and b (for instance if a and b are distinct prime powers), and let us show that the result also holds for ab; the general result then follows by induction. We know that [itex]\tau[/itex] and [itex]\sigma[/itex] are multiplicative, so [itex]\tau(ab) = \tau(a)\tau(b)[/itex] and [itex]\sigma(ab)=\sigma(a)\sigma(b)[/itex] for coprime a and b. If [itex]d_1 \mid a[/itex] and [itex]d_2 \mid b[/itex], then [itex]d_1 d_2 \mid ab[/itex]. Conversely, if [itex]d \mid ab[/itex], then there exist unique positive integers [itex]d_1,d_2[/itex] such that [itex]d_1 \mid a[/itex], [itex]d_2 \mid b[/itex] and [itex]d = d_1 d_2[/itex] (since a and b are coprime). 
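To make the prime-power computation concrete, here is a small Python check (p = 3, k = 5 are arbitrary example values) that the sum of [itex]\sigma[/itex] over the divisors of [itex]p^k[/itex], the closed form [itex]\sum_{j=0}^k (k-j+1)p^j[/itex], and the right-hand side all agree:

```python
def sigma(n):
    # sum of the divisors of n, by trial division
    return sum(d for d in range(1, n + 1) if n % d == 0)

p, k = 3, 5          # example prime power n = p^k
n = p ** k

# left-hand side: sigma summed over the divisors p^0, ..., p^k
lhs = sum(sigma(p ** i) for i in range(k + 1))

# closed form derived above: sum_{j=0}^k (k - j + 1) p^j
closed = sum((k - j + 1) * p ** j for j in range(k + 1))

# right-hand side: sum_{i=0}^k p^{k-i} (i + 1), using tau(p^i) = i + 1
rhs = sum(p ** (k - i) * (i + 1) for i in range(k + 1))

assert lhs == closed == rhs
print(lhs)  # all three agree
```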
Thus we have:
[tex]\sum_{d \mid ab} \frac{ab}{d}\tau(d) = \sum_{d_1 \mid a,\ d_2 \mid b} \frac{ab}{d_1 d_2}\tau(d_1 d_2)[/tex]
[tex]\sum_{d \mid ab}\sigma(d) = \sum_{d_1 \mid a,\ d_2 \mid b} \sigma(d_1 d_2)[/tex]
Now we use this to prove that both the left-hand side and the right-hand side are multiplicative:
[tex]\begin{align*} \sum_{d \mid ab}\sigma(d) &= \sum_{d_1 \mid a,\ d_2 \mid b} \sigma(d_1 d_2) \\ &= \sum_{d_1 \mid a,\ d_2 \mid b} \sigma(d_1)\sigma(d_2) \\ &= \sum_{d_1 \mid a}\sum_{d_2 \mid b}\sigma(d_1)\sigma(d_2) \\ &= \sum_{d_1 \mid a}\sigma(d_1)\sum_{d_2 \mid b}\sigma(d_2) \\ &= \left(\sum_{d_1 \mid a}\sigma(d_1)\right)\left(\sum_{d_2 \mid b}\sigma(d_2)\right) \end{align*}[/tex]
And:
[tex]\begin{align*} \sum_{d \mid ab}\frac{ab}{d}\tau(d) &= \sum_{d_1 \mid a,\ d_2 \mid b}\frac{ab}{d_1 d_2}\tau(d_1 d_2) \\ &= \sum_{d_1 \mid a,\ d_2 \mid b}\frac{a}{d_1}\tau(d_1)\frac{b}{d_2}\tau(d_2) \\ &= \sum_{d_1 \mid a}\left(\frac{a}{d_1}\tau(d_1)\sum_{d_2 \mid b}\frac{b}{d_2}\tau(d_2)\right) \\ &= \left(\sum_{d_1 \mid a}\frac{a}{d_1}\tau(d_1)\right)\left(\sum_{d_2 \mid b}\frac{b}{d_2}\tau(d_2)\right) \end{align*}[/tex]
Now, since we assumed the result was true for a and b, we have:
[tex]\sum_{d \mid ab}\frac{ab}{d}\tau(d) = \left(\sum_{d_1 \mid a}\frac{a}{d_1}\tau(d_1)\right)\left(\sum_{d_2 \mid b}\frac{b}{d_2}\tau(d_2)\right) = \left(\sum_{d_1 \mid a}\sigma(d_1)\right)\left(\sum_{d_2 \mid b}\sigma(d_2)\right) = \sum_{d \mid ab}\sigma(d)[/tex] 
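The multiplicativity step above can also be checked numerically. A small Python sketch (the names `F` and `G` for the two sides are just for illustration) verifies that both sides of the identity are multiplicative on coprime arguments:

```python
from math import gcd

def divisors(n):
    # all positive divisors of n
    return [d for d in range(1, n + 1) if n % d == 0]

def F(n):
    # left-hand side: sum over d|n of sigma(d)
    return sum(sum(divisors(d)) for d in divisors(n))

def G(n):
    # right-hand side: sum over d|n of (n/d) * tau(d)
    return sum((n // d) * len(divisors(d)) for d in divisors(n))

# for coprime a, b we expect F(ab) = F(a)F(b) and G(ab) = G(a)G(b)
for a in range(1, 30):
    for b in range(1, 30):
        if gcd(a, b) == 1:
            assert F(a * b) == F(a) * F(b), (a, b)
            assert G(a * b) == G(a) * G(b), (a, b)
print("both sides multiplicative for coprime a, b < 30")
```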

