Can 10^n + 1 Be Expressed as a*a*c?

AI Thread Summary
The discussion centers on proving that 10^n + 1 cannot be expressed as a*a*c, where n, a, and c are positive integers. Initial analysis involves examining the remainders of 10^n + 1 when divided by 3, leading to conclusions about the forms of a and c. Several cases are explored, revealing that certain configurations lead to contradictions regarding remainders. A further investigation into the prime factorization of 10^n + 1 suggests that its factors occur only once, supporting the assertion that it cannot be expressed in the desired form. The thread concludes that if a and c are strictly less than 10^n + 1, a more straightforward proof is possible by demonstrating that the derived expressions cannot yield integer values.
Pythagorean12

Homework Statement


Prove that 10^n + 1 cannot be expressed in the form a*a*c, where n, a, and c are positive integers.


Homework Equations


By considering the remainder of 10^n + 1 when it is divided by 3, I arrived at the conclusion that:

a*a = 3z + 1;
c = 3v + 2

for some positive integers z,v.

Here I am stuck; I have no idea how to proceed.


The Attempt at a Solution

 
Pythagorean12 said:
By considering the remainder of 10^n + 1 when it is divided by 3, I arrived at the conclusion that:
a*a = 3z + 1;
c = 3v + 2
for some positive integers z,v.
How did you arrive at this conclusion? You don't show what you got when you divided 10^n + 1 by 3.
 
Let's consider 3 possible cases:

1) a = 3k. This case is impossible: since 10 leaves remainder 1 when divided by 3, so does 10^n, and hence the LHS of 10^n + 1 = a*a*c always leaves remainder 2 when divided by 3; but with a = 3k the RHS leaves remainder 0.

2) a = 3k + 1. Then a*a = 9k*k + 6k + 1, so the remainder when a*a is divided by 3 is 1.

3) a = 3k + 2. Then a*a = 9k*k + 12k + 4 = 3(3k*k + 4k + 1) + 1, so the remainder when a*a is divided by 3 is again 1.

Hence, a*a can be expressed as a*a = 3z + 1.

Now let's consider 3 possible cases for c:

1) c = 3k. This case is impossible for the same reason as the case a = 3k above.

2) c = 3k + 1. Then:

a*a*c = (3z + 1)(3k + 1) = 9kz + 3z + 3k + 1. Dividing this by 3 gives remainder 1, while dividing 10^n + 1 by 3 gives remainder 2 - a contradiction, so this case is impossible.

3) c = 3k + 2. Then:

a*a*c = (3z + 1)(3k + 2) = 9kz + 6z + 3k + 2. Dividing this by 3 gives remainder 2, and dividing 10^n + 1 by 3 also gives remainder 2, so the remainders agree.

Hence, c = 3k + 2.
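
A quick numeric spot-check of this case analysis (a minimal Python sketch, not a proof - it only samples small values of n, a and c):

Code:
# Spot-check of the mod-3 bookkeeping above (samples small values only).
for n in range(1, 20):
    assert (10**n + 1) % 3 == 2          # LHS always leaves remainder 2

for a in range(1, 200):
    assert (a * a) % 3 in (0, 1)         # a square never leaves remainder 2

# If a*a*c leaves remainder 2, then 3 does not divide a and c = 3v + 2:
for a in range(1, 50):
    for c in range(1, 50):
        if (a * a * c) % 3 == 2:
            assert a % 3 != 0 and c % 3 == 2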



(The real problem I want to solve is formulated like this:

The repeat of a natural number is obtained by writing it twice in a row (for example, the repeat of 356 is 356356). Is there any number whose repeat is a perfect square?

Let's denote the n-digit number by b. Then we have:

b*10^n + b = c*c;
b(10^n + 1) = c*c
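
A one-line Python helper mirroring this algebra (a minimal sketch; the name repeat is just illustrative):

Code:
# The repeat of an n-digit number b is b*10^n + b = b*(10^n + 1).
def repeat(b: int) -> int:
    n = len(str(b))
    return b * (10**n + 1)

assert repeat(356) == 356356   # the example from the problem statement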

I have made a little investigation of 10^n + 1:

n   10^n + 1        factorization
1   11              prime
2   101             prime
3   1001            7*11*13
4   10001           73*137
5   100001          11*9091
6   1000001         101*9901
7   10000001        11*909091
8   100000001       17*5882353
9   1000000001      7*11*13*19*52579
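
This table can be reproduced mechanically (a sketch that assumes sympy is installed; extending the range past n = 9 would be a useful stress test of the conjecture below):

Code:
# Reproduce the factor table and check whether each 10^n + 1 is squarefree.
from sympy import factorint

for n in range(1, 10):
    N = 10**n + 1
    factors = factorint(N)          # {prime: exponent}
    squarefree = all(e == 1 for e in factors.values())
    print(n, N, factors, "squarefree" if squarefree else "NOT squarefree")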

From this, we notice that each prime factor of 10^n + 1 occurs only once. Hence, I have come to the following sketch of a proof that no number whose repeat is a perfect square exists:

1) 10^n + 1 is a prime number.
Then 10^n + 1 divides c*c and hence divides c, so c = e*(10^n + 1) and b = e*e*(10^n + 1) -> b has at least n+1 digits -> contradiction

2) 10^n + 1 is not a prime number

Each prime factor of 10^n + 1 occurs only once (as "deduced" from the examples above).
I try to prove this last statement by contradiction. Let's assume that 10^n + 1 = a*a*c with a > 1 (the factor a*a means that some prime factor occurs more than once). The work for this part was described above.

Hence, if this last unproven statement is true, then again b = e*e*(10^n + 1) -> b has at least n+1 digits -> contradiction (the squarefree step is spelled out below).

)
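
To spell out the squarefree step used in case 2) above (a worked chain, assuming every prime factor of 10^n + 1 occurs exactly once):

b(10^n + 1) = c*c, so every prime p dividing 10^n + 1 divides c*c, and hence divides c.
Since these primes are distinct, their product 10^n + 1 divides c, so c = e*(10^n + 1).
Substituting back: b(10^n + 1) = e*e*(10^n + 1)*(10^n + 1), so b = e*e*(10^n + 1) >= 10^n + 1.
But an n-digit number satisfies b < 10^n -> contradiction.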
 
While this may be an old thread, I stumbled across it and felt I should add some input.

This statement cannot be proven as it is currently written.

Ex. Assign c the value 10^n + 1 and divide the LHS by c. This implies that a^2 = 1, which implies a = 1. Therefore, for every n, 10^n + 1 can be written as a*a*c.

If it was meant that a and c must be strictly less than 10^n + 1, then the statement can be proven much more easily than by the route you were taking.

Start by dividing the LHS by c. Then (10^n + 1)/c must be a perfect square, since a is a positive integer. This implies that sqrt(10^n/c + 1/c) is also an integer. All that's left for you to do is show that for every c < 10^n + 1, sqrt(10^n/c + 1/c) is never an integer value.

Hint: it would suffice to prove that the expression under the square root is itself never an integer for any allowed value of c.
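
A brute-force rendering of this route for the small n tabulated earlier (a minimal Python sketch; it iterates over a rather than c, which is equivalent because c < 10^n + 1 exactly when a > 1 and a*a divides 10^n + 1):

Code:
# Search for a > 1 with a*a dividing 10^n + 1 (equivalently, a divisor
# c < 10^n + 1 for which (10^n + 1)/c is a perfect square).
from math import isqrt

for n in range(1, 10):
    N = 10**n + 1
    for a in range(2, isqrt(N) + 1):
        if N % (a * a) == 0:
            print("found:", N, "=", a, "*", a, "*", N // (a * a))

# Prints nothing for n = 1..9, consistent with the factor table above.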
 