1=0.99999.... true?
by Sabine
Tags: None

#1
Jun4-05, 07:20 AM

P: 41

x= 0.99999....
10x = 9.99999.....
10x = 9 + 0.99999.....
10x - x = 9 + 0.99999..... - 0.99999.....
9x = 9
x = 1 =====> 1 = 0.99999....

Is there something wrong?
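The manipulation above can be sketched in exact arithmetic. This is only an illustration (not from the thread): it approximates 0.99999.... by its finite truncations x_n = 0.9...9 (n nines) and checks that 10x - x = 9x exactly, with the gap between 9x and 9 shrinking like 9/10^n.

```python
from fractions import Fraction

# Finite truncations of 0.999...: x_n = (10^n - 1)/10^n = 0.9...9 (n nines).
for n in (1, 5, 10):
    x = Fraction(10**n - 1, 10**n)
    # The OP's step "10x - x = 9x" holds exactly for every truncation.
    assert 10 * x - x == 9 * x
    # The gap between 9x and 9 is exactly 9/10^n, shrinking toward 0.
    assert 9 - 9 * x == Fraction(9, 10**n)
    print(n, float(9 - 9 * x))  # 0.9, 9e-05, 9e-10
```

In the limit the gap vanishes, which is exactly why 9x = 9 and hence x = 1.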



#2
Jun4-05, 07:44 AM

P: 3

Nope. .999... (infinitely repeating) is indistinguishable from 1.




#3
Jun4-05, 08:34 AM

P: 579

Perhaps, if they feel like being generous, they would accept them as approximations. But in a hardheaded mode of philosophical analysis, such approximations would not pass for sound and conclusive mathematical truths. In fact, leaving philosophers aside, in the real world such propositions would not stand a chance: they would simply be unacceptable in many practical circumstances. For example, take a penny out of ten pounds (£10) and what results is £9.99. Without wandering too far to draw a concrete example, I have personally encountered several instances where the shopkeeper refused to accept a price with a missing penny (i.e. refused to accept £9.99 for a £10 price). Practically, it seems that these shopkeepers refused because of the missing penny (£0.01). Also, if you take a penny out of one million pounds you are no longer a millionaire, for a very obvious practical reason (£1,000,000 - £0.01 = £999,999.99).

IMPORTANT POINT: Mathematicians must keep 'Absolute Truths' separate from 'Natural Approximations'. They are two fundamentally different things. Philosophers, especially those in the above listed disciplines, rigorously enforce this distinction. The world where we can avoid fractions and the usual associated vagueness is currently a distant dream, if not a complete impossibility. I have suggested elsewhere on this PF the need to start looking at the "MATHEMATICS OF 'THE PERFECT FIT'" that governs a 'paraplexed world or universe'.



#4
Jun4-05, 08:35 AM

P: 212

1=0.99999.... true? 



#5
Jun4-05, 09:10 AM

P: 579

In Logic, the standard assumption handed down to us from Aristotle is that everything is self-identical, or is identical to itself. My argument is that even in mathematics, the proposition 1 = 0.999... fundamentally violates Aristotle's First Law of Identity. In the Aristotelian logical system, 1 = 1 would suffice because it respects completely (that is, in a non-approximate way) Aristotle's First Law of Identity.

NOTE: Notice that here I am not trying to play down all the noises made in mathematics. I am only stating how such propositions would be confronted in various philosophical disciplines. As I have warned, ABSOLUTE TRUTH-VALUES should be kept separate from APPROXIMATE TRUTH-VALUES, regardless of which quantitative discipline we are in.



#6
Jun4-05, 09:24 AM

Emeritus
Sci Advisor
PF Gold
P: 16,101

Well, this isn't the philosophy forum, so...
The usual fallacy most people suffer from when objecting to 1 = 0.999... is that they are confusing a mathematical object with the notation used for that object. And frankly, it baffles me why people have such trouble working through this confusion: they have no problem with 10/15 and 14/21 being different notations for the same object.
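The 10/15 versus 14/21 point can be made concrete with exact rational arithmetic (a Python illustration added for this writeup, not part of the original post): two different notations normalize to one and the same object.

```python
from fractions import Fraction

# Two notations, one mathematical object: both reduce to 2/3.
a = Fraction(10, 15)
b = Fraction(14, 21)
print(a, b)    # 2/3 2/3
assert a == b  # equal as objects, despite the different notations
```

In just the same way, "1" and "0.999..." are two notations for the same real number.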



#7
Jun4-05, 11:07 AM

Sci Advisor
P: 905





#8
Jun4-05, 11:50 AM

Mentor
P: 7,292





#9
Jun4-05, 11:56 AM

Sci Advisor
HW Helper
P: 9,428

perhaps philo thought the dots were an ellipsis, as if we had said 1 = .9999, and then wandered off mentally for a while.
it might have been better then to write 1 = (.99999....). but i doubt it will catch on. 


#10
Jun4-05, 12:25 PM

P: n/a

Is " 0.999... " notationally rigorous? What if I use [tex]0.99\Bar{9}[/tex] instead?




#11
Jun4-05, 02:23 PM

Sci Advisor
HW Helper
P: 9,428

i think you are right that .999... is used also for other numbers than 1, but your notation is not.



#12
Jun4-05, 02:44 PM

P: n/a

In response to the OP, I think his conclusion is valid, but I am used to a little more in a proof. Since I didn't see any correction to what I said in a previous thread on the subject, I still think that assuming an infinitely repeating decimal can be immediately compared to a real number is a small jump in logic (although we do it all the time with no mental anguish).




#13
Jun4-05, 03:51 PM

Sci Advisor
HW Helper
P: 9,428

to assign a real number to an infinite decimal, one needs a definition of what that real number is, and a proof it exists as defined. the simplest definition (for positive ones) is the "smallest real number not smaller than any of the finite decimal truncations".

this is entirely equivalent to, but more fundamental than, the infinite series definition, since the series definition depends on this one for its proof of convergence. thus it is to me a bit silly to use the infinite series definition in this simple context. i.e. rigorously explained, the infinite series definition and its justification go like this:

1) define the infinite decimal .a1 a2 a3 ... an ... as the "sum" of the series a1/10 + a2/(10)^2 + ... + an/(10)^n + ...

2) to prove that series has a sum, we define the sum as the limit of the sequence of partial sums, and then we must prove that sequence has a limit.

3) then we state the theorem that a bounded monotone increasing sequence has a limit and prove it by invoking the lub axiom of the reals, the proof being that the limit is the smallest number not smaller than any element of the sequence (sound familiar?). finally,

4) the sequence of partial sums is merely the sequence of finite decimal truncations, hence the limit of the sequence of partial sums of the infinite series determined by the decimal is indeed the "smallest real number not smaller than any of the finite decimal truncations". i.e. we are back where we started.

thus one might as well, or better, make that the definition in the first place, especially with beginning classes who do not know what an infinite series is. [hopefully no one will maintain that the concept of infinite series is more basic than that of a real number.]

now the proof that 1 is the smallest number not smaller than any finite decimal of the form .9999...9 is the same in spirit as the proof of the sum of the geometric series, only easier. i.e. the difference 1 - .99999....9 (n terms) is 1/(10)^n = .00....001 (n-1 zeroes), which is eventually smaller than any given finite decimal, hence no number smaller than 1 can be as large as all these finite decimals.

i say again that anyone who does not understand this cannot possibly have understood calculus as any more than a basket of formulas.



#14
Jun4-05, 07:50 PM

P: 26



