PeroK said:
##0.3-0.2 = 0.1##
I don't care what the computer thinks it's doing instead. It gives the wrong answer.
No, it doesn't give the wrong answer. You are telling the computer to do the wrong thing. The computer is doing what you told it to do. It's up to you to understand what you are telling it to do.
If you write ##0.3 - 0.2## in Python, without using the Decimal method described earlier, you are telling the computer to do floating point arithmetic, as @DrClaude described in post #16. If that's not what you want to do, then you need to tell the computer to do what you want to do, namely the exact decimal subtraction 0.3 - 0.2, as @DrClaude also described in post #16.
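As a concrete illustration (my own quick sketch, not the exact code from post #16), here is what the two operations look like in a Python session:

```python
# What Python's default floating point arithmetic actually produces,
# compared with the exact decimal arithmetic from the decimal module.
from decimal import Decimal

# Default behaviour: 0.3 and 0.2 are binary floats, so the difference
# is not exactly 0.1.
print(0.3 - 0.2)         # 0.09999999999999998
print(0.3 - 0.2 == 0.1)  # False

# Exact decimal arithmetic: build the Decimals from strings so the
# operands really are 0.3 and 0.2.
print(Decimal("0.3") - Decimal("0.2"))                    # 0.1
print(Decimal("0.3") - Decimal("0.2") == Decimal("0.1"))  # True

# Passing the float 0.3 itself to Decimal shows what the computer was
# actually told to start from:
print(Decimal(0.3))
# 0.299999999999999988897769753748434595763683319091796875
```

Note that Decimal("0.3") and Decimal(0.3) are different things: the former is exactly three tenths, the latter is the binary float that the literal 0.3 rounds to.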
You could complain that you think Python should default to decimal subtraction instead of floating point subtraction when you write ##0.3 - 0.2##, but the proper venue for that complaint is the Python mailing lists. And unless Python makes that change to its language specification, Python does what it does, and it's up to you to understand the tools you are using. It's not up to the tools to read your mind and change their behavior according to what you were thinking.
PeroK said:
When using a language like Python it pays to know its limitations and faults, but they are limitations and faults of the computer system.
If you think you can design a computer system that doesn't have these limitations and faults, go to it. Python is open source. Nothing is stopping you from producing your own hacked copy of the interpreter in which writing ##0.3 - 0.2## does a decimal subtraction instead of a floating point subtraction. The result won't meet the Python language specification, of course, but whether that's a problem for you depends on what you intend to use your custom interpreter for.
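For what it's worth, you don't even have to hack the interpreter to experiment with that idea. Here's a minimal sketch of my own (the names FloatToDecimal and decimal_eval are invented for the illustration; nothing like this is part of the Python specification or of post #16): it rewrites the float literals in an expression into Decimals before evaluating it.

```python
# A toy "decimal by default" evaluator: parse an expression, replace
# every float literal with a Decimal built from the literal's repr,
# then evaluate the rewritten expression.
import ast
from decimal import Decimal


class FloatToDecimal(ast.NodeTransformer):
    """Rewrite float constants as Decimal(<repr of the float>) calls."""

    def visit_Constant(self, node):
        if isinstance(node.value, float):
            # repr() of a Python 3 float is the shortest string that
            # round-trips, so Decimal('0.3') is recovered from the
            # literal 0.3 rather than from its binary approximation.
            return ast.copy_location(
                ast.Call(
                    func=ast.Name(id="Decimal", ctx=ast.Load()),
                    args=[ast.Constant(value=repr(node.value))],
                    keywords=[],
                ),
                node,
            )
        return node


def decimal_eval(expression: str):
    """Evaluate an expression with its float literals read as decimals."""
    tree = ast.parse(expression, mode="eval")
    tree = ast.fix_missing_locations(FloatToDecimal().visit(tree))
    return eval(compile(tree, "<decimal_eval>", "eval"), {"Decimal": Decimal})


print(decimal_eval("0.3 - 0.2"))  # 0.1
print(0.3 - 0.2)                  # 0.09999999999999998
```

It only covers literals in a single expression, of course; anything that produces floats at run time still follows the floating point rules, which is exactly why the Decimal approach from post #16 is the right tool inside ordinary Python.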