
barsoap@lemm.ee · 3 months ago

> No one in their right mind uses decimals as a formalisation of numbers, or as a representation when doing arithmetic.

> The deeper understanding of numbers in which 0.999… = 1 is obvious requires a foundation of much more advanced math than just decimals.

No. If you can accept that 1/3 is 0.333…, then you can multiply both sides by three and accept that 1 is 0.999… Primary school kids understand that. It’s a bit odd, but a necessary consequence once you restrict your notation from supporting arbitrary divisions to only divisions by powers of ten. And that doesn’t make decimal notation worse than rational notation, or better: it makes it different. Rational notation has its own issues, such as also not having unique forms (2/6 = 1/3) and comparisons (larger/smaller) not being obvious. Various arithmetic operations on rationals are also more complicated.
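
Spelled out (a standard textbook derivation, not part of the original comment), the multiply-by-three argument, together with the equivalent geometric-series view:

```latex
\[
\tfrac{1}{3} = 0.333\ldots
\;\Longrightarrow\;
3 \cdot \tfrac{1}{3} = 3 \cdot 0.333\ldots
\;\Longrightarrow\;
1 = 0.999\ldots
\]
% Equivalently, as a geometric series:
\[
0.999\ldots \;=\; \sum_{n=1}^{\infty} \frac{9}{10^n}
            \;=\; 9 \cdot \frac{1/10}{1 - 1/10} \;=\; 1.
\]
```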

The real takeaway is that, depending on what you do, one is more convenient than the other. And that’s literally all that notation is judged by in maths: is it convenient, or not?
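
As a minimal sketch of that trade-off, using Python’s standard fractions and decimal modules (my illustration, not something the comment itself provides): exact rational arithmetic makes the multiply-by-three step trivial, while any finite decimal truncation necessarily falls short.

```python
from fractions import Fraction
from decimal import Decimal, getcontext

# Rational notation: exact, so the multiply-by-three argument is trivial.
third = Fraction(1, 3)
assert third * 3 == 1                    # 1/3 * 3 == 1 exactly
assert Fraction(2, 6) == Fraction(1, 3)  # non-unique forms normalise away

# Decimal notation: any finite truncation of 0.333... falls short of 1/3,
# which is why only the infinite expansion 0.999... equals 1.
getcontext().prec = 28
truncated = Decimal(1) / Decimal(3)      # 0.333... cut off at 28 digits
assert truncated * 3 != 1                # 0.999...9 (finite) is strictly < 1
print(truncated * 3)                     # 0.9999999999999999999999999999
```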
