Comment on I just cited myself.

pyre@lemmy.world ⁨4⁩ ⁨months⁩ ago

you’re thinking about this backwards: decimal notation isn’t something natural, it’s just a way we invented to represent numbers. 0.333… = 1/3 because that’s how we decided to write 1/3 in decimal. the problem here isn’t that 1 cannot be divided by 3 at all; it’s that no power of 10 is divisible by 3, so the expansion never terminates. because we use the decimal system, we have to write it with infinitely repeating digits, but that doesn’t change the value of 1/3 or 10/3.

different bases don’t change the values either. 12 is divisible by 3, so in base 12 a third needs no infinite digits: 0.4 in base 12 means 4/12, which is exactly 1/3. both 0.333… in decimal and 0.4 in base 12 are the same number.

there’s no need to change the base. we know a third of one is a third, and three thirds is one. how you notate it doesn’t change that at all.
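the point above can be checked with a short sketch: the helper `fractional_digits` below is hypothetical (not from the comment), but it runs the standard digit-by-digit expansion of an exact fraction in any base, showing 1/3 repeating forever in base 10 and terminating in base 12.

```python
from fractions import Fraction

def fractional_digits(frac, base, n):
    """Return the first n fractional digits of frac (0 <= frac < 1) in the given base."""
    digits = []
    for _ in range(n):
        frac *= base          # shift one digit left in this base
        digit = int(frac)     # the integer part is the next digit
        digits.append(digit)
        frac -= digit         # keep only the remaining fractional part
    return digits

third = Fraction(1, 3)
print(fractional_digits(third, 10, 6))  # base 10: [3, 3, 3, 3, 3, 3] — repeats forever
print(fractional_digits(third, 12, 6))  # base 12: [4, 0, 0, 0, 0, 0] — 0.4 exactly
```

the value being expanded is the same exact `Fraction(1, 3)` in both calls; only the notation changes with the base.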
