All Turing-complete models of computation are equivalent in power, so binary or not is irrelevant. Both silicon computers and human brains are Turing-complete; both can compute all computable functions (given enough time and scratch paper).
If non-determinism even exists in the real world (it clashes with cause and effect in a rather fundamental manner), then the architecture of brains, and indeed of life as we know it in general, actively works towards minimising its impact. Copying the genome, for example, has a fairly high error rate at first; then error correction is applied, which brings the error rate down to practically zero; then randomness is reintroduced in strategic places, influenced by environmental factors. When the finch genome sees that an individual is not getting enough food, it throws dice at the beak shape, not at the mitochondrial DNA.
It’s actually quite obvious in AI models: the reason we can quantise them (essentially rounding every weight so the model can run with lower-precision maths, faster and in less memory) is that the architecture is ludicrously resistant to noise, and from the perspective of the weights, rounding every number is equivalent to adding noise. It’s just very conveniently chosen noise.
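A minimal sketch of the idea, assuming simple symmetric int8 quantisation of a single weight matrix with NumPy (the variable names and the per-tensor scaling scheme are illustrative, not any particular library’s implementation):

    import numpy as np

    rng = np.random.default_rng(0)
    # A stand-in for one layer's weights.
    w = rng.normal(0.0, 0.02, size=(256, 256)).astype(np.float32)

    # Quantise: map floats onto 255 integer levels (int8), one scale per tensor.
    scale = np.abs(w).max() / 127.0
    w_q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)

    # Dequantise: the weights the model effectively "sees" at inference time.
    w_deq = w_q.astype(np.float32) * scale

    # The difference is the "conveniently chosen noise": each weight is off by
    # at most half a quantisation step (scale / 2).
    noise = w_deq - w
    print("max |noise|:", np.abs(noise).max(), "<= scale/2 =", scale / 2)
    print("relative error (Frobenius):", np.linalg.norm(noise) / np.linalg.norm(w))

The rounding error per weight is bounded and deterministic, but from the network’s point of view it behaves like a small perturbation of every weight, which a noise-tolerant architecture shrugs off.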