I agree that infinity is an abstraction. What I'm trying to say is that this concept is often abused when it is taken as implicit in real numbers.

"We can only "count" because our physical world is a quantum world. We have units because the basic elements are units, like elementary particles. If the real world were a continuum, there would be no arithmetic."

I don't see it that way. In Euclid's Elements, variables are assigned to segment lengths and other geometric quantities, tying algebra to geometric interpretations. IMO, when mathematics strays away from anything that can be interpreted physically, it leads to confusion and errors.

What I'd like to see is a definition of real numbers that is closer to reality and that lets us encode our knowledge of reality more efficiently. A definition that does not allow abstract limits and infinite precision. The "significant digits" interpretation seems like a step in the right direction to me, since all of our measurements and knowledge are subject to some kind of error bar.

We could, for example, define a set of real numbers such that we always use as many digits as needed so that the quantization error from the limited number of digits is at least a hundred times smaller than the error in the value we are measuring. This way, the error introduced by this real number system would always explain less than 1% of the variance of the measurements based on it.
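A minimal sketch of that rule, assuming Python; the function name `significant_round` and the `ratio` parameter are my own inventions for illustration, not anything standard:

```python
import math

def significant_round(x, sigma, ratio=100):
    """Round x to the fewest significant digits whose worst-case
    quantization error is at least `ratio` times smaller than the
    measurement error sigma (a sketch of the proposal above)."""
    if x == 0:
        return 0.0
    # Worst-case rounding error at d significant digits is half the
    # step size: 0.5 * 10**(exponent - d + 1).
    exponent = math.floor(math.log10(abs(x)))
    d = 1
    while 0.5 * 10 ** (exponent - d + 1) > sigma / ratio:
        d += 1
    step = 10 ** (exponent - d + 1)
    return round(x / step) * step
```

For a value like 3.14159265 measured with an error bar of 0.01, this keeps five significant digits, so the rounding error (well under 0.0001) stays a hundred times smaller than the measurement error.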

This also seems to require that we distinguish mathematics on natural numbers, which represent countable whole items, from mathematics on continuous scales, which would be best represented by a real number system with limited significant digits.

Now, this is just an idea; I'm only an amateur mathematician, but I think it could resolve a lot of issues and paradoxes in mathematics.
