If it’s worth saying, but not worth its own post, here's a place to put it.
If you are new to LessWrong, here's the place to introduce yourself. Personal stories, anecdotes, or just general comments on how you found us and what you hope to get from the site and community are invited. This is also the place to discuss feature requests and other ideas you have for the site, if you don't want to write a full top-level post.
If you're new to the community, you can start reading the Highlights from the Sequences, a collection of posts about the core ideas of LessWrong.
If you want to explore the community more, I recommend reading the Library, checking recent Curated posts, seeing if there are any meetups in your area, and checking out the Getting Started section of the LessWrong FAQ. If you want to orient to the content on the site, you can also check out the Concepts section.
The Open Thread tag is here. The Open Thread sequence is here.
When rereading [0 and 1 Are Not Probabilities], I thought: can we ever specify our amount of information in infinite domains, perhaps with something resembling hyperreals?
(It's worth noting that there are codes that encode any specific rational number as a finite word - for instance, first apply a bijection from the rationals to the natural numbers, then use Fibonacci coding - but in expectation we would need to receive infinitely many bits to learn an arbitrary number.)
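A minimal sketch of that two-step construction (using an injection from $\mathbb{Q}$ into the positive integers rather than a true bijection - an injection is all that finiteness of the code requires, and genuine bijections such as the Calkin-Wilf enumeration exist too; the example value 2/3 is arbitrary):

```python
from fractions import Fraction

def fib_encode(n: int) -> str:
    """Fibonacci (Zeckendorf) code of a positive integer: a finite,
    self-delimiting bit string that always ends in '11'."""
    assert n >= 1
    fibs = [1, 2]                      # Fibonacci numbers used by the code
    while fibs[-1] <= n:
        fibs.append(fibs[-1] + fibs[-2])
    fibs.pop()                         # drop the first Fibonacci number exceeding n
    bits, rest = ["0"] * len(fibs), n
    for i in range(len(fibs) - 1, -1, -1):   # greedy Zeckendorf decomposition
        if fibs[i] <= rest:
            bits[i], rest = "1", rest - fibs[i]
    return "".join(bits) + "1"         # the appended '1' forms the '11' terminator

def rational_to_natural(q: Fraction) -> int:
    """An injection Q -> positive integers built from sign, numerator and
    denominator (Fraction keeps them in lowest terms)."""
    sign = 1 if q < 0 else 0
    return 2**sign * 3**abs(q.numerator) * 5**q.denominator

print(fib_encode(rational_to_natural(Fraction(2, 3))))  # a finite code for 2/3
```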
Since the $\infty$ symbol doesn't behave nicely under addition and subtraction, we might define a symbol $\Lambda_{\mathbb{N}}$ meaning "the amount of information needed to single out one natural number from their full set". Then the uniform prior over $\mathbb{Q}$ would have the form $[0:\dots:0:2^{-\Lambda_{\mathbb{N}}}:2^{-\Lambda_{\mathbb{N}}}:\dots:0]$ (the prefix and suffix standing for values outside the $[0;1]$ segment), while a communication "the number is $k$" would carry $\Lambda_{\mathbb{N}}$ bits of evidence on average, making the posterior $[\dots:0:2^{0}:2^{-\Lambda_{\mathbb{N}}}:2^{-\Lambda_{\mathbb{N}}}:\dots]\sim[\dots:0:1:0:0:\dots]$.
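One way to spell out that update, assuming the bookkeeping convention that $\Lambda_{\mathbb{N}}$ bits of evidence for the value $k$ multiply its odds by $2^{\Lambda_{\mathbb{N}}}$:

$$\underbrace{2^{-\Lambda_{\mathbb{N}}}}_{\text{prior odds of }k}\cdot\underbrace{2^{\Lambda_{\mathbb{N}}}}_{\text{likelihood ratio}}=2^{0}=1,\qquad\text{while every other slot stays at }2^{-\Lambda_{\mathbb{N}}},$$

which is where the posterior $[\dots:0:1:0:0:\dots]$ comes from.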
Now take a point $(x,y)$ of the square with rational coordinates: if we've been communicated $\Lambda_{\mathbb{N}}$ bits about $x$, we clearly have learned nothing about $y$, and pinpointing the specific point seems to require $\Lambda_{\mathbb{N}}$ more bits.
However, there's a bijection between $\mathbb{Q}^2$ and $\mathbb{N}$, so we can assign a unique natural number to any point of the square and therefore communicate it in $\Lambda_{\mathbb{N}}$ bits in expectation, without any factor of 2.
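As a concrete sketch, here is the classic Cantor pairing function together with its inverse: under some fixed enumeration of $\mathbb{Q}$, the indices of $x$ and $y$ get fused into a single natural number and recovered again, so nothing is lost (the example indices below are arbitrary placeholders):

```python
import math

def pair(a: int, b: int) -> int:
    """Cantor pairing: a bijection between pairs of non-negative integers
    and non-negative integers."""
    return (a + b) * (a + b + 1) // 2 + b

def unpair(z: int) -> tuple[int, int]:
    """Inverse of the Cantor pairing: recovers both coordinates from one number."""
    w = (math.isqrt(8 * z + 1) - 1) // 2   # largest w with w*(w+1)/2 <= z
    b = z - w * (w + 1) // 2
    return w - b, b

# One natural number encodes both coordinates, so describing a point of Q^2
# still costs Lambda_N bits rather than 2 * Lambda_N.
x_index, y_index = 1729, 42    # hypothetical indices of x and y under some enumeration of Q
z = pair(x_index, y_index)
assert unpair(z) == (x_index, y_index)
```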
When I tried exploring some more, I convinced myself that the greater uncertainty ($\Lambda_{\mathbb{R}}$, the communication of one real number) makes the smaller ones ($\Lambda_{\mathbb{N}}$) negligible, and that evidence for a natural number can presumably be squeezed into the communication of a real value. That also makes the direction look unpromising.
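One way to see that absorption, under the same bookkeeping: a natural number has a finite, self-delimiting code (e.g. the Fibonacci code above), so it can simply be prepended to the infinite bit stream describing a real number, and the combined message still needs only one real number's worth of bits,

$$\Lambda_{\mathbb{N}}+\Lambda_{\mathbb{R}}\sim\Lambda_{\mathbb{R}}.$$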
However, there may still be a continuation: are there books or articles on how information is quantified given a distribution function?