JoshuaZ comments on Open thread for December 17-23, 2013 - Less Wrong

5 Post author: ciphergoth 17 December 2013 08:45PM


Comment author: JoshuaZ 18 December 2013 04:39:12AM 4 points [-]

There's been some prior discussion here about the problem of uncertainty about mathematical statements. Since most standard priors (e.g. Solomonoff) assume that one can do a large amount of unbounded arithmetic, it is difficult to assign confidence to, say, 53 being prime, and likewise for open mathematical problems (e.g. how does one estimate the probability that the Riemann hypothesis is provable in ZFC?). The problem of bounded rationality here seems serious.

I've run across something that may be related, and at minimum seems hard to formalize. For a mathematical statement A, let F(A) be "A is provable in ZFC" (you could use some other axiomatic system, but this seems fine for now). Let G(A) be "A will be proven in ZFC by 2050". Then one can give examples of statements A and B where it seems like P(F(A)) is larger than P(F(B)) but the reverse holds for P(G(A)) and P(G(B)).

The example that originally came to mind is technical: let A be the statement "ZPP is contained in P^X where X is an oracle for graph isomorphism" and let B be the statement "ZPP is contained in P^Y where Y is an oracle that answers whether Ackermann(n)+1 has an even or odd number of distinct prime factors." The intuition here is that one expects Ackermann(n)+1 to be essentially random in the parity of its number of distinct prime factors, and a strong source of pseudorandom bits forces collapse of ZPP. However, actually proving that Ackermann(n)+1 acts this way looks completely intractable. In contrast, there's no strong prior reason to think graph isomorphism has anything to do with making ZPP type problems easier (aside from some very minor aspects) but there's a lot of machinery out there that involves graph isomorphism and people thinking about it.

So, is this sort of thing meaningful? And are there other more straightforward, less complicated or less technical examples? I do have an analog involving not math but space exploration. P(Life on Mars) might be lower than P(Life on Europa) even though P(We discover life on Mars in the next 20 years) might be higher than P(We discover life on Europa in the next 20 years) simply because we send so many more probes to Mars. Is this a helpful analog or is it completely different?
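The Mars/Europa asymmetry can be made concrete with a toy decomposition (all numbers below are invented purely for illustration, not real estimates):

```python
# Decompose P(discover life within 20 years) as
#   P(life exists) * P(our missions detect it | life exists).
# Hypothetical priors: Europa looks like the better bet for life itself.
p_life = {"Mars": 0.05, "Europa": 0.20}

# Detection probability tracks mission effort: many probes go to Mars,
# very few to Europa.
p_detect_given_life = {"Mars": 0.50, "Europa": 0.02}

p_discover = {w: p_life[w] * p_detect_given_life[w] for w in p_life}
# Mars: 0.025, Europa: 0.004 -- the ordering of P(discover) flips
# relative to the ordering of P(life), mirroring P(G) vs P(F) above.
```

The point of the sketch is just that conditioning on effort can reverse the ranking given by the priors alone.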

Comment author: Khoth 19 December 2013 07:35:51AM *  4 points [-]

How about the statements:

A: "The number of prime factors of 4678946132165798721321 is divisible by 3"

B: "The number of prime factors of 987621698732657896873267896843212687784984154654685432159878453213659873198765416416341587498736741457481265896813218268787921687651685765164549687962165468765632132185913574684613213557 is divisible by 2"

P(F(A)) is about 1/3 and P(F(B)) is about 1/2.

But it's far more likely that someone will bother to prove A, just because the number is much smaller.

ETA: To clarify, I don't expect it to be particularly hard to prove or disprove, I just don't think anyone will bother.
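Khoth's 1/3 and 1/2 figures rest on the heuristic that the number of prime factors of a "random" large integer is roughly equidistributed modulo small numbers. A minimal sketch of that heuristic (using sympy and counting factors with multiplicity; the distinct-factor count behaves similarly):

```python
from collections import Counter

from sympy import factorint

def big_omega(n):
    # Number of prime factors of n, counted with multiplicity.
    return sum(factorint(n).values())

# Tally big_omega(n) mod 3 over a modest range. The fractions come out
# roughly (not exactly) 1/3 each, in line with the heuristic behind
# P(F(A)) ~ 1/3; equidistribution improves as n grows.
counts = Counter(big_omega(n) % 3 for n in range(2, 20000))
total = sum(counts.values())
fractions = {r: counts[r] / total for r in (0, 1, 2)}
```

For small n the residues are noticeably non-uniform (small factor counts dominate), so this is a plausibility check, not a proof of the 1/3 figure.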

Comment author: ChristianKl 19 December 2013 04:32:38PM 0 points [-]

Whether someone will bother really depends on why someone wants to know. You can simply type "primefactors of 9876216987326578968732678968432126877" into Wolfram Alpha and get your answer. It's no harder than typing "primefactors of 4678946132165798721321" into Wolfram Alpha.

Comment author: Oscar_Cunningham 19 December 2013 06:31:52PM 3 points [-]

I don't know if this was due to an edit, but the second number in Khoth's post is far larger than 9876216987326578968732678968432126877, and indeed Alpha won't factor it.

To be honest I'm sort of surprised that Alpha is happy to factor 4678946132165798721321, I'd have thought that that was already too large.

Comment author: Khoth 19 December 2013 05:42:54PM 0 points [-]

The reason nobody will bother is that it's just one 200-digit number among 10^200 similar numbers. Even if you care about one of them enough to ask Wolfram Alpha, it's vanishingly unlikely to be that particular one.

Comment author: RolfAndreassen 19 December 2013 04:45:06PM 0 points [-]

Technically it is harder, since there are more digits; apart from the additional work involved, this also creates more opportunities for mistakes. And of course the computer at the other end will have to do more work.

Comment author: DanielLC 18 December 2013 06:10:16AM 1 point [-]

If a hypothesis is new, it's likely to be proven or disproven quickly. An old one, like the Riemann hypothesis, that people have tried and failed to prove or disprove, probably won't be settled any time soon. Thus, it's not hard to find something that is more likely to be proven quickly than the Riemann hypothesis but is still less likely to be true.

Comment author: Anatoly_Vorobey 18 December 2013 09:42:49PM 0 points [-]

Let A = "pi is normal", and B = "pi includes as a contiguous block the first 2^128 digits of e". B is more likely to be provable in ZFC, simply because A implies B but not vice versa. A is vastly more likely to be proven by 2050. Is this a valid example, or do you see it as cheating in some way?

I'm not sure if this question is meaningful/interesting. It may be, but I'm not seeing it.

Comment author: JoshuaZ 19 December 2013 02:13:40AM 4 points [-]

Suggested repair of your example: A= "Pi is normal" and B= "Pi includes as a contiguous block the first 2^128 digits of e within the first Ackermann(8) digits" which should do something similar.
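The repaired example can be poked at empirically on a tiny scale: checking how long a prefix of e's digits actually appears in a modest stretch of pi's digits. A sketch using mpmath (the digit counts here are arbitrary choices, nowhere near 2^128 or Ackermann(8)):

```python
from mpmath import mp

# Compute pi and e to a modest precision and extract their digit strings.
mp.dps = 100_000
pi_digits = str(mp.pi).replace('.', '')  # "31415926535..."
e_digits = str(mp.e).replace('.', '')[:12]  # "271828182845..."

# Longest prefix of e's digits found somewhere in the first ~100,000
# digits of pi. Short prefixes are essentially guaranteed to occur;
# each extra digit cuts the expected number of hits by a factor of 10.
longest = max(k for k in range(1, 10) if e_digits[:k] in pi_digits)
```

This illustrates why the bounded version (B) is checkable in principle while saying nothing about normality (A): the search is finite, but its cost explodes with the prefix length and digit window.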

Comment author: Oscar_Cunningham 18 December 2013 10:25:53PM 4 points [-]

Doesn't the fact that A implies B mean that it's very easy to prove B once you've proved A?

Comment author: Anatoly_Vorobey 18 December 2013 10:46:17PM 4 points [-]

You're right, I blundered and this example is no good.