
paulfchristiano comments on What independence between ZFC and P vs NP would imply - Less Wrong Discussion

1 Post author: alexflint 08 December 2011 02:30PM


Comment author: paulfchristiano 08 December 2011 06:02:12PM 10 points [-]

I can write down a polynomial time algorithm which solves SAT (for all but finitely many instances at least), provided that in fact P = NP. The difficulty is proving that this particular algorithm works, which could easily be true but independent of ZFC.

Comment author: dlthomas 08 December 2011 07:23:45PM 1 point [-]

Elaborate, please.

Comment author: paulfchristiano 08 December 2011 08:07:51PM *  8 points [-]

I'll write down an algorithm that solves SAT in time N^k if any algorithm does.

On SAT instances of size T: enumerate all algorithms of size up to log(T); for each one, consider all SAT instances of size up to log(T); run each algorithm for time log(T)^k, outputting 0 if it doesn't halt in that time, and compare its output to the result of a brute-force search.

Take the shortest algorithm which passed all of these tests, and use it to answer your original SAT query of size T (aborting and outputting 0 if it takes more than T^k time).

This entire process takes time poly(T). Moreover, suppose there is an N^k-time algorithm which solves SAT correctly, and let A be the shortest such algorithm. Then there is a finite constant T0 such that every algorithm shorter than A fails on some SAT instance of size at most T0. The algorithm described above therefore works correctly on all SAT instances of size at least 2^(T0 + |A|).

(Edit: this argument is silly; you can just run all programs of size log(T) and output a solution if any of them find one)

Edit: Note in particular that the algorithm which takes k = BB(10^10) and instead does a brute-force search for T < BB(10^10) is guaranteed to solve SAT in poly time, provided that any algorithm does. Realistically, I think taking k = 10 and doing a brute-force search for T < 10^10^10 is virtually guaranteed to solve SAT in poly time, if any algorithm does (and I can actually write down this latter algorithm).
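The vet-then-select scheme above can be sketched in Python. This is an illustrative toy, not real program enumeration: the `candidates` list of (size, function) pairs stands in for "all algorithms of size up to log(T)", `small_instances` stands in for "all SAT instances of size up to log(T)", and the T^k / log(T)^k time cutoffs are omitted; the names `brute_force_sat`, `levin_style_sat`, and `always_yes` are all made up for this sketch. Clauses are lists of signed integers (+i for x_i, -i for its negation).

```python
from itertools import product

def brute_force_sat(clauses, n_vars):
    """Ground truth: exhaustively try all 2^n assignments.
    A clause is a list of nonzero ints; +i means x_i, -i means NOT x_i."""
    return any(
        all(any(a[abs(lit) - 1] == (lit > 0) for lit in clause)
            for clause in clauses)
        for a in product([False, True], repeat=n_vars)
    )

def levin_style_sat(clauses, n_vars, candidates, small_instances):
    """Vet each candidate (in order of description size) against brute
    force on every small instance; answer the real query with the
    shortest survivor. (Time cutoffs from the comment are omitted.)"""
    for _, cand in sorted(candidates, key=lambda p: p[0]):
        if all(cand(c, k) == brute_force_sat(c, k)
               for c, k in small_instances):
            return cand(clauses, n_vars)
    return brute_force_sat(clauses, n_vars)  # no candidate survived vetting

# Demo: a bogus "short" candidate is weeded out by the vetting step.
always_yes = lambda clauses, n_vars: True          # short but wrong
candidates = [(1, always_yes), (5, brute_force_sat)]
small_instances = [([[1]], 1), ([[1], [-1]], 1)]   # x; x AND NOT x
query = [[1, 2], [-1], [-2]]                       # unsatisfiable
print(levin_style_sat(query, 2, candidates, small_instances))  # False
```

The point of the construction survives the simplification: any candidate that errs on some small instance is eventually excluded by vetting, so for large enough queries only correct (on the tested range) candidates are consulted.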

Comment author: AlephNeil 08 December 2011 08:25:33PM *  1 point [-]

Perhaps a slightly simpler way would be to 'run all algorithms simultaneously' such that each one is slowed down by a constant factor. (E.g. at time t = (2x + 1) * 2^n, we do step x of algorithm n.) When algorithms terminate, we check (still within the same "process" and hence slowed down by a factor of 2^n) whether a solution to the problem has been generated. If so, we return it and halt.

ETA: Ah, but the business of 'switching processes' is going to need more than constant time. So I guess it's not immediately clear that this works.

Comment author: dlthomas 08 December 2011 08:13:31PM 0 points [-]

Very interesting.