Manfred comments on Thoughts on the Singularity Institute (SI) - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
On what measure of difficulty are you basing this? We have some guys around here doing a pretty good job.
I phrased that with too much certainty. While I have little if any reason to see fully-reflective decision theory as an easier task than self-consistent infinite set theory, I also have no clear reason to think the contrary.
But I'm trying to find the worst scenario that we could plan for. I can think of two broad ways that Eliezer's current plan could be horribly misguided:
Now, SI does seem aware of both problems, at least on a technical level. The fact that Eliezer went out of his way to help critics understand Löb's Theorem, and that he keeps mentioning said theorem, seems like a good sign. But should I believe that SI is doing enough to address #2? Why?