
Stuart_Armstrong comments on An Oracle standard trick - Less Wrong Discussion

Post author: Stuart_Armstrong, 03 June 2015 02:17PM




Comment author: Epictetus, 04 June 2015 10:08:31PM, 1 point

Would there be any unintended consequences? I'm worried that holding an incorrect belief may cause the Oracle to lose accuracy in other areas.

For instance, if accuracy is defined in terms of the reaction of the first person to read the output, and that person is isolated from the rest of the world, then we can get the Oracle to act as if it believed a nuclear bomb was due to go off before the person could communicate with the rest of the world.

In this example, would the imminent nuclear threat affect the Oracle's reasoning process? I'm sure there are some questions whose answers could vary depending on the likelihood of a nuclear detonation in the near future.

Comment author: Stuart_Armstrong, 05 June 2015 11:46:46AM, 2 points

The Oracle does not possess inaccurate beliefs. See http://lesswrong.com/lw/ltf/false_thermodynamic_miracles/ and http://lesswrong.com/r/discussion/lw/lyh/utility_vs_probability_idea_synthesis/ . Note that I've always been careful to phrase it as "act as if it believed" rather than "believed".
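The distinction above can be illustrated with a toy sketch (my own, not from the linked posts; the event, payoffs, and probabilities are all made-up). The idea is that reweighting an agent's utility by the indicator of an event W makes it choose the same action as an agent that genuinely conditioned its beliefs on W, while its actual probability estimates stay untouched:

```python
# Toy illustration: an agent can "act as if it believed" event W without
# holding any inaccurate belief, by having its utility multiplied by the
# indicator of W. Event, actions, and payoffs are hypothetical.

worlds = [("W", 0.01), ("not_W", 0.99)]   # W is genuinely unlikely
actions = ["prepare_for_W", "ignore_W"]

PAYOFF = {  # hypothetical utility table: utility(action, world)
    ("prepare_for_W", "W"): 10, ("prepare_for_W", "not_W"): 2,
    ("ignore_W", "W"): 0,       ("ignore_W", "not_W"): 5,
}

def best(score):
    return max(actions, key=score)

# Ordinary expected utility: beliefs are accurate, so W is mostly ignored.
eu = lambda a: sum(p * PAYOFF[(a, w)] for w, p in worlds)

# Utility reweighted by the indicator of W; the probabilities are unchanged.
eu_reweighted = lambda a: sum(p * PAYOFF[(a, w)] * (w == "W") for w, p in worlds)

# What an agent that genuinely believed W (probability 1) would maximize.
eu_believer = lambda a: PAYOFF[(a, "W")]

print(best(eu))            # -> ignore_W (accurate beliefs, W unlikely)
print(best(eu_reweighted)) # -> prepare_for_W (acts as if it believed W)
print(best(eu_believer))   # -> prepare_for_W (same choice as the reweighted agent)
```

The reweighted agent and the true believer pick the same action, yet the reweighted agent's probability for W is still the accurate 1%, which is the sense in which the trick avoids giving the Oracle false beliefs.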