Dorikka comments on Stupid Questions February 2015 - Less Wrong Discussion
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Or you could just be the type of person who would tell it to go fuck itself, try to destroy it, and leave it boxed or maximally constrained if you can't destroy it. If you cannot credibly commit to this or a similar threat-resistant strategy, no one should ever let you near a boxed AI, and you should never want to go near one, since you would likely be using a suboptimal strategy.