siduri comments on To what degree do you model people as agents? - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
I find this comment...very, very fun, and very, very provocative.
Are you up for -- in a spirit of fun -- putting it to the test? Like, people could suggest goals the successful completion of which would potentially label them as "an Intelligence" according to Eliezer Yudkowsky -- and then you would outline how you would do it? And if you either couldn't predict the answer, or we did it in a way different enough from your predictions (as judged by you!), we'd get bragging rights thereafter. (So for instance, we could put in email sigs, "An intelligence, certified by Eliezer Yudkowsky." That kind of thing.)
A few goals right off the top of my head:
Basically the idea is that we get to posit things we think we know how to do and you don't... and you get to posit things that you don't know how to do but would like to...and then if we "win" we get bragging rights.
There are pretty obviously some twisted incentives here (mostly in your favor!), but we'll just have to assume that you're a man of honor. And by "a man of honor" I mean "a man whose reputation is worth enough that he won't casually throw a match."
I dunno, does that sound fun to anybody else?