Exactly - if anything I am trying to make the job seem less appealing than it will be, so we attract only the right kind of person.
I see people are highly upvoting the post, and even correcting for Bostrom's halo effect, I'm updating a bit in the direction of you being right. I also see that you've followed Lachouette's suggestion, and I like it.
I would be genuinely curious to hear whether it worked as intended in the end; it might change the way I conduct job interviews a bit. (I realize, of course, that this is an irrelevant request that will probably not be met.)
Best of luck with the recruiting.
If funding were available, the Centre for Effective Altruism would consider hiring someone to work closely with Prof Nick Bostrom to provide anything and everything he needs to be more productive. Bostrom is, of course, the Director of the Future of Humanity Institute at Oxford University and the author of Superintelligence, the best guide yet to the possible risks posed by artificial intelligence.
Nobody has yet confirmed that they will fund this role, but we are nevertheless interested in receiving expressions of interest from suitable candidates.
The list of required characteristics is hefty, and the position would be a challenging one:
The research Bostrom can do is unique; to my knowledge, no one else has made such significant strides in clarifying the biggest risks facing humanity as a whole. As a result, increasing Bostrom's output by, say, 20% would be a major contribution. This person's work would also help the rest of the Future of Humanity Institute run smoothly.