lessdazed comments on Students asked to defend AGI danger update in favor of AGI riskiness - Less Wrong Discussion
This is less than surprising. I can't think of any threat already present in the minds of some undergraduates that a competent, believing professor with required attendance couldn't, on average, amplify. Control groups are needed.
What would you want to do with the control groups? Teach them that AGI won't destroy the world? Not teach them anything in particular about AI? Teach them that invading aliens will destroy the world, or that the biblical End Times are near? Any of these would yield useful information. Which one(s) do you favor?
I was specifically thinking of those exact two conditions, which is why I said "groups": they are different in kind. The aliens example is even better than the supernatural End Times one.
I thought of, but rejected, "Teach them that AGI won't destroy the world" because I couldn't think of how to implement it neutrally. How would one do that?
True. Most arguments against the AGI-apocalypse scenario are responses to arguments for it; it would be difficult to present only one side of the question.