WrongBot comments on Some Thoughts Are Too Dangerous For Brains to Think - Less Wrong

15 Post author: WrongBot 13 July 2010 04:44AM




Comment author: Tyrrell_McAllister 14 July 2010 01:32:53AM 3 points

This is an example of a rather different kind of thing from what WrongBot is talking about.

I think you can tweak a few parameters of my example to get something more like WrongBot's examples. My example had a known deadline, after which I knew it would be safe to read the reports. But suppose that I didn't know exactly when it would be safe to read the reports. My current project is the sort of thing where I don't yet know when I will have done enough. I don't yet know what the conditions for success are, so I don't yet know what I need to do to create safe conditions for reading the reports. It is possible that it will never be safe to read them, that I will never be able to afford the distraction of suppressing my brain's desire to compose responses.

My understanding is that WrongBot views group-intelligence differences analogously. The argument is that it's not safe to learn such truths now, and we don't yet know what we need to do to create safe conditions for learning these truths. Maybe we will never find such conditions. At any rate, we should be very careful about exposing our brains to these truths before we've figured out the safe conditions. That is my reading of the argument.

Comment author: WrongBot 14 July 2010 03:35:46AM 4 points

More or less. I'm generally optimistic enough about the future that I don't think there are kinds of true knowledge that will remain dangerous indefinitely; I'm just trying to highlight things that I think might not be safe right now, while we're all stuck doing serious thinking with opaquely-designed sacks of meat.