Dmytry comments on Taking Ideas Seriously - Less Wrong

51 Post author: Will_Newsome 13 August 2010 04:50PM


Comment author: Grognor 22 March 2012 01:53:43PM 0 points [-]

The compartmentalization of information is anything but safe.

I agree in most cases; however, there are some cases where ideas are so Big and Scary and Important that fully propagating them through your explicit reasoning causes you to go nuts. This has happened to multiple people on Less Wrong, whom I will not name for obvious reasons.

I would like to emphasize that I agree in most cases. Compartmentalization is bad.

Comment author: Dmytry 22 March 2012 02:04:24PM *  0 points [-]

I think it happens due to ideas being wrong and/or being propagated incorrectly. Basically, you would need extremely high confidence in a very big and scary idea before it can overwrite anything. The MWI is very big and scary. Provisionally, before I develop a moral system based on MWI, it is perfectly consistent to assume that MWI has some probability q of being wrong, and that the relative morality of actions, unknown under MWI but known under SI, does not change; consequently, no moral decision (involving a comparison of moral values) changes before there is a high-quality moral system based on MWI. A quick-hack moral system based on MWI is likely to be considerably incorrect and to lead to rash actions (e.g. quantum suicide, which may turn out to be as bad as normal suicide after you figure stuff out).
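The argument above can be sketched as a toy expected-value calculation. This is only an illustration of the reasoning, not anything from the comment itself; the function name, the numeric values, and the choice to model the unknown MWI values as a single constant are all assumptions made for the sketch:

```python
# Toy sketch: with probability q, MWI is wrong and the familiar SI-based
# values apply; with probability 1 - q, MWI is right but its action values
# are unknown. If the unknown MWI values are modeled as the same constant
# for every action, the expected-value ORDERING of actions is determined
# entirely by the known SI values -- so no moral comparison changes yet.

def expected_value(v_si: float, q: float, v_mwi_unknown: float = 0.0) -> float:
    """Expected moral value of an action: q times the known SI value,
    plus (1 - q) times the unknown MWI value (a constant placeholder)."""
    return q * v_si + (1 - q) * v_mwi_unknown

# Hypothetical SI values for two actions (illustrative numbers only).
action_a, action_b = 3.0, 1.0
q = 0.2  # hypothetical probability that MWI is wrong

# Whatever constant stands in for the unknown MWI value, the comparison
# between the two actions comes out the same way:
assert expected_value(action_a, q) > expected_value(action_b, q)
assert expected_value(action_a, q, 5.0) > expected_value(action_b, q, 5.0)
```

The point of the sketch is only that a shared unknown term cancels out of any pairwise comparison, which is why decisions can provisionally stay as they were under SI.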

A ship is compartmentalized against a hole in the hull, not against something great happening to it. An incorrect idea held with high confidence can be a hole in the hull, with the inrushing water being the resulting nonsense overriding the system.