
Tem42 comments on Survey Article: How do I become a more interesting person? - Less Wrong Discussion

Post author: casebash, 18 October 2015 10:04AM




Comment author: gjm 21 October 2015 01:37:11PM 0 points

Nuclear war could do us a lot of damage, but it's pretty unlikely to drive us completely extinct. And I think nuclear war -- especially the sort of really big nuclear war that has any chance of driving the human race near to extinction -- is fairly unlikely because it's so obviously not in anyone's interest.

(Note that I didn't claim that those predictions are certainly right.)

Tangentially, it occurs to me that large-scale nuclear annihilation might make for interesting bullet-biting test cases for exotic decision theories. Suppose, e.g., that you're interacting with some other agent and you can see one another's source code (or have other pretty reliable insight into one another's behaviour). A situation might arise in which your best course of action is to make a credible threat that in such-and-such circumstances you will destroy the world (meaning, e.g., launch a large-scale nuclear attack that will almost certainly result in almost everyone on both sides dying, etc.). Of course those circumstances have to be very unlikely given your threat. Theories like TDT then say that in those circumstances you should in fact destroy the world, even though at that point there is no possible way for doing so to help you. So, do you do it?

(UDT, which I think is the generally preferred TDT-like theory these days, says more precisely that you should arrange to be governed by an algorithm that in those circumstances will destroy the world. What you do if those circumstances then arise isn't a separate question. I think that takes some of the psychological sting out of it -- though deliberately programming yourself so that in some foreseeable situations you will definitely destroy the world is still quite a bullet to be biting.)
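The commitment argument above can be made concrete with a toy model. The sketch below is purely illustrative, with hypothetical payoff numbers and function names of my choosing: "reading the other agent's source code" is modeled as simply calling the threatener's policy function, and the two threatener policies correspond to the committed (UDT-style) and non-committed agents described in the comment.

```python
# Toy model of a credible threat between agents who can read each other's
# source code. All payoffs and names here are illustrative assumptions.

PAYOFFS = {
    # (opponent_move, threat_carried_out) -> (threatener_payoff, opponent_payoff)
    ("comply", False): (5, 3),
    ("defect", False): (0, 10),
    ("defect", True): (-1000, -1000),  # the world is destroyed; everyone loses
}

def committed_threatener(opponent_move):
    """A UDT-style agent: its source code really does carry out the threat."""
    return opponent_move == "defect"  # True means "destroy the world"

def flinching_threatener(opponent_move):
    """A CDT-style agent: at decision time, destruction can't help it,
    so it never follows through."""
    return False

def best_response(threatener):
    """The opponent 'reads the threatener's source code' (here: calls it
    directly) and picks the move maximizing its own payoff."""
    def opponent_payoff(move):
        carried_out = threatener(move)
        return PAYOFFS[(move, carried_out)][1]
    return max(["comply", "defect"], key=opponent_payoff)

print(best_response(committed_threatener))  # 'comply' -- the threat deters
print(best_response(flinching_threatener))  # 'defect' -- no credibility, no deterrence
```

The bullet being bitten shows up in the code: the committed agent wins exactly because `committed_threatener` returns `True` on defection, i.e. it would genuinely destroy the world in the (now unlikely) branch where the opponent defects anyway.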

Comment author: Tem42 21 October 2015 11:38:50PM -1 points

Stanislav Petrov may be relevant here.