MrMind comments on Open thread, Jul. 25 - Jul. 31, 2016 - Less Wrong Discussion

3 Post author: MrMind 25 July 2016 07:07AM

Comment author: Romashka 27 July 2016 12:59:51PM 0 points [-]

You know, it would be funny to imagine a world where knowledge is passed through some oracle by a so-called antiport: you get a true, non-malignant, relevant answer to your question about what happens at time X, but everybody forgets one and the same random thing, of unknown meta-ness, until that moment. :)

Comment author: MrMind 04 August 2016 09:15:49AM 0 points [-]

I'm not sure I understand the model you are proposing, can you elaborate with a concrete example? It might be interesting enough to come up with a short story about it.

Comment author: Romashka 17 August 2016 06:57:47AM 1 point [-]

I can't really imagine information disappearing... Maybe something like: "I will answer if you taboo a certain notion until a certain time in the future, and I will not say more unless you agree. If you agree and then defect, the answer will become false as soon as I can make that happen, and there will be no further transactions"?

Comment author: MrMind 18 August 2016 02:28:02PM 0 points [-]

I think I can make this work :)