
PhilipL comments on Ritual Report: Boston Solstice Celebration - Less Wrong Discussion

8 Post author: Vika 27 December 2013 03:28PM


Comment author: [deleted] 27 December 2013 08:40:41PM 5 points [-]

Nitpick: Does it bug anyone else to apply the Litany of Tarski on statements with undefined truth values (e.g. numbers 2-4 above), or is it just me?

Comment author: somervta 27 December 2013 10:24:03PM 2 points [-]

If by undefined you mean that we don't know what the value is, then no, it doesn't bother me. If by undefined you mean that they have no truth value in the standard sense, then I don't think they are undefined.

Comment author: Raemon 28 December 2013 02:00:55AM 0 points [-]

Yeah, I actually regretted those choices last year. I ended up not using the Litany of Tarski at the Big Solstice this year, but if I had, I'd have stuck to statements whose truth couldn't depend on what people believed the truth was.

Comment author: Nisan 28 December 2013 09:06:13PM *  2 points [-]

Oh, I'm surprised. Do you mean that, e.g., being "outcompeted by simulated brains in a Malthusian hellhole race to the bottom" is less likely if more people believe that being "outcompeted by simulated brains in a Malthusian hellhole race to the bottom" is more likely, thus leading to an inconsistency?

Awareness-raising is really important and high-value, but unfortunately it only makes a marginal dent in x-risk. I mean, it may be rational to anticipate a scarily high probability of a Malthusian hellhole race to the bottom whether or not we personally try to stop it. The Solstice hymnal and ritual made me realize that on an emotional level.

On the other hand, maybe the probability of existential disaster conditional on our best efforts is a thought that's too demoralizing even for a Solstice ritual.

Comment author: Nisan 28 December 2013 09:13:18PM 2 points [-]

Fun fact! Paradoxical propositions that are true if and only if you don't believe them are at the heart of Tarski's theorem on the undefinability of truth, and MIRI has figured out a way to make sense of them. Basically, if proposition P is true if and only if you assign less than 10% probability to P, then you ought to assign probability 10% to P, and you ought to believe that you assign probability "approximately 10%" to P.
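The fixed-point claim above can be illustrated numerically. This is a minimal sketch, not MIRI's actual construction: the noise model (a hypothetical uniform introspective uncertainty of width `eps`, meaning you only know your own credence to within `eps`) and the function names are my own simplification. Under that model, "P is true iff your actual credence in P is below 10%" becomes a well-behaved map whose unique consistent credence approaches exactly 10% as the introspective noise shrinks.

```python
def truth_prob(credence, eps):
    # P is true iff your *actual* credence in P is below 0.10.
    # Introspective noise: you only know your credence lies uniformly
    # in [credence - eps, credence + eps].  The probability that P is
    # true is then the fraction of that interval below the threshold.
    lo, hi = credence - eps, credence + eps
    return min(1.0, max(0.0, (0.10 - lo) / (hi - lo)))

def fixed_point(eps, tol=1e-12):
    # A consistent credence p satisfies p == truth_prob(p, eps).
    # g(p) = truth_prob(p, eps) - p is strictly decreasing with
    # g(0) > 0 and g(1) < 0, so the unique root is found by bisection.
    # (Naive iteration p -> truth_prob(p) would oscillate: the slope
    # at the fixed point is steeply negative -- the "paradox".)
    lo, hi = 0.0, 1.0
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if truth_prob(mid, eps) > mid:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

for eps in (0.05, 0.01, 0.001):
    print(eps, round(fixed_point(eps), 4))
```

With `eps = 0.01` the consistent credence works out to about 0.108, and shrinking `eps` drives it toward exactly 0.10: you assign probability 10% to P, while believing only that your assignment is "approximately 10%", just as the comment describes.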

Comment author: Raemon 29 December 2013 09:17:24PM 1 point [-]

Huh. Couple thoughts:

1) Solstice is meant to be scary. (How scary exactly depends on which crowd we're doing it for.) "The world may end and it may in fact be dependent on our actions" is a primary point of it.

2) You're on a short list of people who have described the Solstice as actually helping them realize things on an emotional level, which was an intended purpose. So, good to know.

3) On one hand, the probability of "Outcompeted by simulated brains in a Malthusian Hellhole race to the bottom" may not depend that much on our personal actions, and framing the question that way is useful. But I did find it distracting to notice that the outcome depended at least somewhat on my beliefs, and might also depend on the collective beliefs of everyone who attends Solstices, and that I should take responsibility for that.

I also think it's useful to distinguish between Epistemic Rationality Rituals and Instrumental Rationality Rituals.

4) Re: your other comment about Tarski's theorem - interesting. Kind of wrapping my brain around that now.

Comment author: Vika 27 December 2013 09:32:25PM 0 points [-]

Why do these statements have undefined truth values?

Comment author: [deleted] 27 December 2013 11:48:50PM 1 point [-]

See somervta's comment above. But I disagree with them on their second point.

If, in response to "If I'm (not) going to be outcompeted by simulated brains, I desire to (not) believe...", I asked you "Am I going to be outcompeted by simulated brains?", you probably wouldn't say "yes" or "no". There's no territory to match up with the map, i.e. your belief about whether or not we'll be outcompeted.

I don't know... Maybe people define territory differently, to include events that haven't happened and things that don't exist yet?

Comment author: Nisan 28 December 2013 12:47:19PM *  1 point [-]

Yep! Check out the B-theory of time.

Comment author: Vika 28 December 2013 07:56:52PM -1 points [-]

You can say something like "if I am going to be outcompeted by simulated brains in X% of Everett branches", which is part of the territory (if you accept many-worlds), but is not verifiable. I agree that it's better to stick with testable statements, especially if introducing people to the Litany of Tarski, so we will be more careful with this for next year's Solstice.

Comment author: shminux 27 December 2013 10:08:22PM -1 points [-]

What numbers and where are you referring to? I only see a bunch of song titles.

Comment author: Vika 27 December 2013 10:25:38PM 0 points [-]

PhilipL is referring to the second, third and final Litanies of Tarski (in the Twilight section and the second Light section).

Comment author: shminux 28 December 2013 05:43:49AM *  -1 points [-]

Ah, I see, thanks. I have to agree with PhilipL that applying the template to a possible future event turns the original meaning upside down. Unless, perhaps, you subscribe to Eliezer's idiosyncratic timeless "block universe" view.

Comment author: alicey 28 December 2013 05:03:57PM -1 points [-]

note: shminux is a particularly vocal individual who strongly disagrees with the timeless "block universe" model

Comment author: shminux 28 December 2013 07:14:44PM -1 points [-]

I don't agree or disagree with untestables.