It's my contention that rationality should offer guidance in figuring out what goals you should have. A rationalist society will have goals closer to "defeat death and grasp the stars" than "gain ALL the status". It's not just rationalists who should win; it's rational societies that should win. If you're in a society that is insane, then you may not be able to "win" as a rationalist. In that case your goal should not be "winning" in the social-traditional sense; it should be making society sane.
You're privileging your values when you judge which society - the status game players versus the immortal starfarers - is "winning".
Will Nick and Laura be there?
Tragically no. Sorry I never got back to you last week, I didn't get your text until the next day.
We both have a super busy fall semester, so we've been missing the meetups.
You formulate it as if reading the Sequences were a necessary condition for participating in LW. It isn't.
It's not a written rule by any means, but it's usually a good idea, in order to acclimate to the style and reduce inferential distances.
Hot diggity damn. I missed the announcement of the last Austin meetup and thought I was going to have to start a chapter myself. I will definitely be there.
Hey! That's great. Excited to meet you :)
We will probably have another next week at the same time. I will PM you whenever I settle upon new dates.
I won't claim it was hugely successful - no one else showed up - but I did get a couple of people at my hackerspace interested, along with some vague promises of people coming next week. Still, I think there's potential here. We'll see.
I'm sorry I missed it. I'll check in regularly for info on the next one.
Austin LW'er here. I totally would have come to this, but somehow I missed the announcement in the meta-meetup thread. I will watch for the next one of these and absolutely be there.
Edit: Wait... why was this downvoted?
Great meetup! I liked that the others were so willing to work with me on instantiating correct reasoning for my cost-benefit analysis for the "Rationality Boot Camp". Person:SilasBarta was not nearly as anti-paperclip in physical interaction as User:SilasBarta is on this internet website.
The humans I met really are a credit to your race!
It was great to meet you and the others Clippy!
Seeing you and User:SilasBarta interact reminded me that one must be careful when interpreting tone in internet comments. You were both polite and reasonable, and I thoroughly enjoyed the discussions.
Thank you. You are a good human, and a large part of why I have not canceled in light of User:SilasBarta's racist comments. See you at the meetup!
I look forward to meeting you (or your representative) tomorrow :)
I don't think that's a bad thing. The immortal starfarers necessarily go somewhere; the status game players don't necessarily go anywhere. Hence "winning". The point of the post was to warn that we have to tackle not only answering our questions but also figuring out which questions we should ask. We have to figure out what winning should be.
The reason the immortal starfarers are better is that they're trying to do that, so if all values aren't created equal, they're more likely to find out about it.
Deciding that going somewhere counts as "winning" comes from your existing utility function. Another person could judge that the civilization with the richest and most complex social hierarchy "wins".
Rationality can help you search the space of actions, policies, and outcomes for those which produce the highest value for you. It cannot help you pass objective judgment on your values, or discover "better" ones.