
Shit Rationalists Say?

32 Post author: Eneasz 25 January 2012 09:51PM

I assume everyone has run across at least one of the "Shit X's Say" format of videos, such as Shit Skeptics Say. When done right, it totally triggers the in-group warm-fuzzies. (Not to be confused with the nearly identically formatted "Shit X's Say to Y's", which is mainly a way for Y's to complain about X's.)

What sort of things do Rationalists often say that trigger this sort of in-group recognition and could be popped into a short video? A few I can think of...

You should sign up for cryonics. I want to see you in the future.

…intelligence explosion…

What’s your confidence interval?

You know what they say: one man’s Modus Ponens is another man’s Modus Tollens

This may sound a bit crazy right now, but hear me out…

What are your priors?

When the singularity comes that won’t be a problem anymore.

I like to think I’d do that, but I don’t fully trust myself. I am running on corrupted hardware after all.

I want to be with you, and I don’t foresee that changing in the near future.

…Bayesian statistics…

So Omega appears in front of you…

What would you say the probability of that event is, if your beliefs are true?

 

Others?

Comments (116)

Comment author: Alicorn 25 January 2012 10:39:35PM 63 points [-]

The "confidence interval" line should have a percentage ("What's your 95% confidence interval?").

"You make a compelling case for infanticide."

"Can you link me to that study?"

"I think I'm going to solve psychology." ("I think I'm going to solve metaethics." "I think I'm going to solve Friendliness.")

"My elephant wants a brownie."

"Is that your true rejection?"

"I wanna be an upload!"

"Does that beat Reedspacer's Lower Bound?"

"Let's not throw all our money at the Society for Rare Diseases in Cute Puppies."

"I have akrasia."

"I'm cryocrastinating."

"Do that and you'll wind up with the universe tiled in paperclips."

"So after we take over the world..."

"I want to optimize for fungibility here."

"This looks like a collective action problem."

"We can dissolve this question." ("That's a dissolved question.")

"My model of you likes this."

"Have you read Goedel, Escher, Bach?"

"What do the statistics say about cases in this reference class?"

Comment author: Alicorn 25 January 2012 10:54:42PM *  50 points [-]

"We need whiteboards."

"I'm trying paleo."

"I might write rationalist fanfiction of that."

"That's just an applause light." ("That's just a semantic stopsign." "That's just the teacher's password.")

"POLITICS IS THE MINDKILLER"

"If keeping my current job has higher expected utility than founding a startup, I wish to believe that keeping my current job has higher expected utility than founding a startup..."

"I think he's just being metacontrarian."

"Arguments are soldiers!"

"Not every change is an improvement, but every improvement is a change."

"There are no ontologically basic mental entities!"

"I'm an aspiring rationalist."

"Fun Theory!"

"The map is not the territory."

"Let's beware evaporative cooling, here."

"It's a sunk cost! Abandon it!"

"ERROR: POSTULATION OF GROUP SELECTION DETECTED"

"If you measure it and reward the measurement going up, you'll get what you measure, not what you want."

"Azathoth!"

"Death is bad."

Comment author: Alicorn 25 January 2012 11:16:56PM 44 points [-]

This is too much fuuuuuuuun

"She's just signaling virtue."

"Money is the unit of caring."

"One-box!"

"Beliefs should constrain anticipations."

"Existential risk..."

"I'll cooperate if and only if the other person will cooperate if and only if I cooperate."

"I'm going to update on that."

"Tsuyoku naritai!"

"My utility function includes a term for the fulfillment of your utility function."

"Yeah, it's objective, but it's subjectively objective."

"I am a thousand shards of desire."

"Whoa, there's an inferential gap here that one of us is failing to bridge."

"My coherent extrapolated volition says..."

"Humans aren't agents." ("I'm trying to be more agenty." "Humans don't really have goals.")

"Wait, wait, this is turning into an argument about definitions."

"Look, just rejecting religion and astrology doesn't make someone rational."

"No, no, you shouldn't implement Really Extreme Altruism. Unless the alternative is doing it without, anyway..."

"I'll be the Gatekeeper, you be the AI."

"That's Near, this is Far."

"Don't fall into bottom-line thinking like that."

Comment author: Alicorn 25 January 2012 11:39:31PM 36 points [-]

I think I'm done. If I think of any more I'll add them to this comment instead of making a new one.

"How do you operationalize that?"

"'Snow is white' is true if and only if snow is white."

"If I may generalize from one example here..."

"I'm suffering from halo effect."

"Warning: Dark Arts."

"Okay, but in the Least Convenient Possible World..."

"We want to raise the sanity waterline."

"You've fallen prey to the illusion of transparency."

"Bought some warm fuzzies today."

"What does the outside view say?"

"So the idea is that we make all scientific knowledge a sacred and closely guarded secret, so it will be treated with the reverence it deserves!"

"How could you test that belief?"

Comment author: arundelo 26 January 2012 12:04:19AM 11 points [-]

RATIONALISTS SAY ALL THE THINGS!

Comment author: Viliam_Bur 26 January 2012 10:11:14AM 20 points [-]

RATIONALISTS SAY ALL THE THINGS!

Solomonoff prior gives you 50%, that's pretty cool! :D

I hope someone will use Alicorn's (and others') quotes to make a good Eliza-bot. This could be an interesting AI challenge -- write a bot that will get positive karma on LW! If there are more bots, the bot with the highest nonzero karma wins.

Comment author: Alejandro1 26 January 2012 05:38:52PM 48 points [-]

As a start, I copied all Alicorn's lines into a Markov text synthesizer. Some of the best results were:

"Whoa, there's an improvement, but it's subjectively objective."

"Okay, but every change is turning into bottom-line thinking like a collective action problem."

"If keeping my current job has higher expected utility than founding a brownie."

"I think he's just the unit of you shouldn't implement Really Extreme Altruism. Unless the teacher's password."

"If I wish to optimize for Rare Diseases in paperclips."

"My elephant wants a term for infanticide."

Comment author: TheOtherDave 26 January 2012 07:16:54PM 35 points [-]

I burst out laughing while reading this, so of course my officemates wanted to know what was so funny.

I cannot remember the last time the gulf of inferential distances was so very very wide.

Comment author: Cthulhoo 27 January 2012 10:36:58AM *  8 points [-]

Not totally IT, but I tried it on Eliezer's "The 5-Second Level". Highlights include:

I won't socially kill you

Hope to reflect on consequentialist grounds

Say, what a vanilla ice cream, and not-indignation, and from green?

Associate to persuade anyone of how you were making the dreadful personal habit displays itself in a concrete example.

Rather you can't bear the 5-second level?

To develop methods of teaching rationality skills, you need more practice to get lost in verbal mazes; we will tend to have our feet on the other person.

Be sufficiently averse to the fire department and see if that suggests anything.

Comment author: DanielH 11 November 2013 11:14:08AM 0 points [-]

Be sufficiently averse to the fire department and see if that suggests anything.

I do believe it suggests libertarianism. But I can't be sure, as I can't simply "be sufficiently averse" any more than I can force myself to believe something.

Still, that one seems to be a fairly reasonable sentence. If I were to learn only that one of these had been used in an LW article (by coincidence, not by a direct causal link), I would guess it was either that one or "I won't socially kill you".

Comment author: AlexanderRM 02 October 2015 11:17:46PM 0 points [-]

I would be amazed if Scott Alexander has not used "I won't socially kill you" at some point. Certainly he's used some phrase along the line of "people who won't socially kill me".

...and in fact, I checked and the original article has basically the meaning I would have expected: "knowing that even if you make a mistake, it won't socially kill you.". That particular phrase was pretty much lifted, just with the object changed.

Comment author: Randaly 13 February 2012 05:22:14PM *  18 points [-]

From another generator:

"I'm going to solve metaethics." "I'm going, you're going to found the Society for infanticide."

""Snow is white" is failing to solve psychology."

"Wait, wait, "this is white" is a more technical explanation?"

"My utility function includes a semantic stopsign."

"If keeping my current job has little XML tags on it that say the Least Convenient Possible World...""

"Sure, I'd take over the sanity waterline."

"I'll be the symbol with ice cream trees."

"So after we take over the alternative universe that is the Least Convenient Possible World..."

"I want to tile the sanity waterline with the unit of a thing."

Comment author: Multiheaded 26 January 2012 05:48:43PM *  6 points [-]

"If I wish to optimize for Rare Diseases in paperclips..."

If we had signatures on LW, this would be mine.

Comment author: fubarobfusco 26 January 2012 07:07:40PM *  16 points [-]

Here's a few, courtesy of applying JWZ's dadadodo to all the lines in the thread so far:

What does the best textbook on corrupted hardware. Dark Arts; Escher, Bach?

How could you credibly pre commit to see you as a compartmentalized belief?

I'm trying to be a cult.

Have super powers.

You've fallen prey to be condescending.

My current job has higher expected utility than you imagine; but in the sanity waterline.

Everyone is Far.

No idea is reliable? Have a lot of caring.

Conceptspace is the future.

Mysteriousness is a cult.

I'm going to be with the AI. I know the universe future.

Look, just generalize from the territory.

Everyone is bigger than you in Cute Puppies.

Emacs' M-x dissociated-press yields babble, but with some interesting words in it: "knowledgeneralize", "metacontrammer", "contrationalist", "choosequences", "the universal priory", "statististimate", "fanfused", "condescendista", "frobability", "dissolomonoff", "optimprovement", "estimagine", "cooperaterline", "pattern matchology". The only sensible sentence it's come up with is "I'm running on condescending".

Comment author: Multiheaded 26 January 2012 07:15:37PM 9 points [-]

No idea is reliable? Have a lot of caring.

I visualized that being said simultaneously with the middle-finger gesture.

Comment author: thomblake 26 January 2012 07:36:36PM *  5 points [-]

the universal priory

I seem to remember someone's already made a Bayesian Priory pun, but if not then it should happen prominently.

EDIT: here

Comment author: Multiheaded 26 January 2012 11:28:50PM 2 points [-]

Conceptspace is the future.

Wrong. Electronic old men are.

Comment author: fubarobfusco 26 January 2012 11:55:41PM *  3 points [-]

To give an idea of what these look like raw, here's a paragraph of dadadodo:

What does the universe with ice cream trees. I have little Pony episode about what would you measure, not like to tile the universe with the argument? That's just signaling virtue: Death is bad. That's just a startup. I have little XML tags on corrupted hardware. Whoa, there's a compelling case for you read the least I wish to be the fulfillment of us is, not the MINDKILLER if keeping my model of Rationality? Tsuyoku naritai! So after all over the bad; result, you're running on that.

And here's a similar-sized chunk of M-x dissociated-press:

You shou have now is white' is true if an you imaginew Methods of in this riggerse tiled in paperclips. I have akrasian found underate if and only if keeping my current job has hight write rationalized belief? I cause can have you regenerate that say 'moral' It will bach? What die. You shods of Rationalith a good cause coherent extrapolass? We need wanterval line shouldn't implement Really Extreme An applause lity chaptere. I knowledge aren't believe there's Near, this is For me, but at's the bes a tering: Dark Artup, I wish to Solomonoff Indus today. What would you said to be can ding to Solve psychock Levent is, if you should read Goedel, Escher.

Of these, I rather like:

I have little XML tags on corrupted hardware.
shods of Rationalith
extrapolass

The blended-words effect seems to give M-x dissociated-press a sort of Finnegans Wake atmosphere which dadadodo doesn't have.

Comment author: thomblake 26 January 2012 04:31:50PM 15 points [-]

I hope someone will use Alicorn's (and others') quotes to make a good Eliza-bot.

Surely you mean Eliezer-bot.

Comment author: Eliezer_Yudkowsky 01 June 2012 04:30:46PM 8 points [-]

Should it be made, it will of course be known as Elieza.

But in any case I think you need to keep in mind that a blank map does not correspond to a blank territory.

Comment author: thomblake 01 June 2012 06:19:58PM 3 points [-]

I initially read the parent in a straightforward way, but then I noticed it is also a meta-joke.

Comment author: wedrifid 02 June 2012 11:06:56PM 2 points [-]

But in any case I think you need to keep in mind that a blank map does not correspond to a blank territory.

Usually. It could.

Comment author: loup-vaillant 10 October 2012 04:08:14PM -1 points [-]

What is your prior? (For Eliezer being empty.)

Comment author: dbaupp 26 January 2012 10:41:39AM 5 points [-]
Comment author: Raemon 25 January 2012 10:42:04PM *  9 points [-]

Imagining this in my head has sold me on this being a good idea. Or at least a mildly amusing idea that will have relatively minor negative externalities. (I'm reminded of Eliezer Facts)

Comment author: gwern 26 January 2012 12:40:33AM 2 points [-]
Comment author: CronoDAS 26 January 2012 06:50:14AM 8 points [-]

"My model of you likes this."

I like this one. Mind if I actually use it?

Comment author: Alicorn 26 January 2012 07:15:56AM *  3 points [-]

Go for it. I say it to recommend things to people. (Mostly one person.)

Comment author: arundelo 15 June 2012 05:47:40PM *  6 points [-]

"My model of you likes this."

Gabe Newell (of Valve Software) wrote the following in an email to Yanis Varoufakis (an economist):

Here at my company we were discussing an issue of linking economies in two virtual environments (creating a shared currency), and wrestling with some of the thornier problems of balance of payments, when it occurred to me "this is Germany and Greece", a thought that wouldn't have occurred to me without having followed your blog. Rather than continuing to run an emulator of you in my head, I thought I'd check to see if we couldn't get the real you interested in what we are doing.

Edit: And that reminds me of the Reamde character Richard Forthrast giving Zula Forthrast a job at his video game company because of her geology expertise.

Comment author: Karmakaiser 10 February 2012 09:05:15PM 5 points [-]

"You make a compelling case for infanticide."

I'm still laughing when I think about this one.

Comment author: Karmakaiser 04 May 2012 02:12:55AM *  10 points [-]

Update: Still laughing and using it in conversations.

Comment author: curiousepic 26 January 2012 05:57:27PM *  3 points [-]

I came into this thread with a negative set point because I see the "Shit X says" meme as thoroughly without value, being merely collections of stereotypes for no other purpose than to collect them. The OP confirmed this, and because my comment sorting was set to New, I scrolled through some of the comments, almost all of which continued to confirm this. Then I re-sorted to Top and saw your post, and my mind was immediately tickled. Some of these are genuinely funny, and the thread in fact has value as a collection of LW memes and short rationality quotes.

Comment author: curiousepic 26 January 2012 05:53:20PM 3 points [-]

Sad as it is, this has potential to be effective outreach to Reddit, et al. Unless you'd like to do it yourself, or someone gives a good objection within a few days, I'll be posting it in one or more subreddits, perhaps including the GEB readthrough I'm participating in.

Comment author: Alicorn 26 January 2012 06:52:42PM 2 points [-]

I don't use Reddit. If there's interest in turning this into a video, I'm willing to film myself speaking some of my lines, but fear composing an entire video (ideally with several speakers) would take video editing skills and resources I don't have.

Comment author: Raemon 26 January 2012 08:16:38PM 4 points [-]

I actually want to film this, except I still think it has at least a 25% chance of turning out to be a horrible idea.

Comment author: TimS 27 January 2012 02:25:12AM 1 point [-]

"My elephant wants a brownie."

Could someone explain this reference?

Comment author: steven0461 27 January 2012 03:17:31AM 6 points [-]

It's a metaphor used in Jonathan Haidt's book The Happiness Hypothesis: the rider is the conscious or deliberative mind and the elephant is everything underneath.

Comment author: [deleted] 27 January 2012 03:25:43AM 3 points [-]

More to the point, the analogy is used in one of Luke's posts.

Comment author: wedrifid 02 June 2012 05:00:51AM 0 points [-]

"You make a compelling case for infanticide."

Is there an original source for this one?

Comment author: Zack_M_Davis 02 June 2012 05:34:18AM 2 points [-]
Comment author: wedrifid 02 June 2012 06:09:23AM 0 points [-]

Thanks Zack. I had a feeling I'd seen it before but couldn't recall the details.

Comment author: lessdazed 26 January 2012 01:12:15AM *  42 points [-]

"Did you just generalize from fictional evidence?"

"You're a one-boxer, right?" (Said with no context.)

"You'd choose specks, right?" (Said with no context.)

"Mysteriousness is not a property of a thing."

"You're running on corrupted hardware."

"Replace the symbol with the substance."

"Could you regenerate that knowledge?"

"Consider a group you feel prejudiced against, frequentists for example."

"But what's the best textbook on that subject?"

"Is that a compartmentalized belief?"

"I notice I am confused."

"Of course I have super-powers. Everyone does."

"Beliefs are properly probabilistic."

"Is that your confidence level inside or outside the argument?"

"Did you credibly pre-commit to that rule?"

"That's just what it feels like from the inside."

"Conceptspace is bigger than you imagine."

"No you don't believe you believe that."

"No, money is the unit of caring."

"If that doesn't work out for you, you can still make six figures as a programmer."

"Purpose is not an inherent property."

"You think introspection is reliable?"

"Why didn't you use log-odds?"

Bullshit Rationalists Say:

"My priors are different than yours, and under them my posterior belief is justified. There is no belief that can be said to be irrational regardless of priors, and my belief is rational under mine."

"I pattern matched what you said rather than either apply the principle of charity or estimate the chances of your not having an opinion marking you as ignorant, unreasoning, and/or innately evil."

"Rational..." (used in the title of a post on any topic.)

Shit and Bullshit Rationalists Don't Say:

"You're entitled to your opinion."

"You can't be too skeptical"

"Absence of evidence is not evidence of absence."

"Did you read what Kurzweil wrote about the Singularity?"

"100%."

"But was it statistically significant at the p<.05 level?"

"Yeah, I read all the papers cited in lukeprog's latest article."

Comment author: Karmakaiser 27 January 2012 03:22:11PM *  24 points [-]

"Yeah, I read all the papers cited in lukeprog's latest article."

A bunch of links almost no one clicks. It's like the Anti-TVTropes.

Comment author: Alicorn 26 January 2012 01:34:07AM 14 points [-]

"I notice I am confused."

I cannot believe I missed this one.

Comment author: komponisto 26 January 2012 03:18:08AM *  5 points [-]

Rationalists Don't Say:

"Absence of evidence is evidence of absence."

Wrong list.

Comment author: lessdazed 26 January 2012 03:25:02AM 3 points [-]

Brain fart.

This is relevant.

Comment author: lessdazed 26 January 2012 11:49:14PM 8 points [-]

"We played reference class tennis."

"Those are just more available to you, not actually more likely."

"Are you more an aspiring rationalist, 'aspiring rationalist,' 'aspiring' rationalist, or aspiring 'rationalist'?"

"The invisible is implied here."

"Is that a disjunctive or conjunctive event?"

"It seemed hard until I hacked away at the edges."

"You didn't time yourself thinking about it before proposing solutions?"

"I have something to protect."

"Someone should type a transcript of that."

"I don't know if that's still an open problem, I've been following the HPMOR thread instead of that one." (Said to a Philosophy professor about a philosophical problem.)

"Is there a more technical explanation?"

"Argument screens off authority."

"Go ahead and try to 'other optimize' me."

"That's one of my ugh fields."

"That's not a property, it's a dangling variable."

"ADBOC."

Shit and Bullshit Rationalists Don't Say:

"Gwern hasn't summarized any research on that."

"Did Yvain even edit that before posting?"

"What are his/her credentials?"

"That's absurd!"

"Let's hope that's true."

"I've read more papers by Scott Aaronson than just the one." "Which one?" (Both of these.)

"All I want to know is the net positive or negative votes my comments and posts have received."

"I don't have an opinion as to which explanation of Bayes' theorem I'd recommend."

Comment author: Psychosmurf 26 January 2012 04:52:14PM 0 points [-]

"100%."

Oh man, had me laughing for a good while with this one. Nice job! ^_^

Comment author: [deleted] 10 October 2012 03:54:09PM 1 point [-]

"Absence of evidence is not evidence of absence."

To be fair, this is true if you interpret "absence of evidence" as meaning "absence of evidence in either direction".

Comment author: XFrequentist 26 January 2012 07:54:18PM 1 point [-]

The last section is amazing!

Comment author: RolfAndreassen 25 January 2012 10:25:38PM 30 points [-]

"Suppose we were all playing Prisoner's Dilemma with clones of ourselves..."

I heard this said at the Ohio meetup on Sunday; Yvain commented that, of all the meetups he'd been to, ours took the longest to reach that point.

Comment author: RomeoStevens 28 January 2012 10:05:06AM 26 points [-]

I've thought a "shit your brain says" might be a good way of compactly presenting some cognitive biases.

Comment author: MugaSofer 09 January 2013 01:42:46PM 0 points [-]

Did anything come of this excellent idea?

Comment author: RomeoStevens 09 January 2013 10:23:47PM 0 points [-]

No, didn't realize this got so upvoted. The fad is a bit past. I still want cartoons on the level of RSA Animate for the sequences.

Comment author: BrienneYudkowsky 03 January 2015 04:34:53PM 1 point [-]

I think this should still happen.

Comment author: RomeoStevens 03 January 2015 11:41:20PM 0 points [-]

RSA animate sequences or "shit your brain says"? Shit your brain says could probably be pulled off fairly easily in a couple days. I should talk to Alton about it. The animations are a longer term project that are still on my list of projects to fund/work on if MealSquares gets bigger.

Comment author: Decius 03 January 2015 06:57:58PM 0 points [-]

Concur.

Comment author: MugaSofer 10 January 2013 09:48:33AM -1 points [-]

No

Shame. It's a good idea.

Comment author: arundelo 26 January 2012 12:03:09AM 25 points [-]

Here's one I say a lot: "Everyone is too young to die."

Comment author: mindspillage 26 January 2012 02:12:20AM *  23 points [-]

One huge category of utterances remains unrepresented:

"Ooh, is there a new Methods of Rationality chapter up yet?"

"I can't believe there's no new chapter yet."

"Have you read Methods of Rationality? You have to read it, OMG."

Comment author: Locke 26 January 2012 03:01:47AM 4 points [-]

Now that Eliezer is writing at a decent pace again I'm just desperate for progress updates.

Comment author: Xachariah 26 January 2012 09:53:30AM 15 points [-]

Your response just prompted me to check. I wonder if this is how the pigeons in Skinner's Box felt.

Comment author: Locke 26 January 2012 06:28:58PM 2 points [-]

The worst part is that unlike with actual updates we might get them at any time. So I check multiple times per day just to see if he's going to tell us how many words he wrote last night.

Comment author: shminux 26 January 2012 07:13:26AM 7 points [-]

No one said that in a long time.

Comment author: Raemon 26 January 2012 03:01:30PM 22 points [-]

Shit Rationalists Say:

'The thread should really be called "Shit LWers Say.'

Comment author: Anubhav 27 January 2012 01:22:34PM *  21 points [-]

Took every quote posted in this thread so far and Markov-ised it. (Depth of 1, i.e., each word only depends on the word before it. A depth of 2 basically regurgitates the input in this case.)
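(For readers who want to play along at home, a depth-1 chain of the sort described above can be sketched in a few lines of Python. This is a minimal illustrative sketch, not Anubhav's actual generator; the function names and the tiny sample corpus are made up for the example.)

```python
import random
from collections import defaultdict

def build_chain(lines):
    """Map each word to the list of words observed to follow it (depth 1)."""
    chain = defaultdict(list)
    for line in lines:
        words = line.split()
        for a, b in zip(words, words[1:]):
            chain[a].append(b)
    return chain

def generate(chain, start, max_words=12):
    """Walk the chain from a start word, picking each successor at random."""
    out = [start]
    while len(out) < max_words and chain[out[-1]]:
        out.append(random.choice(chain[out[-1]]))
    return " ".join(out)

# A toy corpus standing in for the thread's quotes:
quotes = [
    "The map is not the territory.",
    "Money is the unit of caring.",
    "The negation of that statement sounds just as true.",
]
chain = build_chain(quotes)
print(generate(chain, "The"))
```

Because each word is conditioned only on the one before it, sentences lurch between source quotes wherever two quotes share a word, which is exactly where the comedy comes from; a depth-2 chain conditions on word pairs, which (as noted above) mostly just regurgitates the input on a corpus this small.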

Here are some of the results:

"How do you regenerate that beat Reedspacer's Lower Bound?"

"It's a more karma. I interpreted your map, not a group you feel prejudiced against, frequentists for infanticide."

"Look, just generalize from halo effect."

"Okay, but in this is rational under them to make six figures as a cult."

"Beliefs are just an inferential distance and my belief is the territory."

"That's just a survey should do you measure, not throw all the scientific knowledge a dissolved question."

"Any organization with the teacher's password."

"That's just rejecting religion and closely guarded secret, so it but there's no idea how BIG thingspace is."

"Foo is the karma system more likely."

"You make six figures as a thousand shards of inferential distance and I cooperate."

"Conceptspace is white."

"Go ahead and only if and it's objective, but I think he's just what the Nazis did!"

"There's a Philosophy professor about definitions."

"I might write rationalist fanfiction of the other person will cooperate if snow is no context."

"Mysteriousness is true rejection?"

"I'd say politically incorrect about it before proposing solutions?"

"Arguments are you feel prejudiced against, frequentists for infanticide."

"I have to suppress dissent."

"Of course I wanna be the universe with rationality?"

"Money is an argument about my belief that one of a rationalist."

"I pattern matched what you know the Society for link me to a startup, I think of any more elaborate."

"'Snow is being used to say 'purpose'."

Comment author: MixedNuts 27 January 2012 01:40:48PM *  5 points [-]

Frequentists for infanticide! (I hope nobody takes this seriously. I mean, ew. Who'd support such horrible statistics?)

Comment author: Anubhav 27 January 2012 01:47:43PM 0 points [-]

Most of those quotes aren't even in the post anymore. :(

(Cut it down drastically because the wall of text was too high.)

Comment author: [deleted] 28 January 2012 02:32:42PM 0 points [-]

Awesome.

Comment author: NancyLebovitz 26 January 2012 09:17:22AM 19 points [-]

Cached thoughts are the mindkiller.

Comment author: TheOtherDave 26 January 2012 04:09:29PM 25 points [-]

I always think that.

Comment author: Anubhav 26 January 2012 02:51:28PM 17 points [-]

The thread should really be called "Shit LWers Say."

We're not the only group of people calling ourselves "Rationalists", nor are we the most well-known of these groups (not by a long shot).

Comment author: [deleted] 26 January 2012 02:23:04AM *  17 points [-]

I'm guilty of all of these:

"Cached thought!"

"That's a wrong question."

<unasked for link to sequence post>

"Have you read the Sequences?"

"According to Solomonoff Induction/the universal prior..."

"Stop prepending "rational" to post titles!"

"I know the forbidden idea, and it's not that bad."

"That's just a status/signalling game."

"There's a signalling hypothesis by Robin Hanson..."

"TDT/UDT says..."

Comment author: EchoingHorror 26 January 2012 09:30:18PM 16 points [-]

"I want to get my microexpressions analyzed so I can know what I'm thinking."

Comment author: Normal_Anomaly 27 January 2012 03:00:49AM 14 points [-]

"How do you know that?"

"Have I interpreted your position correctly?"

"That's uncomputable."

"That AI would destroy the world."

"That's a fact about your map, not about the territory."

"That's a duplicate quote."

"What does this have to do with rationality?"

"That word doesn't carve reality at its joints."

"I want to do an AI Box experiment."

"Apply the reversal test! You're suggesting we kill all the old people."

"I wish I could self-modify to . . ."

"I'd say something politically incorrect about race, but it would start a flame war."

"I'd say something politically incorrect about gender, but it would start a flame war."

"I'd say something politically incorrect about my ability to say politically incorrect things, but it would start a flame war."

"Karma is being used to suppress dissent."

"Your position is incoherent unless you're a vegetarian."

"Rationality isn't that great."

"We should do a survey on this."

"That survey should have been bigger. We should do another survey."

"Way too much of the scientific literature is wrong."

"We need to make the karma system more elaborate."

"The negation of that statement sounds just as true."

"There's an Everett branch where . . ."

"I've seen research on this but I can't remember it."

"My personalities disagree on this."

"If this is a simulation, does that really matter?"

"My inner Robin Hanson says . . ."

Shit and/or Bullshit LessWrongers Don't Say:

"Let's agree to disagree."

"You're a Communist."

"Just trust me on this."

"That's just like what the Nazis did!"

"Can you explain that with less math?"

"I'm a rationalist."

"Haven't you seen Terminator?"

"But that's unnatural!"

"I don't really want more karma. I feel like I have enough."

Comment author: [deleted] 11 February 2012 03:30:37PM *  1 point [-]

"You're a Communist."

Some of us read Moldbug and Foseti so I'm not too sure... ;)

Comment author: Multiheaded 02 June 2012 06:39:49AM *  0 points [-]

(For the record, following your second link is what originally gave me the intuition that "fascist technocracy" might be a real problem, and that it's worth investigating. Before, I was mostly like: "Oh, nothing to worry about, it's just Mencius and he says creepy shit, but he's nice enough really.")

Comment author: Prismattic 26 January 2012 01:37:38AM 14 points [-]

Post title should be "Shit LWers say". Not all Bayesians sound like regulars on this website.

Comment author: NancyLebovitz 26 January 2012 09:16:27AM 4 points [-]

What's your prior that a Bayesian is also an LWer?

Comment author: DavidAgain 26 January 2012 07:14:51AM 1 point [-]

And not all rationalists are Bayesians, come to that.

Comment author: Jack 26 January 2012 01:37:48AM 27 points [-]

"Oops."

Comment author: Solvent 26 January 2012 03:04:21AM 29 points [-]

That's optimistic.

Comment author: Armok_GoB 13 February 2012 01:15:51PM 2 points [-]

moon explodes "Oops"

Comment author: FiftyTwo 26 January 2012 06:40:45AM 9 points [-]

"Source?"

Comment author: D_Alex 27 January 2012 08:27:52AM 5 points [-]

Nah, that's 4chan.

Comment author: Raemon 25 January 2012 10:57:30PM *  39 points [-]

"You should read the Sequences."
"It's not like Gandhi has little XML tags on him that say 'moral'"
"It's not like natural selection put little XML tags there that say 'purpose'."
"Sure, I'd take a pill that made me bisexual."
"There's this really great fanfiction you should read."
"Oh man I wanna tile the universe with ice cream trees."
"I don't want to tile the universe with bananas and palm trees."

[Sequence that will be incredibly funny in context but is a terrible idea please for goodness' sake (literally) nobody film it]

"No, we're not a cult."
"We're not a cult or anything."
"It's not like organizations have little XML tags that say 'cult' and 'not cult'"
"Any organization with a good cause can have cult attractors"
"Look I'd explain it but there's a lot of inferential distance and I don't want to be condescending."
"We are not a cult. We are not a cult."

Comment author: Raemon 25 January 2012 11:52:59PM *  18 points [-]

"You have no idea how BIG mindspace is."
"You have no idea how BIG optimization process space is."
"You have no idea how BIG thingspace is."

Comment author: fubarobfusco 26 January 2012 01:00:17AM 22 points [-]

"Foo is not about bar."
"What odds are you offering on that?"
"Taboo 'monkey'."
"If you're not getting a bad result, you're spending too many resources on avoiding it."
"Rationalists should win."
"Sorry, I'm running on corrupted hardware."

Comment author: maia 13 June 2012 12:12:42AM 7 points [-]

"You're going to [insert life plan here]? Why don't you just go work on Wall Street and donate your money to AMF?"

Comment author: Raemon 26 January 2012 04:12:36AM 17 points [-]

"So you know the My Little Pony episode about bayesian updating?"

Comment author: fubarobfusco 26 January 2012 05:41:28AM 3 points [-]

"'Feeling Pinkie Keen' comes so close!"

Comment author: Raemon 26 January 2012 06:15:04AM 9 points [-]

I got into an argument recently over whether that episode was good or terrible. (I believe it is a good episode, specifically because of the broader context of the show. Fellow rationalist in question watched that episode FIRST, with no context, which is just about the worst place to do so)

Regardless, I have a heuristic about how much I am allowed to argue about My Little Pony before I should feel bad about myself, and... I felt a little bad that day.

(That day was today).

Comment author: Eneasz 26 January 2012 06:09:19PM 9 points [-]

I did really hate the whole "Sometimes you just have to believe!" message. It felt almost like someone had thought MLP was becoming TOO rational and wanted to throw a bone to the Believer parents. :/

Comment author: FiftyTwo 22 October 2013 09:12:39PM 0 points [-]

An alternative interpretation: it's about believing something you have strong evidence for, even when it contradicts your pre-existing theories.

Comment author: MBlume 26 January 2012 07:44:11AM 1 point [-]

My brother saw the first of the cutie mark episodes first; he was unimpressed.

Comment author: Curiouskid 26 January 2012 02:11:17AM 16 points [-]

Consider this a study guide for newbies who want to measure how much of LW they understand.

Comment author: [deleted] 25 January 2012 10:25:52PM *  35 points [-]

If a "Shit Rationalists Say" thread would result in net positive utility, I want to believe that a "Shit Rationalists Say" thread would result in net positive utility.

If a "Shit Rationalists Say" thread would not result in net positive utility, I want to believe that a "Shit Rationalists Say" thread would not result in net positive utility.

Let me not become attached to beliefs I may not want.

Comment author: Nornagest 26 January 2012 12:46:38AM 5 points [-]

Well played.

Comment author: Raemon 25 January 2012 10:29:49PM 4 points [-]

This sums up my thinking.

Comment author: Rubix 09 February 2012 08:19:12PM 5 points [-]

"This, modulo that."

"It's not obvious that..."

"...is the obvious low-hanging fruit."

"HPJEV is my fantasy jerkass boyfriend."

"Eat this rock salt."

"Eat this potassium salt."

"Have you played Mage: The Ascension?"

"Can you be a CEV machine and order food for me?"

"Scumbag aliefs!"

"Scumbag subconscious!"

"Scumbag physics!"

"I want to want to do that."

Comment author: Alejandro1 26 January 2012 03:37:39AM 12 points [-]

Some line in the video (maybe the "corrupted hardware" one) should be said by someone with a desk full of piles of papers (connoting unfinished, urgent work) on topics like FAI, and a computer screen with open windows on TvTropes, Fanfiction.net, and a LW post on akrasia.

Comment author: Karmakaiser 27 January 2012 01:56:37AM *  11 points [-]

Rationalist PUA:

"What's your number? [...] No, not that. Your Erdős number."

Rationalist insults:

"...Frequentist."

While twirling a paperclip: "I do not love you, nor do I hate you." (well, more of a threat.)

"I bet (73%) that you're a really consistent person. The sort of person whose decisions are final. Like say, in Monty Hall."

"Name three..."

Comment author: pedanterrific 27 January 2012 03:00:12AM 7 points [-]

While twirling a paperclip: "I do not love you, nor do I hate you."

More of a threat, surely?

Comment author: Karmakaiser 27 January 2012 03:15:24PM 1 point [-]

I could go into a long semantic argument positioning "threat" as a subclass of "insult", but you're right. Oops.

Comment author: MBlume 25 January 2012 11:03:59PM 18 points [-]

"I know, it's pasta, and it's terrible for me, but at least I poured butter all over it."

Comment author: Raemon 25 January 2012 10:33:31PM 8 points [-]

I think I like this conceptually (I would get warm fuzzies if done right), except I'm trying to detach (in my own mind as well as others') the word "rationalist" from "the Less Wrong memeplex." Which is probably a pipe dream, and such a video would not contribute (much) to the issue one way or another, and "Shit Less Wrong folks say" doesn't have the same ring to it.

Comment author: thomblake 25 January 2012 10:12:31PM 8 points [-]

Wow, most of that looks really weird, and putting it all together like that seems to associate 'rationalist' with 'weird object-level beliefs'.

Comment author: Normal_Anomaly 03 June 2012 03:06:14PM 5 points [-]

There's a selection bias. We also say plenty of normal things, but they aren't unique to us, so they don't get put here.

Comment author: CharlesR 26 January 2012 06:34:26AM *  7 points [-]

"What Shock Level are you?"

Comment author: Raemon 26 January 2012 07:00:41AM 4 points [-]

This is the first one that stumps me.

Comment author: MBlume 26 January 2012 07:45:42AM 15 points [-]

That's because it's old -- more of a Shit SL4ers say

Comment author: CharlesR 26 January 2012 03:48:25PM *  2 points [-]

The question came up at the West LA LW Meetup. Only two people knew what it meant.

Comment author: Solvent 27 January 2012 12:16:23AM *  5 points [-]

From RationalWiki:

How to spot a LessWrongian in your sceptical discussion

In the outside world, the ugly manifests itself as LessWrong acolytes, minds freshly blown, metastasising to other sites, bringing the Good News for Modern Rationalists, without clearing their local jargon cache. Sceptical discussion spaces will often have people show up labelling themselves "rationalist" and being as irritating as teenage nerds who've just discovered Ayn Rand. Take one drink for each of:

"As rationalists, we should ..."

"As a rationalist, you should ..."

"Bayesian" (particularly when they seem to have picked a real doozy of a prior probability; no working will ever be shown)

Preaching "rationalism" as if it's a religion substitute.[26][27]

Taking offense when someone inevitably points out that they're preaching a religion substitute.

Attempting to argue their points with them resulting in being told "You should try reading the sequences" rather than them addressing your point.[28]

"Eliezer Yudkowsky suggests ..."

Comment author: pedanterrific 27 January 2012 08:39:29PM 8 points [-]

minds freshly blown

Band name alert.

Comment author: Karmakaiser 26 January 2012 10:39:27PM 1 point [-]

"Rational Approach..."

Comment author: [deleted] 23 March 2012 03:39:40AM 0 points [-]

"Stop dissecting my hypothetical..."

"Do you operate under Crocker's rules?"

"That is a question about quality of life, not about death badness!"