EStokes comments on Should I believe what the SIAI claims? - Less Wrong

23 points | Post author: XiXiDu 12 August 2010 02:33PM


Comments (600)


Comment author: EStokes 12 August 2010 11:04:58PM 4 points [-]

At the very least, I don't think this post was well written. I didn't even understand the tl;dr:

tl;dr: Is the SIAI evidence-based, or merely following a certain philosophy? I'm currently unable to judge whether the Less Wrong community and the SIAI are updating on fictional evidence, or whether the propositions, i.e. the basis for the strong arguments for action that are proclaimed on this site, are based on fact.

I don't see much precise expansion on this, except for MWI? There's a sequence on it.

And that is my problem. Given my current educational background and knowledge, I cannot tell whether LW is merely a consistent internal logic, i.e. imagination or fiction, or something sufficiently based on empirical criticism to provide a firm substantiation of the strong arguments for action that are proclaimed on this site.

Have you read the sequences?

As for why there aren't more people supporting the SIAI: first, it's not widely known; second, it's liable to be dismissed on first impressions. Not many have examined the SIAI. Also, only [4% of the general public in the US believe in neither a god nor a higher power](http://en.wikipedia.org/wiki/Religion#cite_ref-49). The majority isn't always right.

I don't understand why this post has upvotes. It was unclear, and it seems its topics went unresearched. The usefulness of donating to the SIAI has been discussed before; I think someone probably would've posted a link if it had been asked about in the open thread.

Comment author: kodos96 13 August 2010 05:18:28AM 12 points [-]

I don't understand why this post has upvotes.

I think the obvious answer to this is that there are a significant number of people out there, even out there in the LW community, who share XiXiDu's doubts about some of SIAIs premises and conclusions, but perhaps don't speak up with their concerns either because a) they don't know quite how to put them into words, or b) they are afraid of being ridiculed/looked down on.

Unfortunately, the tone of a lot of the responses to this thread leads me to believe that those motivated by the latter concern may have been right to worry.

Comment author: Furcas 13 August 2010 05:23:20AM 7 points [-]

Personally, I upvoted the OP because I wanted to help motivate Eliezer to reply to it. I don't actually think it's any good.

Comment author: kodos96 13 August 2010 05:47:44AM *  11 points [-]

Yeah, I agree (no offense XiXiDu) that it probably could have been better written, cited more specific objections etc. But the core sentiment is one that I think a lot of people share, and so it's therefore an important discussion to have. That's why it's so disappointing that Eliezer seems to have responded with such an uncharacteristically thin skin, and basically resorted to calling people stupid (sorry, "low g-factor") if they have trouble swallowing certain parts of the SIAI position.

Comment author: HughRistik 13 August 2010 06:52:21PM 2 points [-]

This was exactly my impression, also.

Comment author: Wei_Dai 13 August 2010 08:51:30AM 5 points [-]

I think your upvote probably backfired, because (I'm guessing) Eliezer got frustrated that such a badly written post got upvoted so quickly (implying that his efforts to build a rationalist community were less successful than he had thought/hoped) and therefore responded with less patience than he otherwise might have.

Comment author: Eliezer_Yudkowsky 13 August 2010 07:30:48PM *  1 point [-]

Then you should have written your own version of it. Bad posts that get upvoted just annoy me on a visceral level and make me think that explaining things is hopeless, if LWers still think that bad posts deserve upvotes. People like XiXiDu are ones I've learned to classify as noisemakers who suck up lots of attention but who never actually change their minds enough to start pitching in, no matter how much you argue with them. My perceptual system claims to be able to classify pretty quickly whether someone is really trying or not, and I have no concrete reason to doubt it.

I guess next time I'll try to remember not to reply at all.

Everyone else, please stop upvoting posts that aren't good. If you're interested in the topic, write your own version of the question.

Comment author: orthonormal 13 August 2010 10:43:31PM *  8 points [-]

It has seemed to me for a while that a number of people will upvote any post that goes against the LW 'consensus' position on cryonics/Singularity/Friendliness, so long as it's not laughably badly written.

I don't think anything Eliezer can say will change that trend, for obvious reasons.

However, most of us could do better at downvoting badly argued or fatally flawed posts. It amazes me that many of the worst posts here won't drop below 0 for any length of time, and even then not very far. Docking someone's karma isn't going to kill them, folks. Do everyone a favor and use those downvotes.

Comment author: XiXiDu 14 August 2010 09:14:05AM 4 points [-]

...badly argued or fatally flawed posts.

My post is neither badly argued nor fatally flawed, as I've mainly been asking questions rather than making arguments. But if you think otherwise, why don't you show where I am fatally flawed?

My post was not written to speak out against any 'consensus'. I agree with the primary conclusions but am skeptical of the further chains of reasoning built on those conclusions, as I don't perceive them to rest on firm ground but merely to be what follows from previous evidence.

And yes, I'm a lazy bum. I didn't think about the OP for more than 10 minutes; it's actually copy-and-paste work from previous comments. Hell, what did you expect? A dissertation? Nobody else was asking those questions; someone had to.

Comment author: XiXiDu 13 August 2010 07:36:02PM 13 points [-]

What are you considering as pitching in? That I'm donating as I am, or that I am promoting you, LW and the SIAI all over the web, as I am doing?

You simply seem to take my post as a hostile attack rather than the inquiry of someone who happened not to be lucky enough to get a decent education in time.

Comment author: HughRistik 13 August 2010 07:45:55PM *  6 points [-]

Eliezer seems to have run your post through some crude heuristic and incorrectly categorized it. While you did make certain errors that many people have observed, I think you deserved a different response.

At least, Eliezer seemingly not realizing that you are a donor means that his treatment of you doesn't represent how he treats donors.

Edit: To his credit, Eliezer apologized and admitted to his perceptual misclassification.

Comment author: Eliezer_Yudkowsky 13 August 2010 07:45:20PM *  11 points [-]

All right, I'll note that my perceptual system misclassified you completely and consider that concrete reason to doubt it from now on.

Sorry.

If you are writing a post like that one it is really important to tell me that you are an SIAI donor. It gets a lot more consideration if I know that I'm dealing with "the sort of thing said by someone who actually helps" and not "the sort of thing said by someone who wants an excuse to stay on the sidelines, and who will just find another excuse after you reply to them", which is how my perceptual system classified that post.

The Summit is coming up and I've got lots of stuff to do right at this minute, but I'll top-comment my very quick attempt at pointing to information sources for replies.

Comment author: XiXiDu 13 August 2010 07:55:32PM 6 points [-]

I'll donate again in the next few days and tell you the name and the amount. I don't have much, but this way you'll see that I'm not just making this up. Maybe you can also check the previous donation then.

As for the promoting, anyone can Google it. I link people to your stuff almost every day. And there are people here who added me on Facebook; if you check my info, you'll see that some of my favorite quotations are actually yours.

And on my homepage, if you check the sidebar, your homepage and the SIAI have been listed under favorite sites for many years now.

I'm the kind of person who has to be skeptical about everything, and if I'm bothered too much by questions I cannot resolve in time, I do stupid things. Maybe this post was stupid, I don't know.

Comment author: Aleksei_Riikonen 14 August 2010 01:46:04AM 4 points [-]

Sorry that this will sound impolite towards XiXiDu, but I'll use this opportunity to note that it is a significant problem for the SIAI that there are people out there like XiXiDu promoting the SIAI even though they don't understand it much at all.

I don't know what the best attitude is for minimizing the problem this creates: many people will first run into the SIAI by hearing about it from people who don't seem very clueful or intelligent. (That's real Bayesian evidence for the SIAI being a cult or just crazy, and many people then won't acquire sufficient additional evidence to update out of the misleading first impression; not to mention that getting stuck in first impressions is a very common bias anyway.)

Personally, I've adopted the habit of not even trying to talk about singularity stuff to new people who aren't very bright. (Of course, if they become interested despite this, then they can't just be completely ignored.)

Comment author: XiXiDu 14 August 2010 09:02:28AM 6 points [-]

I thought about that too. But many people outside this community suspect me, as they often state, of being intelligent and educated. And I mainly try to talk to people in academia. You wouldn't believe that even I am able to make them think I'm one of them, up to the point of correcting errors in their calculations (it has happened). Many haven't even heard of Bayesian inference, by the way...

The way I introduce people to this is not by telling them about the risks of AGI but by linking them to specific articles on lesswrong.com, or by telling them about how the SIAI tries to develop ethical decision making, etc.

I grew up in a family of Jehovah's Witnesses; I know how to sell bullshit. Not that the SIAI is bullshit, but I'd never use words like 'Singularity' while promoting it to people I don't know.

Many people already know about the transhumanist/singularity faction and think it is complete nonsense, so I can often only improve their opinion.

There are people teaching at the university level who told me I convinced them that he (EY) is to be taken seriously.

Comment author: Aleksei_Riikonen 14 August 2010 02:56:41PM 1 point [-]

What you state is good evidence that you are not one of the too-stupid people I was talking about (even though you have managed not to understand very well what the SIAI is saying). Thanks for presenting the evidence, and for correcting my suspicion that someone at your level of non-comprehension would usually end up doing more harm than good.

Comment author: xamdam 13 August 2010 07:52:21PM *  9 points [-]

It was actually in the post

What I mean to say by using that idiom is that I cannot expect, given my current knowledge, to get the promised utility payoff that would justify making the SIAI a prime priority. That is, I'm donating to the SIAI but am also spending considerable resources maximizing utility at present.

So you might suggest to your perceptual system that it read the post first (at least before issuing a strong reply).

Comment author: Clippy 13 August 2010 07:55:37PM 5 points [-]

I also donated to SIAI, and it was almost all the USD I had at the time, so I hope posters here take my questions seriously. (I would donate even more if someone would just tell me how to make USD.)

Also, I don't like when this internet website is overloaded with noise posts that don't accomplish anything.

Comment author: thomblake 13 August 2010 07:59:22PM 9 points [-]

Clippy, you represent a concept that is often used to demonstrate what a true enemy of goodness in the universe would look like, and you've managed to accrue 890 karma. I think you've gotten a remarkably good reception so far.

Comment author: xamdam 13 August 2010 08:04:15PM 5 points [-]

I think we have different ideas of noise.

Though I would miss you as the LW mascot if you stopped adding this noise.

Comment author: CronoDAS 14 August 2010 09:55:16AM 3 points [-]

I would donate even more if someone would just tell me how to make USD.

Depending on your expertise and assets, this site might provide some ways.

Comment author: NancyLebovitz 14 August 2010 10:07:28AM 7 points [-]

I'm pretty sure Clippy meant "make" in a very literal sense.

Comment author: Clippy 14 August 2010 03:47:48PM 5 points [-]

Yeah, I want to know how to either produce notes that will be recognized as USD, or access the financial system in a way that lets me believably tell it that I own a certain amount of USD. The latter method could involve root access to financial institutions.

All the other methods of getting USD are disproportionately hard (_/

Comment author: Furcas 13 August 2010 10:25:52PM 1 point [-]

Then you should have written your own version of it.

I find it difficult to write stuff I don't believe.

Bad posts that get upvoted just annoy me on a visceral level and make me think that explaining things is hopeless, if LWers still think that bad posts deserve upvotes.

Noted.

Comment author: Interpolate 14 August 2010 03:58:50AM *  4 points [-]

I upvoted the original post for:

  • Stimulating critical discussion of the Less Wrong community - specifically the beliefs almost unanimously shared, and the negativity towards criticism - as someone who has found Less Wrong extremely helpful and would hate to see it descend into groupthink and affiliation signalling.

  • Stimulating critical discussion of the operating premises of the SIAI, as someone who is considering donating and otherwise contributing. This additionally provides elucidation to those in a state of epistemic limbo regarding the various aspects of FAI and the Singularity.

A question to those who dismiss the OP as merely "noise": what do you make of the nature of this post?

I am reminded of this passage regarding online communities (source):

So there's this very complicated moment of a group coming together, where enough individuals, for whatever reason, sort of agree that something worthwhile is happening, and the decision they make at that moment is: This is good and must be protected. And at that moment, even if it's subconscious, you start getting group effects. And the effects that we've seen come up over and over and over again in online communities...

The first is sex talk, what he called, in his mid-century prose, "A group met for pairing off." And what that means is, the group conceives of its purpose as the hosting of flirtatious or salacious talk or emotions passing between pairs of members...

The second basic pattern that Bion detailed: The identification and vilification of external enemies. This is a very common pattern. Anyone who was around the Open Source movement in the mid-Nineties could see this all the time...

The third pattern Bion identified: Religious veneration. The nomination and worship of a religious icon or a set of religious tenets. The religious pattern is, essentially, we have nominated something that's beyond critique. You can see this pattern on the Internet any day you like...

So these are human patterns that have shown up on the Internet, not because of the software, but because it's being used by humans. Bion has identified this possibility of groups sandbagging their sophisticated goals with these basic urges. And what he finally came to, in analyzing this tension, is that group structure is necessary. Robert's Rules of Order are necessary. Constitutions are necessary. Norms, rituals, laws, the whole list of ways that we say, out of the universe of possible behaviors, we're going to draw a relatively small circle around the acceptable ones.

He said the group structure is necessary to defend the group from itself. Group structure exists to keep a group on target, on track, on message, on charter, whatever. To keep a group focused on its own sophisticated goals and to keep a group from sliding into these basic patterns. Group structure defends the group from the action of its own members.

Comment author: Aleksei_Riikonen 14 August 2010 04:06:27AM *  1 point [-]

As someone who thought the OP was of poor quality, and who has had a very high opinion of the SIAI and EY for a long time (and still has), I'll say that the "Eliezer Yudkowsky facts" post was indeed a lot worse. It was the most embarrassing thing I've ever read on this site. Most of those jokes aren't even good.

Comment author: simplicio 14 August 2010 07:56:00AM 7 points [-]

They are very good examples of the genre (Chuck Norris-style jokes). I for one could not contain my levity.

Comment author: Liron 14 August 2010 11:49:53PM 6 points [-]

Fact: Evaluating humor about Eliezer Yudkowsky always results in an interplay between levels of meta-humor such that the analysis itself is funny precisely when the original joke isn't.

Comment author: XiXiDu 14 August 2010 09:26:20AM 5 points [-]

Wow, I thought it was one of the best. Through that post I actually got a philosopher (who teaches in Sweden), who had been skeptical of EY, to read the MWI sequence and afterwards agree that EY is right.

Comment author: ciphergoth 14 August 2010 07:57:50AM 4 points [-]

I like that post - of course, few of the jokes are funny, but you read such a thing for the few gems they do contain. I think of it as hanging a lampshade (warning, TV tropes) on one of the problems with this website.

Comment author: Eliezer_Yudkowsky 14 August 2010 07:28:28AM 7 points [-]

I was embarrassed by most of the facts. The one about my holding up a blank sheet of paper and saying "a blank map does not correspond to a blank territory" and thus creating the universe is one I still tell at parties.

Comment deleted 14 August 2010 09:34:47AM *  [-]
Comment author: Aleksei_Riikonen 14 August 2010 02:48:36PM 1 point [-]

What, why are you talking about a hostile attack?

Of course I didn't feel it was that. Quite the opposite: it felt to me like it communicated an unhealthy air of hero worship.

Comment author: XiXiDu 14 August 2010 03:38:48PM 3 points [-]

Then I have been the one to completely misinterpret what you said. My apologies; I'm not good at this.

I've said it before the OP but failed miserably:

I should quit now and stop participating on LW for some time. I have to continue with my studies. I was only drawn here by the deletion incident. Replies, and the fact that it is fun to argue, have made me babble too much over the past few days.

Back to being lurker. Thanks.

Comment author: Wei_Dai 14 August 2010 05:16:33AM 9 points [-]

"Eliezer Yudkowsky facts" is meant to be fun and entertainment. Do you agree that there is a large subjective component to what a person will think is fun, and that different people will be amused by different types of jokes? Obviously many people did find the post amusing (judging from its 47 votes), even if you didn't. If those jokes were not posted, then something of real value would have been lost.

The situation with XiXiDu's post is different because almost everyone seems to agree that it's bad, and those who voted it up did so only to "stimulate discussion". But if they hadn't voted up XiXiDu's post, it's quite likely that someone would eventually have written a better post asking similar questions and generating a higher-quality discussion, so the outcome would likely have been a net improvement. Or alternatively, those who wanted to "stimulate discussion" could have just looked in the LW archives and found all the discussion they could ever hope for.

Comment author: XiXiDu 14 August 2010 02:01:09PM *  1 point [-]

If almost everyone thought it was bad, I would expect it to have many more downvotes than upvotes, even given the few people who voted it up to "stimulate discussion". But you probably know more about statistics than I do, so never mind.

...it's quite likely that someone would eventually write up a better post asking similar questions.

Before or after the SIAI builds an FAI? I waited half a decade for any of those questions to be asked in the first place.

Or alternatively, those who wanted to "stimulate discussion" could have just looked in the LW archives and found all the discussion they could ever hope for.

Right, I hadn't thought of that! I'll be right back after reading a few thousand comments to find some transparency.

Comment author: Risto_Saarelma 14 August 2010 10:31:36AM 0 points [-]

Do you agree that there is a large subjective component to what a person will think is fun, and that different people will be amused by different types of jokes?

This is true. You might also be able to think of jokes that aren't worth making even though a group of people would find them genuinely funny.

I agree with Aleksei about the Facts article.

Comment author: Wei_Dai 14 August 2010 10:40:53AM 4 points [-]

Can you please explain why you think those jokes shouldn't have been made? I thought that making fun of authority figures is socially accepted in general, and in this case shows that we don't take Eliezer too seriously. Do you disagree?

Comment author: Risto_Saarelma 14 August 2010 11:07:42AM 0 points [-]

Making him the subject of a list like that looks plenty serious to me.

Beyond that, I don't think there's much that I can say. There's a certain tone-deafness that's rubbing me wrong in both the post and in this discussion, but exactly how that works is not something that I know how to convey with a couple of paragraphs of text.

Comment author: NancyLebovitz 14 August 2010 02:12:45PM *  5 points [-]

I have a theory: all the jokes parse out to "Eliezer is brilliant, and we have a bunch of esoteric in-jokes to show how smart we are". This isn't making fun of an authority figure.

This doesn't mean the article was a bad idea, or that I didn't think it was funny. I also don't think it's strong evidence that LW and SIAI aren't cults.

ETA: XiXiDu's comment that this is the community making fun of itself seems correct.

Comment author: Wei_Dai 14 August 2010 01:41:36PM 6 points [-]

Ok, I think I have an explanation for what's going on here. Those of us "old hands" who went through the period where LW was OB, and Eliezer and Robin were the only main posters, saw Eliezer as initially having very high status, and considered the "facts" post as a fun way of taking him down a notch or two. Newcomers who arrived after LW became a community blog, on the other hand, don't have the initial high status in mind, and instead see that post as itself assigning Eliezer a very high status, which they see as unjustified/weird/embarrassing. Makes sense, right?

(Voted parent up from -1, btw. That kind of report seems useful, even if the commenter couldn't explain why he felt that way.)

Comment deleted 14 August 2010 02:05:00PM [-]
Comment author: Vladimir_Nesov 14 August 2010 02:14:37PM *  3 points [-]

You seemed to seriously imply that Eliezer didn't understand that the "facts" thread was a joke, while actually he was sarcastically joking by hinting at not getting the joke in the comment you replied to. I downvoted the comment to punish stupidity on LW (nothing personal, believe it or not, in other words it's a one-step decision based on the comment alone and not on impression made by your other comments). Wei didn't talk about that.

Comment author: XiXiDu 14 August 2010 02:23:41PM 1 point [-]

I guess after so many comments implying things I never meant to say I was a bit aggrieved. Never mind.