All of waitingforgodel's Comments + Replies

Sorry to see this so heavily downvoted. Thanks -- this made for interesting reading and watching.

If you haven't checked out the archive of iq.org, it's also a rather interesting blog :)

re: formatting... you don't happen to use Ubuntu/Chrome, do you?

7gwern
Assange writes some pretty insightful things. I was pretty struck by (in a Long Now way) this quote: --Julian Assange, "Tue 05 Dec 2006 : Self destructing paper" (Informed some of my own thoughts, anyway.)

He says that natural events are included in the category of journalism that's not about exposing other people's secrets...

LOL, how did I miss this:

1) There is quite a bit of journalism that has nothing to do with exposing other people's secrets. This would include reporting on natural events (storms, snow, earthquakes, politicians lying or accepting bribes).

Are you under the impression that a politician wouldn't consider his accepting bribes to be a secret?

4Alicorn
I think it was being classed as a "natural event".
  1. Wikileaks has published less than 1% of the diplomatic cables[1]. It goes through and removes sensitive and personal information before posting them online[2]. With a handful of exceptions, they only publish information that one of their newspaper partners has already published[2].

  2. In the US we don't say people are guilty until proven so -- Manning has made no public confession, and has not been tried. He's being held solely as the result of one man's (Adrian Lamo's) testimony, to the best of our knowledge[3]. That man was forcibly checked into a

... (read more)

What do you suppose Einstein would say about doing different things over and over and expecting the same result? :p

0[anonymous]
If the same result is "learned something about the world", where's the problem?
0orthonormal
Fair enough, but it worked out OK for the scientific method too...

Never trust anyone unless you're talking in person? :p

Yes. If I didn't none of this would make any sense...

It's interesting, but I don't see any similarly high-effectiveness ways to influence Peter Thiel... Republicans already want to do high x-risk things; Thiel doesn't already want to decrease funding.

After reviewing my copies of the deleted post, I can say that he doesn't say this explicitly. I was remembering another commenter who was trying to work out the implications on x-risk of having viewed the basilisk.

EY does say things that directly imply he thinks the post is a basilisk because of an x-risk increase, but he does not say what he thinks that increase is.

Edit: can't reply, no karma. It means I don't know if it's proportional.

0[anonymous]
I'm pretty sure that this is false.
0[anonymous]
I'm fairly certain this is false.
7wedrifid
Nod. That makes more sense. One thing that Eliezer takes care to avoid doing is giving his actual numbers regarding the existential possibilities. And that is an extremely wise decision. Not everyone has fully internalised the idea behind Shut Up and Do The Impossible! Even if Eliezer believed that all of the work he and the SIAI may do will only improve our existential expectation by the kind of tiny amount you mention, it would most likely still be the right choice to go ahead and do exactly what he is trying to do. But not everyone is that good at multiplication.
3TheOtherDave
Does that mean you're backing away from your assertion of proportionality? Or just that you're using a different argument to support it?

At karma 0 I can't reply to each of you one at a time (rate limited - 10 min per post), so here are my replies in a single large comment:


@JoshuaZ

I would feel differently about nuke designs. As I said in the "why" links, I believe that EY has a bug when it comes to tail risks. This is an attempt to fix that bug.

Basically, non-nuke censorship isn't necessary when you use a reddit engine... and Roko's post isn't a nuke.


@rwallace

Yes, though you'd have to say more.


@jaimeastorga2000

Incredible, thanks for the link


@shokwave

Incredible. Where were y... (read more)

-1shokwave
I was only as serious as you were :P
Manfred150

In this case my estimate is a 5% chance that EY wants to spread the censored material, and used censoring for publicity. Therefore spreading the censored material is questionable as a tactic.

Be careful to keep your eye on the ball. This isn't some zero-sum contest of wills, where if EY gets what he wants that's bad. The ball is human welfare, or should be.

Re #1: EY claimed his censorship caused something like 0.0001% risk reduction at the time, hence the amount chosen -- it is there to balance his motivation out.

Re #2: Letting Christians/Republicans know that they should be interested in passing a law is not the same as hostage taking or harming someone's family. I agree that narrow targeting is preferable.

Re #3 and #4: I have a right to tell Christians/Republicans about a law they're likely to feel should be passed -- it's a right granted to me by the country I live in. I can tell them about that law for w... (read more)

3wedrifid
Citation? That sounds like an insane thing for Eliezer to have said.
9HughRistik
I did read the original precommitment discussions. I thought your original threat was non-serious, and presented as an interesting thought experiment. I was with you on the subject of anti-censorship. When I discovered that your precommitment was serious, you lost the moral high-ground in my eyes, and entered territory where I will not follow.
rwallace140

If I observe that I did read the thread to which you refer, and I still think your current course of action is stupid and crazy (and that's coming from someone who agrees with you about the censorship in question being wrong!), will that change your opinion even slightly?

WrongBot130

Your math is wrong. It was always wrong, and it is even more wrong now that it is clear that you are failing to influence Eliezer's behavior (for which I am thankful).

7Jack
Why not share 'the Basilisk' with more people every time EY censors a post instead of raising existential risk?

You throw some scary ideas around. Try this one on for size. This post of yours has caused me to revise my probability of the proposition "the best solution to some irrational precommitments is murder" from Pascal's-wager levels (indescribably improbable) to 0.01%.

There are some people who agree with you (the best way to block legislation is to kill the people who come up with it).

I'd say that since I've only been talking about doing things well within my legal rights (using the legal system), talking about murdering me is a bit "cultish"...

0shokwave
The expected value of murder in any case only comes out positive if there are more than 7,000 people on average at risk from the action - which will happen when there are 7 billion people on the planet and I am 100% convinced the actor is going to perform the action once, OR there are 6.5 billion people on the planet and I am 54% convinced the actor is going to perform the action twice, OR 6.5 billion, 36% sure of three actions... etc. I can't speak for the legal system but "one death for 6570 lives" vs "6570 deaths for one blog post" speaks for itself.
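A minimal sketch of the arithmetic in the comment above, assuming (as stated elsewhere in this thread) a threatened existential-risk increase of one in a million (0.0001%) per censorship event; the function name and figures are illustrative, not taken from the original comment.

```python
def expected_lives_at_risk(population, risk_per_action, n_actions, confidence):
    """Expected number of lives put at risk by the threatened actions:
    population * per-action risk increase * number of actions * probability
    the actor actually follows through."""
    return population * risk_per_action * n_actions * confidence

RISK = 1e-6  # assumed 0.0001% existential-risk increase per censorship event

print(expected_lives_at_risk(7.0e9, RISK, 1, 1.00))  # ~7,000
print(expected_lives_at_risk(6.5e9, RISK, 2, 0.54))  # ~7,020
print(expected_lives_at_risk(6.5e9, RISK, 3, 0.36))  # ~7,020
```

Each scenario lands near the 7,000-person threshold the comment describes.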

I actually explicitly said what oscar said in the discussion of the precommitment.

I also posted my reasoning for it.

Those are both from the "precommitted" link in my article.

9Lightwave
Not quite sure how to respond... Do you really think you're completely out of options and you need to start acting in a way that increases existential risk with the purpose of reducing it, by attempting to blackmail a person who will very likely not respond to blackmail?
9Psychohistorian
No, it's not. You can't just pretend that the threat is trivial when it's not. "You'd hate gun control legislation" is not an appropriate comparison. The utility hit of nudging up the odds of something I'd hate happening is not directly comparable. Given the circumstances and EY's obvious beliefs, the negative utility value of an FAI is vastly worse. Comparable would be this: every time he sees me not wear a seatbelt, he rolls 8 dice. If they all come up sixes, he'd hunt down, torture, and murder everyone I know and love. The odds are actually slightly lower, and the negative payoff is vastly smaller in this example, so if anything it's an understatement (though failing to wear a seatbelt is a much less bad thing to do than censoring someone, so perhaps it balances). I think this is pretty clearly improper.

Also note that it wasn't when I submitted to the main site...

7Snowyowl
Good thing too. At the time of writing you'd have lost 110 points of karma for this post, instead of only 11.
0[anonymous]
Looking at your recent post, I think Alicorn had a good point.
6Eliezer Yudkowsky
Your post has been moved to the Discussion section, not deleted.

YES IT IS. In case anyone missed it: it isn't Roko's post we're talking about right now.

4Roko
There is still a moral sense in which, if after careful thought I decided that that material should not have been posted, then any posts which resulted solely from my post are in a sense a violation of my desire to not have posted it. Especially if said posts operate under the illusion that my original post was censored rather than retracted. But in reality such ideas tend to propagate like the imp of the perverse: a gnawing desire to know what the "censored" material is, even if everyone who knows what it is has subsequently decided that they wished they didn't! E.g. both Nesov and I have been persuaded (once fully filled in) that this is really nasty stuff and shouldn't be let out. (Correct me if I am wrong.) This "imp of the perverse" property is actually part of the reason why the original post is harmful. In a sense, this is an idea-virus which makes people who don't yet have it want to have it, but as soon as they have been exposed to it, they (belatedly) realize they really didn't want to know about it or spread it. Sigh.

In this case, the comment censored was not posted by you. Therefore you're not the author.

FYI the actual author didn't even know it was censored.

Kutta110

I'd just like to insert a little tangent here: Roko's post and the select comments are the only things that moderation has had any effect on whatsoever since the launch of the site - if I remember correctly. I don't think even the PUA wars had any interference from above. Of course, this is a community blog, but even this level of noninterference is very non-typical on the internet. Normally you'd join a forum and get a locked thread and a mod warning every so often.

Additionally, the fact that on LW we get this much insight into the workings of SIAI-as-a-nonprofit and have this much space for discussion of any related topics is also uncommon and should be considered a bonus.

Roko180

May I at this point point out that I agree that the post in question should not appear in public. Therefore, it is a question of the author's right to retract material, not of censorship.

6Bongo
Eh, if the stuff hinted at really exists, you should release it anyway. I expect the stuff is not really that bad and you'll hurt SIAI more with innuendo than with the stuff.

Are you joking? Do you have any idea what a retarded law can do to existential risks?

7jimrandomh
P(law will pass|it is retarded && its sole advocate publicly described it as retarded) << 10^-6
9David_Gerard
Even I think you're just being silly now. I really don't see how this helps refine the art of human rationality.
7wedrifid
It's not about being 'bad', dammit. Ask yourself what you want, then ask yourself how to achieve it. Eliminate threats because they happen to be in your way, not out of spite.

Note that comments like these are still not being deleted, by the way. LW censors Langford Basilisks, not arbitrarily great levels of harmful stupidity or hostility toward the hosts - those are left to ordinary downvoting.

wedrifid280

If you feel more comfortable labeling it 'terrorism'... well... it's your thinking to bias.

No, the label is accurate. Right smack bang inside the concept of terrorism. And I am saying that as someone who agrees that Eliezer is behaving like a socially inept git.

someone has to stand up against your abuse

Why? Feeling obliged to fight against people just gives them power over you.

4jimrandomh
There is a big mismatch here between "sending an email to a blogger" and "increase existential risk by one in a million". All of the strategies for achieving existential risk increases that large are either major felonies, or require abusing a political office as leverage. When you first made the threat, I got angry at you on the assumption that you realized this. But if all you're threatening to do is send emails, well, I guess that's your right.
1[anonymous]
Don't vote this down under the default viewing threshold, please! Oh, and I'm reposting it here just in case WFG tries to delete it later: Ordinarily I'd consider that a violation of netiquette, but under these exact circumstances...
Roko260

Dude, don't be an idiot. Really.

(Shrugs.)

Your decision. The Singularity Institute does not negotiate with terrorists.

8wedrifid
Don't let Eliezer provoke you like that. Obviously just reposting comments would be a waste of time and would just get more of the same. The legitimate purposes of your script are:
* Ensure that you don't miss out on content.
* Allow you to inform other interested people of said content (outside of the LessWrong system).
* Make it possible for you to make observations along the lines of "there has been a comment censored here".

In other words, you have allegedly precommitted to existential terrorism, killing the Future with small probability if your demands are not met.

Great post. It confuses me why this isn't at 10+ karma.

7David_Gerard
+5 is fine! Y'know, one of the actual problems with LW is that I read it in my Internet as Television time, but there's a REALLY PROMINENT SCORE COUNTER at the top left. This does not help in not treating it as a winnable video game. (That said, could the people mass-downvoting waitingforgodel please stop? It's tiresome. Please try to go by comment, not poster.)
1[anonymous]
Probably because it's buried in the middle of an enormous discussion that very few people have read or will read.
2lessdazed
Your comment here killed the hostage.

An example of this would be errors or misconduct in completing past projects.

When I asked Anna about the coordination between SIAI and FHI -- something like "Do you talk enough with each other that you wouldn't both spend resources writing the same research paper?" -- she told me about the one time that they had in fact both presented a paper on the same topic at a conference, and said that they now coordinate more to prevent that sort of thing.

I have found that Anna and others at SIAI are honest and forthcoming.

[anonymous]130

You're trying very hard to get everyone to think that SIAI has lied to donors or done something equally dishonest. I agree that this is an appropriate question to discuss, but you are pursuing the matter so aggressively that I just have to ask: do you know something we don't? Do you think that you/other donors have been lied to on a particular occasion, and if so, when?

why shouldn't they shut up?

Because this is LessWrong -- you can give a sane response and not only does it clear the air, people understand and appreciate it.

Cable news debating isn't needed here.

Sure we might still wonder if they're being perfectly honest, but saying something more sane on the topic than silence seems like a net-positive from their perspective.

3wedrifid
By way of a reminder, the question under discussion was:
1wnoise
LessWrongers are not magically free of bias. Nor are they inherently moral people who wouldn't stoop to using misleading rhetorical techniques, though here they are more likely to be called on it. In any case, an answer here is available on the public internet for all to see.

no sensible person who had the answer would

I respectfully disagree, and have my hopes set on Carl (or some other level-headed person in a position to know) giving a satisfying answer.

This is LessWrong after all -- we can follow complicated arguments, and at least hearing how SIAI is actually thinking about such things would (probably) reduce my paranoia.

1David_Gerard
Yeah, but this is on the Internet for everyone to see. The potential for political abuse is ridiculous and can infect even LessWrong readers. Politics is the mind-killer, but pretending it doesn't affect almost everyone else strikes me as not smart.

Make that "they do it for the greater good"

Sorry about mistakenly implying s/he was affiliated. I'll be more diligent with my Google stalking in the future.

edit: In my defense, SIAI affiliation has been very common when looking up very "pro" people from this thread

3AnnaSalamon
Thanks. I appreciate that.

but he won me back by answering anyway <3

This sounds very sane, and makes me feel a lot better about the context. Thank you very much.

I very much like the idea that top SIAI people believe that there is such a thing as too much devotion to the cause (and, I'm assuming, actively talk down people who are above that level, as you describe doing for Roko).

As someone who has demonstrated impressive sanity around these topics, you seem to be in a unique position to answer these questions with above-average level-headedness:

  1. Do you understand the math behind the Roko post deletion?

  2. What do you think about the Roko post deletion?

  3. What do you think about future deletions?

Do you understand the math behind the Roko post deletion?

Yes, his post was based on (garbled versions of) some work I had been doing at FHI, which I had talked about with him while trying to figure out some knotty sub-problems.

What do you think about the Roko post deletion?

I think the intent behind it was benign, at least in that Eliezer had his views about the issue (which is more general, and not about screwed-up FAI attempts) previously, and that he was motivated to prevent harm to people hearing the idea and others generally. Indeed, he was expl... (read more)

Am I missing something? Desrtopa responded to questions of lying to the donor pool with the equivalent of "We do it for the greater good"

9AnnaSalamon
Desrtopa isn't affiliated with SIAI. You seem to be deliberately designing confusing comments, a la Glenn Beck's "I'm just asking questions" motif.

That "confessor" link is terrific

If banning Roko's post would reasonably cause discussion of those ideas to move away from LessWrong, then by EY's own reasoning (the link you gave) it seems like a retarded move.

Right?

Bongo100

If the idea is actually dangerous, it's way less dangerous to people who aren't familiar with pretty esoteric Lesswrongian ideas. They're prerequisites to being vulnerable to it. So getting conversation about the idea away from Lesswrong isn't an obviously retarded idea.

accusations stick in the mind even when one is explicitly told they are false

Actually, that citation is about both positive and negative things -- so unless you're also asking pro-SIAI people to hush up, you're (perhaps unknowingly) seeking to cause a pro-SIAI bias.

Another thing that citation seems to imply is that reflecting on, rather than simply diverting our attention away from, scary thoughts is essential to coming to a correct opinion on them.

One of the interesting morals from Roko's contest is that if you care deeply about getting the most benefit ... (read more)

To answer your question, despite David Gerard's advice:

I would not lie to donors about the likely impact of their donations, the evidence concerning SIAI's ability or inability to pull off projects, how we compare to other organizations aimed at existential risk reduction, etc. (I don't have all the answers, but I aim for accuracy and revise my beliefs and my statements as evidence comes in; I've actively tried to gather info on whether we or FHI reduce risk more per dollar, and I often recommend to donors that they do their own legwork with that charity ... (read more)

Another thing that citation seems to imply is that reflecting on, rather than simply diverting our attention away from, scary thoughts is essential to coming to a correct opinion on them.

Well, uh, yeah. The horse has bolted. It's entirely unclear what choosing to keep one's head in the sand gains anyone.

What would SIAI be willing to lie to donors about?

Although this is a reasonable question to want the answer to, it's obvious even to me that answering at all would be silly, and no sensible person who had the answer would.

Investigating the logic or lac... (read more)

Okay, you can leave it abstract. Here's what I was hoping to have explained: why were you discussing what people would really be prepared to sacrifice?

... and not just the surface level of "just for fun," but also considering how these "just for fun" games get started, and what they do to enforce cohesion in a group.

1Roko
The context was the distinction between signalling-related speech acts and real values.
0[anonymous]
Did you see Carl Shulman's explanation?
4David_Gerard
Big +1. Every cause wants to be a cult. Every individual (or, realistically, as many as possible) must know how to resist this for a group with big goals not to go off the rails.

Ahh. I was trying to ask about Cialdini-style influence techniques.

I think the question you should be asking is less about evil conspiracies, and more about what kind of organization SIAI is -- what would they tell you about, and what would they lie to you about.

4XiXiDu
If the forbidden topic were made public (and people believed it), it would result in a steep rise in donations to the SIAI. That alone is enough to conclude that the SIAI is not trying to hold back something that would discredit it as an organisation concerned with charitable objectives. The censoring of the information was in accordance with their goal of trying to prevent unfriendly artificial intelligence. Making the subject matter public has already harmed some people and could harm people in the future.

I agree that there's a lot in history, but the examples you cited have something that doesn't match here -- historically, you lie to people you don't plan on cooperating with later.

If you lie to an oppressive government, it's okay because it'll either get overthrown or you'll never want to cooperate with it (so great is your reason for lying).

Lying to your donor pool is very, very different from lying to the Nazis about hiding Jews.

8Bongo
You're throwing around accusations of lying pretty lightly.

As you may know from your study of marketing, accusations stick in the mind even when one is explicitly told they are false. In the parent comment and a sibling, you describe a hypothetical SIAI lying to its donors because... Roko had some conversations with Carl that led you to believe we care strongly about existential risk reduction?

If your aim is to improve SIAI, to cause there to be good organizations in this space, and/or to cause Less Wrong-ers to have accurate info, you might consider:

  1. Talking with SIAI and/or with Fellows program alumni, so as
... (read more)
9Bongo
The concept of ethical injunctions is known in SIAI circles I think. Enduring personal harm for your cause and doing unethical things for your cause are therefore different. Consider Eliezer's speculation about whether a rationalist "confessor" should ever lie in this post, too. And these personal struggles with whether to ever lie about SIAI's work.
1Desrtopa
Lying for good causes has a time-honored history. Protecting fugitive slaves or Holocaust victims immediately comes to mind. Just because it is more often practical to be honest than not doesn't mean that dishonesty isn't sometimes unambiguously the better option.

First off, great comment -- interesting, and complex.

But some things still don't make sense to me...

Assuming that what you described led to:

I was once criticized by a senior singinst member for not being prepared to be tortured or raped for the cause. I mean not actually, but, you know, in theory. Precommitting to being prepared to make a sacrifice that big. shrugs

  1. How did precommitting enter into it?

  2. Are you prepared to be tortured or raped for the cause? Have you precommitted to it?

  3. Have other SIAI people you know of talked about this with you, ha

... (read more)

I find this whole line of conversation fairly ludicrous, but here goes:

Number 1. Time-inconsistency: we have different reactions to an immediate certainty of some bad than to a future probability of it. So many people might be willing to go be a health worker in a poor country where aid workers are commonly (1 in 10,000) raped or killed, even though they would not be willing to be certainly attacked in exchange for 10,000 times the benefits to others. In the actual instant of being tortured anyone would break, but people do choose courses of action that ca... (read more)
