He says that natural events are included in the category of journalism that's not about exposing other people's secrets...
LOL, how did I miss this:
1) There is quite a bit of journalism that has nothing to do with exposing other people's secrets. This would include reporting on natural events (storms, snow, earthquakes, politicians lying or accepting bribes).
Are you under the impression that a politician wouldn't consider his accepting bribes to be a secret?
Wikileaks has published less than 1% of the diplomatic cables[1]. It goes through and removes sensitive and personal information before posting them online[2]. With a handful of exceptions, they only publish information that one of their newspaper partners has already published[2].
In the US we don't say people are guilty until proven so -- Manning has made no public confession, and has not been tried. He's being held solely as the result of one man's (Adrian Lamo's) testimony, to the best of our knowledge[3]. That man was forcibly checked into a...
What do you suppose Einstein would say about doing different things over and over and expecting the same result? :p
Never trust anyone unless you're talking in person? :p
Yes. If I didn't, none of this would make any sense...
It's interesting, but I don't see any similarly high-effectiveness ways to influence Peter Thiel... Republicans already want to do high-x-risk things; Thiel doesn't already want to decrease funding.
After reviewing my copies of the deleted post, I can say that he doesn't say this explicitly. I was remembering another commenter who was trying to work out the implications for x-risk of having viewed the basilisk.
EY does say things that directly imply he thinks the post is a basilisk because of an x-risk increase, but he does not say what he thinks that increase is.
Edit: can't reply, no karma. It means I don't know if it's proportional.
At karma 0 I can't reply to each of you one at a time (rate limited -- 10 min per post), so here are my replies in a single large comment:
I would feel differently about nuke designs. As I said in the "why" links, I believe that EY has a bug when it comes to tail risks. This is an attempt to fix that bug.
Basically non-nuke censorship isn't necessary when you use a reddit engine... and Roko's post isn't a nuke.
Yes, though you'd have to say more.
Incredible, thanks for the link
Incredible. Where were y...
In this case my estimate is a 5% chance that EY wants to spread the censored material, and used censoring for publicity. Therefore spreading the censored material is questionable as a tactic.
Be careful to keep your eye on the ball. This isn't some zero-sum contest of wills, where if EY gets what he wants that's bad. The ball is human welfare, or should be.
Re #1: EY claimed at the time that his censorship bought something like a 0.0001% risk reduction, hence the amount chosen -- it is there to balance his motivation out (see the sketch after these replies).
Re #2: Letting Christians/Republicans know that they should be interested in passing a law is not the same as hostage-taking or harming someone's family. I agree that narrow targeting is preferable.
Re #3 and #4: I have a right to tell Christians/Republicans about a law they're likely to feel should be passed -- it's a right granted to me by the country I live in. I can tell them about that law for w...
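A minimal sketch of the balancing arithmetic behind #1 (the 0.0001% figure is the one attributed to EY above; collapsing it to a single quantity Delta is my simplification): if EY believes censoring reduces existential risk by \Delta \approx 0.0001\%, then a credible precommitment to add \Delta of risk for as long as the censorship stands makes his net expected effect

\text{net}(\text{censor}) = -\Delta + \Delta = 0

so the censorship no longer buys him any risk reduction, while meeting the demand costs nothing.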
If I observe that I did read the thread to which you refer, and I still think your current course of action is stupid and crazy (and that's coming from someone who agrees with you that the censorship in question was wrong!), will that change your opinion even slightly?
Your math is wrong. It was always wrong, and it is even more wrong now that it is clear that you are failing to influence Eliezer's behavior (for which I am thankful).
You throw some scary ideas around. Try this one on for size. This post of yours has caused me to revise my probability of the proposition "the best solution to some irrational precommitments is murder" from Pascal's-wager levels (indescribably improbable) to 0.01%.
There are some people who agree with you (the best way to block legislation is to kill the people who come up with it).
I'd say that, since I've only been talking about doing things well within my legal rights (using the legal system), talking about murdering me is a bit "cultish"...
I actually explicitly said what oscar said in the discussion of the precommitment.
I also posted my reasoning for it.
Those are both from the "precommitted" link in my article.
Also note that it wasn't when I submitted to the main site...
I should have taken this bet
YES IT IS. In case anyone missed it: it isn't Roko's post we're talking about right now.
In this case, the comment censored was not posted by you. Therefore you're not the author.
FYI the actual author didn't even know it was censored.
I'd just like to insert a little tangent here: Roko's post and a few select comments are the only things that moderation has had any effect on whatsoever since the launch of the site -- if I remember correctly. I don't think even the PUA wars had any interference from above. Of course, this is a community blog, but even this level of noninterference is very non-typical on the internet. Normally you'd join a forum and get a locked thread and a mod warning every so often.
Additionally, the amount of insight we get on LW into the workings of SIAI-as-a-nonprofit, and the amount of space we have for discussing any related topics, is also uncommon and should be considered a bonus.
May I point out at this point that I agree the post in question should not appear in public. It is therefore a question of the author's right to retract material, not of censorship.
Are you joking? Do you have any idea what a retarded law can do to existential risks?
Note that comments like these are still not being deleted, by the way. LW censors Langford Basilisks, not arbitrarily great levels of harmful stupidity or hostility toward the hosts - those are left to ordinary downvoting.
If you feel more comfortable labeling it 'terrorism'... well... it's your thinking to bias.
No, the label is accurate. Right smack bang inside the concept of terrorism. And I am saying that as someone who agrees that Eliezer is behaving like a socially inept git.
someone has to stand up against your abuse
Why? Feeling obliged to fight against people just gives them power over you.
Dude, don't be an idiot. Really.
(Shrugs.)
Your decision. The Singularity Institute does not negotiate with terrorists.
In other words, you have allegedly precommitted to existential terrorism, killing the Future with small probability if your demands are not met.
Great post. It confuses me that this isn't at 10+ karma.
An example of this would be errors or misconduct in completing past projects.
When I asked Anna about the coordination between SIAI and FHI -- something like "Do you talk enough with each other that you wouldn't both spend resources writing the same research paper?" -- she told me about the one time that they had in fact both presented a paper on the same topic at a conference, and said that they now coordinate more to prevent that sort of thing.
I have found that Anna and others at SIAI are honest and forthcoming.
You're trying very hard to get everyone to think that SIAI has lied to donors or done something equally dishonest. I agree that this is an appropriate question to discuss, but you are pursuing the matter so aggressively that I just have to ask: do you know something we don't? Do you think that you/other donors have been lied to on a particular occasion, and if so, when?
why shouldn't they shut up?
Because this is LessWrong -- you can give a sane response and not only does it clear the air, people understand and appreciate it.
Cable news debating isn't needed here.
Sure we might still wonder if they're being perfectly honest, but saying something more sane on the topic than silence seems like a net-positive from their perspective.
no sensible person who had the answer would
I respectfully disagree, and have my hopes set on Carl (or some other level-headed person in a position to know) giving a satisfying answer.
This is LessWrong after all -- we can follow complicated arguments, and at least hearing how SIAI is actually thinking about such things would (probably) reduce my paranoia.
Make that "they do it for the greater good"
Sorry about mistakenly implying s/he was affiliated. I'll be more diligent with my Google stalking in the future.
edit: In my defense, SIAI affiliation has been very common among the strongly "pro" people I've looked up from this thread
but he won me back by answering anyway <3
This sounds very sane, and makes me feel a lot better about the context. Thank you very much.
I very much like the idea that top SIAI people believe there is such a thing as too much devotion to the cause (and, I'm assuming, actively talk down people who go above that level, as you describe doing for Roko).
As someone who has demonstrated impressive sanity around these topics, you seem to be in a unique position to answer these questions with an above-average level-headedness:
Do you understand the math behind the Roko post deletion?
What do you think about the Roko post deletion?
What do you think about future deletions?
Do you understand the math behind the Roko post deletion?
Yes, his post was based on (garbled versions of) some work I had been doing at FHI, which I had talked about with him while trying to figure out some knotty sub-problems.
What do you think about the Roko post deletion?
I think the intent behind it was benign, at least in that Eliezer had his views about the issue (which is more general, and not about screwed-up FAI attempts) previously, and that he was motivated to prevent harm to people hearing the idea and others generally. Indeed, he was expl...
Am I missing something? Desrtopa responded to questions about lying to the donor pool with the equivalent of "We do it for the greater good".
That "confessor" link is terrific
If banning Roko's post would reasonably cause discussion of those ideas to move away from LessWrong, then by EY's own reasoning (the link you gave) it seems like a retarded move.
Right?
If the idea is actually dangerous, it's way less dangerous to people who aren't familiar with pretty esoteric LessWrongian ideas. They're prerequisites to being vulnerable to it. So getting conversation about the idea away from LessWrong isn't an obviously retarded idea.
accusations stick in the mind even when one is explicitly told they are false
Actually that citation is about both positive and negative things -- so unless you're also asking pro-SIAI people to hush up, you're (perhaps unknowingly) seeking to cause a pro-SIAI bias.
Another thing that citation seems to imply is that reflecting on, rather than simply diverting our attention away from, scary thoughts is essential to coming to a correct opinion on them.
One of the interesting morals from Roko's contest is that if you care deeply about getting the most benefit ...
To answer your question, despite David Gerard's advice:
I would not lie to donors about the likely impact of their donations, the evidence concerning SIAI's ability or inability to pull off projects, how we compare to other organizations aimed at existential risk reduction, etc. (I don't have all the answers, but I aim for accuracy and revise my beliefs and my statements as evidence comes in; I've actively tried to gather info on whether we or FHI reduce risk more per dollar, and I often recommend to donors that they do their own legwork with that charity ...
Another thing that citation seems to imply is that reflecting on, rather than simply diverting our attention away from, scary thoughts is essential to coming to a correct opinion on them.
Well, uh, yeah. The horse has bolted. It's entirely unclear what choosing to keep one's head in the sand gains anyone.
What would SIAI be willing to lie to donors about?
Although this is a reasonable question to want the answer to, it's obvious even to me that answering at all would be silly, and no sensible person who had the answer would do so.
Investigating the logic or lac...
Okay, you can leave it abstract. Here's what I was hoping to have explained: why were you discussing what people would really be prepared to sacrifice?
... and not just the surface level of "just for fun," but also considering how these "just for fun" games get started, and what they do to enforce cohesion in a group.
Ahh. I was trying to ask about Cialdini-style influence techniques.
I think the question you should be asking is less about evil conspiracies, and more about what kind of organization SIAI is -- what would they tell you about, and what would they lie to you about.
I agree that there's a lot in history, but the examples you cited have something that doesn't match here -- historically, you lie to people you don't plan on cooperating with later.
If you lie to an oppressive government, it's okay because it'll either get overthrown or you'll never want to cooperate with it (so great is your reason for lying).
Lying to your donor pool is very, very different from lying to the Nazis about hiding Jews.
As you may know from your study of marketing, accusations stick in the mind even when one is explicitly told they are false. In the parent comment and a sibling, you describe a hypothetical SIAI lying to its donors because... Roko had some conversations with Carl that led you to believe we care strongly about existential risk reduction?
If your aim is to improve SIAI, to cause there to be good organizations in this space, and/or to cause Less Wrong-ers to have accurate info, you might consider:
First off, great comment -- interesting, and complex.
But, some things still don't make sense to me...
Assuming that what you described led to:
I was once criticized by a senior SingInst member for not being prepared to be tortured or raped for the cause. I mean not actually, but, you know, in theory. Precommitting to being prepared to make a sacrifice that big. shrugs
How did precommitting enter in to it?
Are you prepared to be tortured or raped for the cause? Have you precommitted to it?
Have other SIAI people you know of talked about this with you, ha
I find this whole line of conversation fairly ludicrous, but here goes:
Number 1. Time-inconsistency: we have different reactions to an immediate certainty of some bad than to a future probability of it. So many people might be willing to go be a health worker in a poor country where aid workers are commonly (1 in 10,000) raped or killed, even though they would not be willing to be certainly attacked in exchange for 10,000 times the benefits to others. In the actual instant of being tortured anyone would break, but people do choose courses of action that ca...
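A hedged worked version of the comparison in that example (the numbers are from the comment above; the algebra is my own): a certain attack in exchange for 10,000 times the benefit, and a 1-in-10,000 chance of attack for the ordinary benefit, carry identical expected harm per unit of benefit,

\frac{1 \cdot h}{10000 \cdot b} = \frac{(1/10000) \cdot h}{1 \cdot b}

where h is the harm of an attack and b a unit of benefit to others; yet people reliably accept the second and refuse the first, which is the time-inconsistency being described.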
Sorry to see this so heavily downvoted. Thanks -- this made for interesting reading and watching.
If you haven't checked out the archive of iq.org, it's also a rather interesting blog :)
re: formatting... you don't happen to use Ubuntu/Chrome, do you?