
FormallyknownasRoko comments on Best career models for doing research? - Less Wrong

27 Post author: Kaj_Sotala 07 December 2010 04:25PM




Comment author: FormallyknownasRoko 09 December 2010 10:17:05PM 10 points [-]

May I at this point point out that I agree that the post in question should not appear in public. Therefore, it is a question of the author's right to retract material, not of censorship.

Comment author: wedrifid 09 December 2010 10:23:54PM 8 points [-]

Therefore, it is a question of the author's right to retract material, not of censorship.

That does not actually follow. Censoring what other people say is still censoring even if you happened to have said related things previously.

Comment author: Vladimir_Nesov 09 December 2010 10:32:09PM 0 points [-]

Yes, this other aspect.

Comment author: waitingforgodel 09 December 2010 10:27:43PM 4 points [-]

In this case, the comment censored was not posted by you. Therefore you're not the author.

FYI the actual author didn't even know it was censored.

Comment author: Vladimir_Nesov 09 December 2010 10:22:25PM *  2 points [-]

Good idea. We should've started using this standard reference when the censorship complaints began, but at least we can henceforth.

Comment author: FormallyknownasRoko 09 December 2010 10:23:37PM 0 points [-]

Yes. THIS IS NOT CENSORSHIP. Just in case anyone missed it.

Comment author: timtyler 09 December 2010 10:28:15PM *  4 points [-]

FWIW, loads of my comments were apparently deleted by administrators at the time.

Comment author: wedrifid 09 December 2010 10:34:43PM *  4 points [-]

FWIW, loads of my comments were deleted by administrators at the time.

I was away for a couple of months while the incident took place, and when I returned I actually used your user page to reconstruct most of the missing conversation (with blanks filled from other user pages and an alternate source). Yours was particularly useful because of how prolific you were with quoting those to whom you were replying. I still have ten pages of your user comments stored on my hard drive somewhere. :)

Comment author: timtyler 09 December 2010 10:43:33PM *  4 points [-]

Yes, some weren't fully deleted, but - IIRC - others were. If I am remembering this right, the first deleted post (Roko's) left comments behind in people's profiles, but with the second deleted post the associated comments were rendered completely inaccessible to everyone. At the time, I figured that the management was getting better at nuking people's posts.

After that - rather curiously - some of my subsequent "marked deleted" posts remained visible to me when logged in - so I wasn't even aware of what had been "marked deleted" to everyone else for most of the time - unless I logged out of the site.

Comment author: wedrifid 09 December 2010 10:27:33PM *  14 points [-]

You are evidently confused about what the word means. The systematic deletion of any content that relates to an idea that the person with power does not wish to be spoken is censorship in the same way that threatening to (probabilistically) destroy humanity is terrorism. As in, blatantly obviously - it's just what the words happen to mean.

Going around saying 'this isn't censorship' while doing it would trigger all sorts of 'crazy cult' warning bells.

Comment author: fortyeridania 10 December 2010 10:17:55AM 6 points [-]

Yes, the acts in question can easily be denoted by the terms "blackmail" and "censorship." And your final sentence is certainly true as well.

To avoid being called a cult, to avoid being a cult, and to avoid doing bad things generally, we should stop the definition debate and focus on whether people's behavior has been appropriate. If connotation conundrums keep you quarreling about terms, pick variables (e.g. "what EY did"=E and "what WFG precommitted to doing, and in fact did"=G) and keep talking.

Comment author: CarlShulman 09 December 2010 10:25:18PM 3 points [-]

That could only apply to your original post, not subsequent stuff.

Comment author: FormallyknownasRoko 09 December 2010 10:26:55PM 4 points [-]

Right. Bottle. Genie.

Comment author: waitingforgodel 09 December 2010 10:28:29PM 6 points [-]

YES IT IS. In case anyone missed it. It isn't Roko's post we're talking about right now.

Comment author: FormallyknownasRoko 09 December 2010 10:37:59PM 4 points [-]

There is still a moral sense in which if, after careful thought, I decided that that material should not have been posted, then any posts which resulted solely from my post are in a sense a violation of my desire to not have posted it. Especially if said posts operate under the illusion that my original post was censored rather than retracted.

But in reality such ideas tend to propagate like the imp of the perverse: a gnawing desire to know what the "censored" material is, even if everyone who knows what it is has subsequently decided that they wished they didn't! E.g both me and Nesov have been persuaded (once fully filled in) that this is really nasty stuff and shouldn't be let out. (correct me if I am wrong).

This "imp of the perverse" property is actually part of the reason why the original post is harmful. In a sense, this is an idea-virus which makes people who don't yet have it want to have it, but as soon as they have been exposed to it, they (belatedly) realize they really didn't want to know about it or spread it.

Sigh.

Comment author: XiXiDu 10 December 2010 01:46:35PM *  9 points [-]

The only people who seem to be filled in are you and Yudkowsky. I think Nesov just argues against it based on some very weak belief. As far as I can tell, I got all the material in question. The only possible reason I can see for why one wouldn't want to spread it is that its negative potential does outweigh its very-very-low-probability (and that only if you accept a long chain of previous beliefs). It doesn't. It also isn't some genuine and brilliant idea that all this mystery mongering makes it seem to be. Everyone I sent it to just laughed about it. But maybe you can fill me in?

Comment author: WrongBot 10 December 2010 04:39:43PM 5 points [-]

If the idea is dangerous in the first place (which is very unlikely), it is only dangerous to people who understand it, because understanding it makes you vulnerable. The better you understand it and the more you think about it, the more vulnerable you become. In hindsight, I would prefer to never have read about the idea in question.

I don't think this is a big issue, considering the tiny probability that the scenario will ever occur, but I am glad that discussing it continues to be discouraged and would appreciate it if people stopped needlessly resurrecting it over and over again.

Comment author: Vaniver 10 December 2010 05:14:16PM 7 points [-]

If the idea is dangerous in the first place (which is very unlikely), it is only dangerous to people who understand it, because understanding it makes you vulnerable.

This strikes me as tautological and/or confusing definitions. I'm happy to agree that the idea is dangerous to people who think it is dangerous, but I don't think it's dangerous and I think I understand it. To make an analogy, I understand the concept of hell but don't think it's dangerous, and so the concept of hell does not bother me. Does the fact that I do not have the born-again Christian's fear of hell mean that they understand it and I don't? I don't see why it should.

Comment author: WrongBot 10 December 2010 05:37:05PM 5 points [-]

I can't figure out a way to explain this further without repropagating the idea, which I will not do. It is likely that there are one or more pieces of the idea which you are not familiar with or do not understand, and I envy your epistemological position.

Comment author: Aharon 11 December 2010 07:11:54PM *  1 point [-]

Yes, but the concept of hell is easier to understand. From what I have read in the discussions, I have no idea how the Basilisk is supposed to work, while it's quite easy to understand how hell is supposed to work.

Comment author: XiXiDu 10 December 2010 06:31:17PM 1 point [-]

I would prefer to never have read about the idea in question.

If you people are this worried about reality, why don't you work to support creating a Paperclip maximizer? It would have a lot of fun doing what it wants to do and everyone else would quickly die. Nobody ever after would have to fear what could possibly happen to them at some point.

If you people want to try to turn the universe into a better place, at whatever cost, then why do you worry about, or wish not to know about, potential obstacles? Both are irrational.

The forbidden topic seems to be a dangerous Ugh field for a lot of people here. You have to decide what you want and then follow through on it. Any self-inflicted pain just adds to the overall negative.

Comment author: WrongBot 10 December 2010 06:53:02PM 5 points [-]

You do not understand what you are talking about.

The basilisk idea has no positive value. All it does is cause those who understand it to bear a very low probability of suffering incredible disutility at some point in the future. Explaining this idea to someone does them about as much good as slashing their tires.

Comment author: XiXiDu 10 December 2010 07:43:25PM 4 points [-]

The basilisk idea has no positive value. All it does is cause those who understand it to bear a very low probability of suffering incredible disutility at some point in the future.

I understand that, but I do not see that the description applies to the idea in question, insofar as it is in my opinion no more probable than fiction, and any likelihood it has is outweighed by opposing ideas. There are, however, other well-founded ideas, free speech and transparency, that are being ignored. I also believe that people would benefit from talking about it and could possibly overcome and ignore it subsequently.

But I'm tired of discussing this topic and will do you the favor of shutting up about it. But remember that I haven't been the one who started this thread. It was Roko and whoever asked to delete Roko's comment.

Comment author: FormallyknownasRoko 10 December 2010 04:58:32PM 0 points [-]

Upvoted, agree strongly.

Comment author: FormallyknownasRoko 10 December 2010 05:06:28PM *  4 points [-]

Look, you have three people all of whom think it is a bad idea to spread this. All are smart. Two initially thought it was OK to spread it.

Furthermore, I would add that I wish I had never learned about any of these ideas. In fact, I wish I had never come across the initial link on the internet that caused me to think about transhumanism and thereby about the singularity; I wish very strongly that my mind had never come across the tools to inflict such large amounts of potential self-harm with such small durations of inattention, incaution and/or stupidity, even if it is all premultiplied by a small probability. (not a very small one, mind you. More like 1/500 type numbers here)

If this is not enough warning to make you stop wanting to know more, then you deserve what you get.

Comment author: Desrtopa 10 December 2010 05:20:22PM 4 points [-]

Considering the extraordinary appeal that forbidden knowledge has even for the average person, let alone the exceptionally intellectually curious, I don't think this is a very effective way to warn a person off of seeking out the idea in question. Far from deserving what they get, such a person is behaving in a completely ordinary manner, to exceptionally severe consequence.

Personally, I don't want to know about the idea (at least not if learning it would cause me significant psychological distress to no benefit), but I've also put significant effort into training myself out of responses such as automatically clicking links to shock sites that say "Don't click this link!"

Comment author: Vaniver 10 December 2010 05:23:22PM 8 points [-]

Look, you have three people all of whom think it is a bad idea to spread this. All are smart. Two initially thought it was OK to spread it.

I see a lot more than three people here, most of whom are smart, and most of them think that Langford basilisks are fictional, and even if they aren't, censoring them is the wrong thing to do. You can't quarantine the internet, and so putting up warning signs makes more people fall into the pit.

Comment author: katydee 10 December 2010 06:09:43PM *  3 points [-]

I saw the original idea and the discussion around it, but I was (fortunately) under stress at the time and initially dismissed it as so implausible as to be unworthy of serious consideration. Given the reactions to it by Eliezer, Alicorn, and Roko, who seem very intelligent and know more about this topic than I do, I'm not so sure. I do know enough to say that, if the idea is something that should be taken seriously, it's really serious. I can tell you that I am quite happy that the original posts are no longer present, because if they were I am moderately confident that I would want to go back and see if I could make more sense out of the matter, and if Eliezer, Alicorn, and Roko are right about this, making sense out of the matter would be seriously detrimental to my health.

Thankfully, either it's a threat but I don't understand it fully, in which case I'm safe, or it's not a threat, in which case I'm also safe. But I am sufficiently concerned about the possibility that it's a threat that I don't understand fully but might be able to realize independently given enough thought that I'm consciously avoiding extended thought about this matter. I will respond to posts that directly relate to this one but am otherwise done with this topic-- rest assured that, if you missed this one, you're really quite all right for it!

Comment author: Vaniver 10 December 2010 06:21:41PM 5 points [-]

Given the reactions to it by Eliezer, Alicorn, and Roko, who seem very intelligent and know more about this topic than I do, I'm not so sure.

This line of argument really bothers me. What does it mean for E, A, and R to seem very intelligent? As far as I can tell, the necessary conclusion is "I will believe a controversial statement of theirs without considering it." When you word it like that, the standards are a lot higher than "seem very intelligent", or at least narrower: you need to know their track record on decisions like this.

(The controversial statement is "you don't want to know about X," not X itself, by the way.)

Comment author: FormallyknownasRoko 10 December 2010 05:34:09PM *  0 points [-]

Whatever man, go ahead and make your excuses, you have been warned.

Comment author: Vaniver 10 December 2010 05:41:37PM 8 points [-]

I have not only been warned, but I have stared the basilisk in the eyes, and I'm still here typing about it. In fact, I have only cared enough to do so because it was banned, and I wanted the information on how dangerous it was to judge the wisdom of the censorship.

On a more general note, being terrified of very unlikely terrible events is a known human failure mode. Perhaps it would be more effective at improving human rationality to expose people to ideas like this with the sole purpose of overcoming that sort of terror?

Comment author: XiXiDu 10 December 2010 05:59:50PM 14 points [-]

I wish I had never come across the initial link on the internet that caused me to think about transhumanism and thereby about the singularity;

I wish you'd talk to someone other than Yudkowsky about this. You don't need anyone to harm you; you already seem to harm yourself. You indulge in self-inflicted psychological stress. As Seneca said, "there are more things that terrify us than there are that oppress us, and we suffer more often in opinion than in reality". You worry about, and pay interest on, debts that will likely never come due.

Look, you have three people all of whom think it is a bad idea to spread this. All are smart.

I read about quite a few smart people who hold idiot beliefs, I only consider this to be marginal evidence.

Furthermore, I would add that I wish I had never learned about any of these ideas.

You'd rather be some ignorant pleasure-maximizing device? For me truth is the most cherished good.

If this is not enough warning to make you stop wanting to know more, then you deserve what you get.

BS.

Comment author: FormallyknownasRoko 10 December 2010 06:04:07PM 5 points [-]

For me truth is the most cherished good.

More so than not opening yourself up to a small risk of severe consequences? E.g. if you found a diary that clearly belonged to some organized crime boss, would you open it up and read it? I see this situation as analogous.

Comment deleted 10 December 2010 06:44:15PM [-]
Comment author: Manfred 10 December 2010 07:39:06PM 3 points [-]

Really thought you were going to go with Tom Riddle on this one. Perfect line break for it :)

Comment author: timtyler 10 December 2010 06:53:39PM *  1 point [-]

I would add that I wish I had never learned about any of these ideas. In fact, I wish I had never come across the initial link on the internet that caused me to think about transhumanism and thereby about the singularity;

Hmm. It is tricky to go back, I would imagine.

The material does come with some warnings, I believe. For instance, consider this one:

"Beware lest Friendliness eat your soul." - Eliezer Yudkowsky

Comment author: Vladimir_Nesov 10 December 2010 05:43:29PM 1 point [-]

In fact, I wish I had never come across the initial link on the internet that caused me to think about transhumanism and thereby about the singularity

As I understand, you donate (and plan to in the future) to existential risk charities, and that is one of the consequences of you having come across that link. How does this compute into net negative, in your estimation, or are you answering a different question?

Comment author: FormallyknownasRoko 10 December 2010 06:01:07PM -1 points [-]

Sure I want to donate. But if you express it as a hypothetical choice between being a person who didn't know about any of this and had no way of finding out, versus what I have now, I choose the former. Though since that is not an available choice, it is a somewhat academic question.

Comment author: XiXiDu 10 December 2010 06:48:51PM *  4 points [-]

But if you express it as a hypothetical choice between being a person who didn't know about any of this and had no way of finding out, versus what I have now, I choose the former.

I can't believe I'm hearing this from a person who wrote about Ugh fields. I can't believe I'm reading a plea for ignorance on a blog devoted to refining rationality. Is "ignorance is bliss" the new motto now?

Comment author: Vladimir_Nesov 10 December 2010 06:07:35PM 0 points [-]

Not helping. I was referring to the moral value of donations as an argument for choosing to know, as opposed to not knowing. You don't seem to address that in your reply (did I miss something?).

Comment author: timtyler 10 December 2010 06:19:41PM 0 points [-]

The only possible reason I can see for why one wouldn't want to spread it is that its negative potential does outweigh its very-very-low-probability (and that only if you accept a long chain of previous beliefs).

I gather doing so would irritate our site's host and moderator.

Comment author: Vladimir_Nesov 09 December 2010 10:43:54PM *  2 points [-]

E.g both me and Nesov have been persuaded (once fully filled in) that this is really nasty stuff and shouldn't be let out.

I wasn't "filled in", and I don't know whether my argument coincides with Eliezer's. I also don't understand why he won't explain his argument, if it's the same as mine, now that the content is in the open (but it's consistent with, that is, responds to the same reasons as, continuing to remove comments pertaining to the topic of the post, which makes it less of a mystery).

Comment author: FormallyknownasRoko 09 December 2010 10:47:42PM 2 points [-]

But you think that it is not a good thing for this to propagate more?

Comment author: Vladimir_Nesov 09 December 2010 10:51:38PM *  3 points [-]

As a decision on expected utility under logical uncertainty, but extremely low confidence, yes. I can argue that it most certainly won't be a bad thing (which I even attempted in comments to the post itself, my bad); the expectation of it being a bad thing derives from the remaining possibility of those arguments failing. As Carl said, "that estimate is unstable in the face of new info" (which refers to his own argument, not necessarily mine).

Comment deleted 10 December 2010 02:39:53PM *  [-]
Comment author: Vladimir_Nesov 10 December 2010 02:47:18PM *  0 points [-]

This analogy makes sense if you assume the conclusion that the argument for the post being a Basilisk is incorrect, but not as an argument for convincing people that it's incorrect. To evaluate whether the argument is correct, you have to study the argument itself; there is no royal road (the conclusion can be studied in other ways, since particular proof can't be demanded).

(See this summary of the structure of the argument.)