FormallyknownasRoko comments on Best career models for doing research? - Less Wrong

27 Post author: Kaj_Sotala 07 December 2010 04:25PM


Comment author: FormallyknownasRoko 09 December 2010 10:37:59PM 4 points [-]

There is still a moral sense in which if, after careful thought, I decided that that material should not have been posted, then any posts which resulted solely from my post are in a sense a violation of my desire to not have posted it. Especially if said posts operate under the illusion that my original post was censored rather than retracted.

But in reality such ideas tend to propagate like the imp of the perverse: a gnawing desire to know what the "censored" material is, even if everyone who knows what it is has subsequently decided that they wished they didn't! E.g. both Nesov and I have been persuaded (once fully filled in) that this is really nasty stuff and shouldn't be let out. (Correct me if I am wrong.)

This "imp of the perverse" property is actually part of the reason why the original post is harmful. In a sense, this is an idea-virus which makes people who don't yet have it want to have it, but as soon as they have been exposed to it, they (belatedly) realize they really didn't want to know about it or spread it.

Sigh.

Comment author: XiXiDu 10 December 2010 01:46:35PM *  9 points [-]

The only people who seem to be filled in are you and Yudkowsky. I think Nesov just argues against it based on some very weak belief. As far as I can tell, I got all the material in question. The only possible reason I can see for not wanting to spread it is that its negative potential outweighs its very, very low probability (and that only if you accept a long chain of prior beliefs). It doesn't. It also isn't the genuine and brilliant idea that all this mystery-mongering makes it seem to be. Everyone I sent it to just laughed at it. But maybe you can fill me in?

Comment author: WrongBot 10 December 2010 04:39:43PM 5 points [-]

If the idea is dangerous in the first place (which is very unlikely), it is only dangerous to people who understand it, because understanding it makes you vulnerable. The better you understand it and the more you think about it, the more vulnerable you become. In hindsight, I would prefer to never have read about the idea in question.

I don't think this is a big issue, considering the tiny probability that the scenario will ever occur, but I am glad that discussing it continues to be discouraged and would appreciate it if people stopped needlessly resurrecting it over and over again.

Comment author: Vaniver 10 December 2010 05:14:16PM 7 points [-]

If the idea is dangerous in the first place (which is very unlikely), it is only dangerous to people who understand it, because understanding it makes you vulnerable.

This strikes me as tautological and/or confusing definitions. I'm happy to agree that the idea is dangerous to people who think it is dangerous, but I don't think it's dangerous and I think I understand it. To make an analogy, I understand the concept of hell but don't think it's dangerous, and so the concept of hell does not bother me. Does the fact that I do not have the born-again Christian's fear of hell mean that they understand it and I don't? I don't see why it should.

Comment author: WrongBot 10 December 2010 05:37:05PM 5 points [-]

I can't figure out a way to explain this further without repropagating the idea, which I will not do. It is likely that there are one or more pieces of the idea which you are not familiar with or do not understand, and I envy your epistemological position.

Comment author: Aharon 11 December 2010 07:11:54PM *  1 point [-]

Yes, but the concept of hell is easier to understand. From what I have read in the discussions, I have no idea how the Basilisk is supposed to work, while it's quite easy to understand how hell is supposed to work.

Comment author: XiXiDu 10 December 2010 06:31:17PM 1 point [-]

I would prefer to never have read about the idea in question.

If you people are this worried about reality, why don't you work to support creating a Paperclip maximizer? It would have a lot of fun doing what it wants to do and everyone else would quickly die. Nobody ever after would have to fear what could possibly happen to them at some point.

If you people want to try to turn the universe into a better place, at whatever cost, then why do you worry or wish not to know about potential obstacles? Both are irrational.

The forbidden topic seems to be a dangerous Ugh field for a lot of people here. You have to decide what you want and then follow through on it. Any self-inflicted pain just adds to the overall negative.

Comment author: WrongBot 10 December 2010 06:53:02PM 5 points [-]

You do not understand what you are talking about.

The basilisk idea has no positive value. All it does is cause those who understand it to bear a very low probability of suffering incredible disutility at some point in the future. Explaining this idea to someone does them about as much good as slashing their tires.

Comment author: XiXiDu 10 December 2010 07:43:25PM 4 points [-]

The basilisk idea has no positive value. All it does is cause those who understand it to bear a very low probability of suffering incredible disutility at some point in the future.

I understand that, but I do not see that the description applies to the idea in question, insofar as it is in my opinion no more probable than fiction, and any likelihood it has is outweighed by opposing ideas. There are, however, other well-founded ideas, free speech and transparency, that are being ignored. I also believe that people would benefit from talking about it and could possibly overcome and subsequently ignore it.

But I'm tired of discussing this topic and will do you the favor of shutting up about it. But remember that I wasn't the one who started this thread. It was Roko and whoever asked to delete Roko's comment.

Comment author: FormallyknownasRoko 10 December 2010 04:58:32PM 0 points [-]

Upvoted, agree strongly.

Comment author: FormallyknownasRoko 10 December 2010 05:06:28PM *  4 points [-]

Look, you have three people all of whom think it is a bad idea to spread this. All are smart. Two initially thought it was OK to spread it.

Furthermore, I would add that I wish I had never learned about any of these ideas. In fact, I wish I had never come across the initial link on the internet that caused me to think about transhumanism and thereby about the singularity; I wish very strongly that my mind had never come across the tools to inflict such large amounts of potential self-harm with such small durations of inattention, incaution and/or stupidity, even if it is all premultiplied by a small probability. (Not a very small one, mind you. More like 1/500-type numbers here.)

If this is not enough warning to make you stop wanting to know more, then you deserve what you get.

Comment author: Desrtopa 10 December 2010 05:20:22PM 4 points [-]

Considering the extraordinary appeal that forbidden knowledge has even for the average person, let alone the exceptionally intellectually curious, I don't think this is a very effective way to warn a person off of seeking out the idea in question. Far from deserving what they get, such a person is behaving in a completely ordinary manner, to exceptionally severe consequence.

Personally, I don't want to know about the idea (at least not if it's impossible without causing myself significant psychological distress to no benefit,) but I've also put significant effort into training myself out of responses such as automatically clicking links to shock sites that say "Don't click this link!"

Comment author: Vaniver 10 December 2010 05:23:22PM 8 points [-]

Look, you have three people all of whom think it is a bad idea to spread this. All are smart. Two initially thought it was OK to spread it.

I see a lot more than three people here, most of whom are smart, and most of them think that Langford basilisks are fictional, and even if they aren't, censoring them is the wrong thing to do. You can't quarantine the internet, and so putting up warning signs makes more people fall into the pit.

Comment author: katydee 10 December 2010 06:09:43PM *  3 points [-]

I saw the original idea and the discussion around it, but I was (fortunately) under stress at the time and initially dismissed it as so implausible as to be unworthy of serious consideration. Given the reactions to it by Eliezer, Alicorn, and Roko, who seem very intelligent and know more about this topic than I do, I'm not so sure. I do know enough to say that, if the idea is something that should be taken seriously, it's really serious. I can tell you that I am quite happy that the original posts are no longer present, because if they were I am moderately confident that I would want to go back and see if I could make more sense out of the matter, and if Eliezer, Alicorn, and Roko are right about this, making sense out of the matter would be seriously detrimental to my health.

Thankfully, either it's a threat but I don't understand it fully, in which case I'm safe, or it's not a threat, in which case I'm also safe. But I am sufficiently concerned about the possibility that it's a threat that I don't understand fully but might be able to realize independently given enough thought that I'm consciously avoiding extended thought about this matter. I will respond to posts that directly relate to this one but am otherwise done with this topic-- rest assured that, if you missed this one, you're really quite all right for it!

Comment author: Vaniver 10 December 2010 06:21:41PM 5 points [-]

Given the reactions to it by Eliezer, Alicorn, and Roko, who seem very intelligent and know more about this topic than I do, I'm not so sure.

This line of argument really bothers me. What does it mean for E, A, and R to seem very intelligent? As far as I can tell, the necessary conclusion is "I will believe a controversial statement of theirs without considering it." When you word it like that, the standards are a lot higher than "seem very intelligent", or at least narrower- you need to know their track record on decisions like this.

(The controversial statement is "you don't want to know about X," not X itself, by the way.)

Comment author: katydee 10 December 2010 06:27:56PM 9 points [-]

I am willing to accept the idea that (intelligent) specialists in a field may know more about their field than nonspecialists and are therefore more qualified to evaluate matters related to their field than I.

Comment author: Vaniver 10 December 2010 06:37:09PM 5 points [-]

Good point, though I would point out that you need E, A, and R to be specialists when it comes to how people react to X, not just X, and I would say there's evidence that's not true.

Comment author: katydee 10 December 2010 06:44:08PM *  1 point [-]

I agree, but I know what conclusion I would draw from the belief in question if I actually believed it, so the issue of their knowledge of how people react is largely immaterial to me in particular. I was mostly posting to provide a data point in favor of keeping the material off LW, not to attempt to dissolve the issue completely or anything.

Comment author: Vladimir_Nesov 10 December 2010 06:27:07PM 0 points [-]

When you word it like that, the standards are a lot higher than "seem very intelligent", or at least narrower- you need to know their track record on decisions like this.

You don't need any specific kind of proof, you already have some state of knowledge about correctness of such statements. There is no "standard of evidence" for forming a state of knowledge, it just may be that without the evidence that meets that "standard" you don't expect to reach some level of certainty, or some level of stability of your state of knowledge (i.e. low expectation of changing your mind).

Comment author: FormallyknownasRoko 10 December 2010 05:34:09PM *  0 points [-]

Whatever man, go ahead and make your excuses, you have been warned.

Comment author: Vaniver 10 December 2010 05:41:37PM 8 points [-]

I have not only been warned, but I have stared the basilisk in the eyes, and I'm still here typing about it. In fact, I have only cared enough to do so because it was banned, and I wanted the information on how dangerous it was to judge the wisdom of the censorship.

On a more general note, being terrified of very unlikely terrible events is a known human failure mode. Perhaps it would be more effective at improving human rationality to expose people to ideas like this with the sole purpose of overcoming that sort of terror?

Comment author: Jack 10 December 2010 06:31:19PM *  5 points [-]

I'll just second that I also read it a while back (though after it was censored) and thought that it was quite interesting but wrong on multiple levels. Not 'probably wrong' but wrong like an invalid logic proof is wrong (though of course I am not 100% certain of anything). My main concern about the censorship is that not talking about what was wrong with the argument will allow the proliferation of the reasoning errors that left people thinking the conclusion was plausible. There is a kind of self-fulfilling prophecy involved in not recognizing these errors which is particularly worrying.

Comment author: JGWeissman 11 December 2010 01:58:40AM 7 points [-]

Consider this invalid proof that 1 = 2:

1. Let x = y
2. x^2 = x*y
3. x^2 - y^2 = x*y - y^2
4. (x - y)*(x + y) = y*(x - y)
5. x + y = y
6. y + y = y (substitute using 1)
7. 2y = y
8. 2 = 1

You could refute this by pointing out that step (5) involved division by (x - y) = (y - y) = 0, and you can't divide by 0.
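The failure can also be checked mechanically; here is a minimal Python sketch (the value 3.0 is arbitrary, since the proof only assumes x = y) showing that the factor cancelled in step (5) is exactly zero:

```python
# With x = y, the factor (x - y) cancelled in step (5) is exactly zero,
# so the "division" that takes step (4) to step (5) is illegal.
x = y = 3.0  # any value works; the proof assumes x = y

lhs = (x - y) * (x + y)  # step (4), left side:  0.0
rhs = y * (x - y)        # step (4), right side: 0.0
assert lhs == rhs        # step (4) holds, but only because both sides are 0

try:
    lhs / (x - y)        # attempting the cancellation of step (5)
except ZeroDivisionError:
    print("step (5) divides by zero")
```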

But imagine if someone claimed that the proof is invalid because "you can't represent numbers with letters like 'x' and 'y'". You would think that they don't understand what is actually wrong with it, or why someone might mistakenly believe it. This is basically my reaction to everyone I have seen oppose the censorship because of some argument they present that the idea is wrong and no one would believe it.

Comment author: Jack 11 December 2010 03:11:11AM *  2 points [-]

I'm actually not sure if I understand your point. Either it is a round-about way of making it or I'm totally dense and the idea really is dangerous (or some third option).

It's not that the idea is wrong and no one would believe it, it's that the idea is wrong and when presented with the explanation for why it's wrong no one should believe it. In addition, it's kind of important that people understand why it's wrong. I'm sympathetic to people with different minds that might have adverse reactions to things I don't, but the solution to that is to warn them off, not censor the topics entirely.

Comment author: JGWeissman 11 December 2010 03:26:39AM 1 point [-]

Yes, the idea really is dangerous.

it's that the idea is wrong and when presented with the explanation for why it's wrong no one should believe it.

And for those who understand the idea, but not why it is wrong, nor the explanation of why it is wrong?

the solution to that is to warn them off, not censor the topics entirely.

This is a politically reinforced heuristic that does not work for this problem.

Comment deleted 11 December 2010 06:04:46AM [-]
Comment author: Vladimir_Nesov 10 December 2010 05:53:19PM 2 points [-]

I have not only been warned, but I have stared the basilisk in the eyes, and I'm still here typing about it.

This isn't evidence about that hypothesis, it's expected that most certainly nothing happens. Yet you write for rhetorical purposes as if it's supposed to be evidence against the hypothesis. This constitutes either lying or confusion (I expect it's unintentional lying, with phrases produced without conscious reflection about their meaning, so a little of both lying and confusion).

Comment author: Jack 10 December 2010 06:05:56PM 5 points [-]

The sentence of Vaniver's you quote seems like a straightforward case of responding to hyperbole with hyperbole in kind.

Comment author: Vladimir_Nesov 10 December 2010 06:11:23PM 1 point [-]

It may not be as ill-intentioned, but it is still as wrong and deceptive.

Comment author: shokwave 10 December 2010 06:10:49PM 2 points [-]

I have not only been warned, but I have stared the basilisk in the eyes, and I'm still here typing about it.

The point we are trying to make is that we think the people who stared the basilisk in the eyes and metaphorically turned to stone are stronger evidence.

Comment author: Vaniver 10 December 2010 06:32:13PM 8 points [-]

The point we are trying to make is that we think the people who stared the basilisk in the eyes and metaphorically turned to stone are stronger evidence.

I get that. But I think it's important to consider both positive and negative evidence- if someone's testimony that they got turned to stone is important, so are the testimonies of people who didn't get turned to stone.

The question to me is whether the basilisk turns people to stone or people turn themselves into stone. I prefer the second because it requires no magic powers on the part of the basilisk. It might be that some people turn to stone when they see goatse for the first time, but that tells you more about humans and how they respond to shock than about goatse.

Indeed, that makes it somewhat useful to know what sort of things shock other people. Calling this idea 'dangerous' instead of 'dangerous to EY" strikes me as mind projection.

Comment author: shokwave 10 December 2010 07:14:37PM 1 point [-]

But I think it's important to consider both positive and negative evidence- if someone's testimony that they got turned to stone is important, so are the testimonies of people who didn't get turned to stone.

I am considering both.

It might be that some people turn to stone when they see goatse for the first time, but that tells you more about humans and how they respond to shock than about goatse.

I generally find myself in support of people who advocate a policy of keeping people from seeing Goatse.

Comment author: Vaniver 11 December 2010 12:35:39AM 3 points [-]

I generally find myself in support of people who advocate a policy of keeping people from seeing Goatse.

I'm not sure how to evaluate this statement. What do you mean by "keeping people from seeing Goatse"? Banning? Voluntarily choosing not to spread it? A filter like the one proposed in Australia that checks every request to the outside world?

Comment author: Vladimir_Nesov 10 December 2010 06:14:13PM 1 point [-]

I don't understand this. (Play on conservation of expected evidence? In what way?)

Comment author: shokwave 10 December 2010 06:30:56PM 4 points [-]

Normal updating.

  • Original prior for basilisk-danger.
  • Eliezer_Yudkowsky stares at basilisk, turns to stone (read: engages idea, decides to censor). Revise pr(basilisk-danger) upwards.
  • FormallyknownasRoko stares at basilisk, turns to stone (read: appears to truly wish he had never thought it). Revise pr(basilisk-danger) upwards.
  • Vladimir_Nesov stares at basilisk, turns to stone (read: engages idea, decides it is dangerous). Revise pr(basilisk-danger) upwards.
  • Vaniver stares at basilisk, is unharmed (read: engages idea, decides it is not dangerous). Revise pr(basilisk-danger) downwards.
  • Posterior is higher than original prior.

For the posterior to equal or lower than the prior, Vaniver would have to be more a rationalist than Eliezer, Roko, and you put together.
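The tally above can be sketched as odds-form Bayesian updating. The prior and the likelihood ratios below are invented purely to show the mechanics; nobody in the thread committed to actual numbers:

```python
def update(prior_odds, likelihood_ratios):
    """Multiply prior odds by the likelihood ratio of each observation."""
    odds = prior_odds
    for lr in likelihood_ratios:
        odds *= lr
    return odds

prior_odds = 0.01 / 0.99        # a skeptical prior: 1% probability
observations = [3.0, 3.0, 3.0,  # three "turned to stone" reports (LR > 1)
                1 / 3.0]        # one "unharmed" report (LR < 1)

posterior_odds = update(prior_odds, observations)
posterior_prob = posterior_odds / (1 + posterior_odds)

# Three equal-strength updates up and one down leave the posterior above
# the prior, which is shokwave's point: it would take three "unharmed"
# reports of equal weight to cancel the three "stoned" ones.
```

Jack's objection, that Roko's reaction was influenced by Eliezer's, amounts to saying these multipliers are not independent and so cannot simply be multiplied.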

Comment author: Jack 10 December 2010 07:16:46PM 7 points [-]

Okay, but more than four people have engaged with the idea. Should we take a poll?

The problem of course is that majorities often believe stupid things. That is why a free marketplace of ideas free from censorship is a really good thing! The obvious thing to do is exchange information until agreement but we can't do that, at least not here.

Also, the people who think it should be censored all seem to disagree about how dangerous the idea really is, suggesting it isn't clear how it is dangerous. It also seems plausible that some people have influenced the thinking of other people- for example it looks like Roko regretted posting after talking to Eliezer. While Roko's regret is evidence that Eliezer is right, it isn't the same as independent/blind confirmation that the idea is dangerous.

Comment author: Vaniver 10 December 2010 07:00:45PM 4 points [-]

For the posterior to equal or lower than the prior, Vaniver would have to be more a rationalist than Eliezer, Roko, and you put together.

How many of me would there have to be for that to work?

Also, why is rationalism the risk factor for this basilisk? Maybe the basilisk only turns to stone people with brown eyes (or the appropriate mental analog).

Comment deleted 10 December 2010 06:54:55PM [-]
Comment author: Vladimir_Nesov 10 December 2010 06:58:25PM *  2 points [-]

Eliezer_Yudkowsky stares at basilisk, turns to stone (read: engages idea, decides to censor). Revise pr(basilisk-danger) upwards.

This equivocates the intended meaning of turning to stone in the original discussion you replied to. Fail. (But I understand what you meant now.)

Comment author: TheOtherDave 10 December 2010 05:46:01PM 1 point [-]

Perhaps it would be more effective at improving human rationality to expose people to ideas like this with the sole purpose of overcoming that sort of terror?

You would need a mechanism for actually encouraging them to "overcome" the terror, rather than reinforce it. Otherwise you might find that your subjects are less rational after this process than they were before.

Comment author: Vaniver 10 December 2010 06:09:40PM 0 points [-]

Right- and current methodologies when it comes to that sort of therapy are better done in person than over the internet.

Comment author: FormallyknownasRoko 10 December 2010 05:58:02PM 0 points [-]

being terrified of very unlikely terrible events is a known human failure mode

one wonders how something like that might have evolved, doesn't one? What happened to all the humans who came with the mutation that made them want to find out whether the sabre-toothed tiger was friendly?

Comment author: Kingreaper 10 December 2010 06:29:21PM *  7 points [-]

one wonders how something like that might have evolved, doesn't one? What happened to all the humans who came with the mutation that made them want to find out whether the sabre-toothed tiger was friendly?

I don't see how very unlikely events that people knew the probability of would have been part of the evolutionary environment at all.

In fact, I would posit that the bias is most likely due to having a very high floor for probability. In the evolutionary environment things with probability you knew to be <1% would be unlikely to ever be brought to your attention. So not having any good method for intuitively handling probabilities between 1% and zero would be expected.

In fact, I don't think I have an innate handle on probability to any finer grain than ~10% increments. Anything more than that seems to require mathematical thought.

Comment author: FormallyknownasRoko 10 December 2010 06:32:22PM 0 points [-]

Probably less than 1% of cave-men died by actively seeking out the sabre-toothed tiger to see if it was friendly. But I digress.

Comment author: Kingreaper 10 December 2010 06:34:48PM *  8 points [-]

But probably far more than 1% of cave-men who chose to seek out a sabre-tooth tiger to see if they were friendly died due to doing so.

The relevant question on an issue of personal safety isn't "What % of the population die due to trying this?"

The relevant question is: "What % of the people who try this will die?"

In the first case, rollerskating downhill, while on fire, after having taken arsenic would seem safe (as I suspect no one has ever done precisely that).
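The distinction here is between a joint probability and a conditional one; a toy population (all numbers invented) makes it concrete:

```python
# Invented numbers: out of 100,000 cave-men, 50 seek out the tiger
# and 40 of those die doing so.
population   = 100_000
sought_tiger = 50
died_seeking = 40

p_die_and_seek   = died_seeking / population     # joint: 0.0004, looks "safe"
p_die_given_seek = died_seeking / sought_tiger   # conditional: 0.8, is not

# The population-wide death rate from tiger-seeking is under 0.1%, yet
# the activity kills 80% of those who try it; the second number is the
# one that matters for personal safety.
```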

Comment author: Vaniver 10 December 2010 06:08:32PM 7 points [-]

one wonders how something like that might have evolved, doesn't one?

No, really, one doesn't wonder. It's pretty obvious. But if we've gotten to the point where "this bias paid off in the evolutionary environment!" is actually used as an argument, then we are off the rails of refining human rationality.

Comment author: FormallyknownasRoko 10 December 2010 06:17:43PM *  2 points [-]

What's wrong with using "this bias paid off in the evolutionary environment!" as an argument? I think people who paid more attention to this might make fewer mistakes, especially in domains where there isn't a systematic, exploitable difference between EEA and now.

The evolutionary environment contained entities capable of dishing out severe punishments, uncertainty, etc.

If anything, I think that the heuristic that an idea "obviously" can't be dangerous is the problem, not the heuristic that one should take care around possibilities of strong penalties.

Comment author: timtyler 10 December 2010 06:25:01PM *  4 points [-]

It is a fine argument for explaining the widespread occurrence of fear. However, today humans are in an environment where their primitive paranoia is frequently triggered by inappropriate stimuli.

Dan Gardner goes into this in some detail in his book: Risk: The Science and Politics of Fear

Video of Dan discussing the topic: Author Daniel Gardner says Americans are the healthiest and safest humans in the world, but are irrationally plagued by fear. He talks with Maggie Rodriguez about his book 'The Science Of Fear.'

Comment author: Emile 10 December 2010 07:08:18PM *  3 points [-]

Someone's been reading Terry Pratchett.

He always held that panic was the best means of survival. Back in the old days, his theory went, people faced with hungry sabre-toothed tigers could be divided into those who panicked and those who stood there saying, "What a magnificent brute!" or "Here pussy".

Comment author: XiXiDu 10 December 2010 05:59:50PM 14 points [-]

I wish I had never come across the initial link on the internet that caused me to think about transhumanism and thereby about the singularity;

I wish you'd talk to someone other than Yudkowsky about this. You don't need anyone to harm you, you already seem to harm yourself. You indulge yourself in self-inflicted psychological stress. As Seneca said, "there are more things that terrify us than there are that oppress us, and we suffer more often in opinion than in reality". You worry and pay interest for debt that will likely never be made.

Look, you have three people all of whom think it is a bad idea to spread this. All are smart.

I read about quite a few smart people who hold idiot beliefs, I only consider this to be marginal evidence.

Furthermore, I would add that I wish I had never learned about any of these ideas.

You'd rather be some ignorant pleasure maximizing device? For me truth is the most cherished good.

If this is not enough warning to make you stop wanting to know more, then you deserve what you get.

BS.

Comment author: FormallyknownasRoko 10 December 2010 06:04:07PM 5 points [-]

For me truth is the most cherished good.

More so than not opening yourself up to a small risk of severe consequences? E.g. if you found a diary that clearly belonged to some organized crime boss, would you open it up and read it? I see this situation as analogous.

Comment deleted 10 December 2010 06:44:15PM [-]
Comment author: FormallyknownasRoko 10 December 2010 07:23:36PM *  8 points [-]

Well I guess this is our true point of disagreement. I went to the effort of finding out a lot, went to SIAI and Oxford to learn even more, and in the end I am left seriously disappointed by all this knowledge. In the end it all boils down to:

"most people are irrational, hypocritical and selfish, if you try and tell them they shoot the messenger, and if you try and do anything you bear all the costs, internalize only tiny fractions of the value created if you succeed, and you almost certainly fail to have an effect anyway. And by the way the future is an impending train wreck"

I feel quite strongly that this knowledge is not a worthy thing to have sunk 5 years of my life into getting. I don't know, XiXiDu, you might prize such knowledge, including all the specifics of how that works out exactly.

If you really strongly value the specifics of this, then yes you probably would on net benefit from the censored knowledge, the knowledge that was never censored because I never posted it, and the knowledge that I never posted because I was never trusted with it anyway. But you still probably won't get it, because those who hold it correctly infer that the expected value of releasing it is strongly negative from an altruist's perspective.

Comment author: WrongBot 10 December 2010 08:30:22PM 29 points [-]

The future is probably an impending train wreck. But if we can save the train, then it'll grow wings and fly up into space while lightning flashes in the background and Dragonforce play a song about fiery battlefields or something. We're all stuck on the train anyway, so saving it is worth a shot.

I hate to see smart people who give a shit losing to despair. This is still the most important problem and you can still contribute to fixing it.

TL;DR: I want to give you a hug.

Comment author: Eliezer_Yudkowsky 10 December 2010 10:17:58PM 11 points [-]

most people are irrational, hypocritical and selfish, if you try and tell them they shoot the messenger, and if you try and do anything you bear all the costs, internalize only tiny fractions of the value created if you succeed,

So? They're just kids!

(or)

He glanced over toward his shoulder, and said, "That matter to you?"

Caw!

He looked back up and said, "Me neither."

Comment author: FormallyknownasRoko 10 December 2010 10:25:41PM 3 points [-]

I mean, I guess I shouldn't complain that this doesn't bother you, because you are, in fact, helping me by doing what you do and being very good at it, but that doesn't stop it being demotivating for me! I'll see what I can do regarding quant jobs.

Comment author: timtyler 10 December 2010 07:30:50PM *  3 points [-]

That doesn't sound right to me. Indeed, it sounds as though you are depressed :-(

Unsolicited advice over the public internet is rather unlikely to help - but maybe focus for a bit on what you want - and the specifics of how to get there.

Comment author: Jack 10 December 2010 08:12:49PM 2 points [-]

Upvoted for the excellent summary!

"most people are irrational, hypocritical and selfish, if you try and tell them they shoot the messenger, and if you try and do anything you bear all the costs, internalize only tiny fractions of the value created if you succeed, and you almost certainly fail to have an effect anyway. And by the way the future is an impending train wreck"

Comment author: katydee 10 December 2010 08:14:28PM 3 points [-]

I'm curious about the "future is an impending train wreck" part. That doesn't seem particularly accurate to me.

Comment author: FormallyknownasRoko 10 December 2010 08:18:52PM *  1 point [-]

Maybe it will all be OK. Maybe the trains fly past each other on separate tracks. We don't know. There sure as hell isn't a driver, though. All the inside-view evidence points to bad things, with the exception that Big Worlds could turn out nicely. Or horribly.

Comment author: katydee 10 December 2010 07:28:35PM 2 points [-]

This isn't meant as an insult, but why did it take you 5 years of dedicated effort to learn that?

Comment author: FormallyknownasRoko 10 December 2010 07:32:49PM *  3 points [-]

Specifics. Details. The lesson of science is that details can sometimes change the overall conclusion. Also some amount of nerdyness meaning that the statements about human nature weren't obvious to me.

Comment deleted 10 December 2010 08:03:24PM *  [-]
Comment author: steven0461 10 December 2010 09:50:05PM 6 points [-]

I would choose that knowledge if there was the chance that it wouldn't find out about it. As far as I understand your knowledge of the dangerous truth, it just increases the likelihood of suffering, it doesn't make it guaranteed.

I don't understand your reasoning here -- bad events don't get a "flawless victory" badness bonus for being guaranteed. A 100% chance of something bad isn't much worse than a 90% chance.

Comment author: XiXiDu 11 December 2010 09:07:07AM 0 points [-]

I said that I wouldn't want to know it if a bad outcome was guaranteed. But if it would make a bad outcome possible, but very-very-unlikely to actually occur, then the utility I assign to knowing the truth would outweigh the very unlikely possibility of something bad happening.

Comment author: FormallyknownasRoko 10 December 2010 10:34:52PM 0 points [-]

No, dude, you're wrong

Comment author: FormallyknownasRoko 10 December 2010 08:15:33PM *  4 points [-]

The compelling argument for me is that knowing about bad things is useful to the extent that you can do something about them, and it turns out that people who don't know anything (call them "non-cognoscenti") will probably free-ride their way to any benefits of action on the collective-action problem that is at issue here, whilst avoiding drawing any particular attention to themselves ==> avoiding the risks.

Vladimir Nesov doubts this prima facie, i.e. he asks "how do you know that the strategy of being a completely inert player is best?".

-- to which I answer, "if you want to be the first monkey shot into space, then good luck" ;D

Comment author: timtyler 10 December 2010 09:05:40PM *  0 points [-]

it turns out that people who don't know anything (call them "non-cognoscenti") will probably free-ride their way to any benefits of action on the collective-action problem that is at issue here, whilst avoiding drawing any particular attention to themselves ==> avoiding the risks.

This is the "collective-action-problem" - where the end of the world arrives - unless a select band of heroic messiahs arrive and transport everyone to heaven...?

That seems like a fantasy story designed to manipulate - I would counsel not getting sucked in.

Comment author: katydee 10 December 2010 06:47:50PM *  1 point [-]

Ah, you remind me of me from a while back. When I was an elementary schooler, I once replied to someone asking "would you rather be happy or right" with "how can I be happy if I can't be right?" But these days I've moderated somewhat, and I feel that there is indeed knowledge that can be harmful.

Comment author: Manfred 10 December 2010 07:39:06PM 3 points [-]

Really thought you were going to go with Tom Riddle on this one. Perfect line break for it :)

Comment author: timtyler 10 December 2010 06:53:39PM *  1 point [-]

I would add that I wish I had never learned about any of these ideas. In fact, I wish I had never come across the initial link on the internet that caused me to think about transhumanism and thereby about the singularity;

Hmm. It is tricky to go back, I would imagine.

The material does come with some warnings, I believe. For instance, consider this one:

"Beware lest Friendliness eat your soul." - Eliezer Yudkowsky

Comment author: Vladimir_Nesov 10 December 2010 05:43:29PM 1 point [-]

In fact, I wish I had never come across the initial link on the internet that caused me to think about transhumanism and thereby about the singularity

As I understand, you donate (and plan to in the future) to existential risk charities, and that is one of the consequences of you having come across that link. How does this compute into net negative, in your estimation, or are you answering a different question?

Comment author: FormallyknownasRoko 10 December 2010 06:01:07PM -1 points [-]

Sure I want to donate. But if you express it as a hypothetical choice between being a person who didn't know about any of this and had no way of finding out, versus what I have now, I choose the former. Though since that is not an available choice, it is a somewhat academic question.

Comment author: XiXiDu 10 December 2010 06:48:51PM *  4 points [-]

But if you express it as a hypothetical choice between being a person who didn't know about any of this and had no way of finding out, versus what I have now, I choose the former.

I can't believe I'm hearing this from a person who wrote about Ugh fields. I can't believe I'm reading a plea for ignorance on a blog devoted to refining rationality. Is "ignorance is bliss" the new motto now?

Comment author: FormallyknownasRoko 10 December 2010 07:00:19PM 1 point [-]

Well look, one has to do cost/benefit calculations, not just blindly surge forward in some kind of post-enlightenment fervor. To me, it seems like there is only one positive term in the equation: the altruistic value of giving money to some existential risk charity.

All the other terms are negative, at least for me. And unless I actually overcome excuses, akrasia, etc to donate a lot, I think it'll all have been a mutually detrimental waste of time.

Comment author: Vladimir_Nesov 10 December 2010 06:53:42PM *  0 points [-]

There is only one final criterion, the human decision problem. It trumps any other rule, however good or useful.

(You appeal to particular heuristics, using the feeling of indignation as a rhetoric weapon.)

Comment author: Vladimir_Nesov 10 December 2010 06:07:35PM 0 points [-]

Not helping. I was referring to the moral value of donations as an argument for choosing to know, as opposed to not knowing. You don't seem to address that in your reply (did I miss something?).

Comment author: FormallyknownasRoko 10 December 2010 06:14:51PM 1 point [-]

Oh, I see. Well, I guess it depends upon how much I eventually donate and how much of an incremental difference that makes.

It would certainly be better to just donate, AND to also not know anything about anything dangerous. I'm not even sure that's possible, though. For all we know, just knowing about any of this is enough to land you in a lot of trouble either in the causal future or elsewhere.

Comment author: timtyler 10 December 2010 06:19:41PM 0 points [-]

The only possible reason I can see for why one wouldn't want to spread it is that its negative potential does outweigh its very-very-low-probability (and that only if you accept a long chain of previous beliefs).

I gather doing so would irritate our site's host and moderator.

Comment author: Vladimir_Nesov 09 December 2010 10:43:54PM *  2 points [-]

E.g both me and Nesov have been persuaded (once fully filled in) that this is really nasty stuff and shouldn't be let out.

I wasn't "filled in", and I don't know whether my argument coincides with Eliezer's. I also don't understand why he won't explain his argument, if it's the same as mine, now that the content is in the open (but it's consistent with, that is, responds to the same reasons as, continuing to remove comments pertaining to the topic of the post, which makes it less of a mystery).

Comment author: FormallyknownasRoko 09 December 2010 10:47:42PM 2 points [-]

But you think that it is not a good thing for this to propagate more?

Comment author: Vladimir_Nesov 09 December 2010 10:51:38PM *  3 points [-]

As a decision on expected utility under logical uncertainty, but with extremely low confidence, yes. I can argue that it most certainly won't be a bad thing (which I even attempted in comments to the post itself, my bad); the expectation of it being a bad thing derives from the remaining possibility of those arguments failing. As Carl said, "that estimate is unstable in the face of new info" (which refers to his own argument, not necessarily mine).

Comment deleted 10 December 2010 02:39:53PM *  [-]
Comment author: Vladimir_Nesov 10 December 2010 02:47:18PM *  0 points [-]

This analogy makes sense if you assume the conclusion that the argument for the post being a Basilisk is incorrect, but not as an argument for convincing people that it's incorrect. To evaluate whether the argument is correct, you have to study the argument itself, there is no royal road (the conclusion can be studied in other ways, since particular proof can't be demanded).

(See this summary of the structure of the argument.)