jimrandomh comments on Attention Lurkers: Please say hi - Less Wrong

35 Post author: Kevin 16 April 2010 08:46PM


Comments (617)


Comment author: jimrandomh 31 August 2010 10:54:24PM 4 points [-]

I don't think there was ever any good evidence that the thought was dangerous. ... In which case, the thought seems to be more forbidden than dangerous.

If there were any such evidence, it would be in the form of additional details, and sharing it with someone would be worse than punching them in the face. So don't take the lack of publicly disclosed evidence as an indication that no evidence exists, because it isn't.

Comment author: wedrifid 02 September 2010 09:05:19PM 4 points [-]

So don't take the lack of publicly disclosed evidence as an indication that no evidence exists, because it isn't.

It actually is, in the sense we use the term here.

Comment author: SilasBarta 02 September 2010 09:57:16PM 2 points [-]

Exactly. One must be careful to distinguish between "this is not evidence" and "accounting for this evidence should not leave you with a high posterior".

Comment author: timtyler 31 August 2010 11:20:11PM *  1 point [-]

I think we already had most of the details, many of them in BOLD CAPS for good measure.

But there is the issue of probabilities - of how much it is likely to matter. FWIW, I do not fear thinking the forbidden thought. Indeed, it seems reasonable to expect that people will think similar thoughts more in the future - and that those thoughts will motivate people to act.

Comment author: jimrandomh 01 September 2010 12:10:53AM 1 point [-]

I think we already had most of the details, many of them in BOLD CAPS for good measure.

No, you haven't. The worst of it has never appeared in public, deleted or otherwise.

Comment author: timtyler 01 September 2010 12:14:40PM 1 point [-]

Fine. The thought is evidently forbidden, but merely alleged dangerous.

I see no good reason to call it "dangerous" - in the absence of publicly verifiable evidence on the issue - unless the aim is to scare people without the inconvenience of having to back up the story with evidence.

Comment author: EStokes 01 September 2010 02:37:09PM 0 points [-]

If one backed it up with how exactly it was dangerous, people would be exposed to the danger.

Comment author: timtyler 01 September 2010 02:45:41PM *  5 points [-]

The hypothetical danger. The alleged danger. Note that it was alleged dangerous by someone whose living apparently depends on scaring people about machine intelligence. So: now we have the danger-that-is-too-awful-to-even-think about. And where is the evidence that it is actually dangerous? Oh yes: that was all deleted - to save people from the danger!

Faced with this, it is pretty hard not to be sceptical.

Comment author: EStokes 01 September 2010 02:57:53PM *  3 points [-]

I don't donate to SIAI on a regular basis, but I haven't donated because of being scared of UFAI. I think more about aging and death. So, I'm assuming that UFAI is not why most people donate. Also, this incident seems like a net loss for PR, so its being a strategy for more donations doesn't really make sense. As for the evidence, what you'd expect to see in a universe where it was dangerous is that it would be deleted.

(Going somewhere, will be back in a couple of hours)

Comment author: homunq 01 September 2010 03:28:29PM *  5 points [-]

I have little doubt that some smart people honestly believe that it's dangerous. The deletions are sufficient evidence of that belief for me. The belief, however, is not sufficient evidence for me of the actual danger, given that I see such danger as implausible on the face of it.

In other words, sure, it gets deleted in the world where it's dangerous, as in the world where people falsely believe it is. Any good Bayesian should consider both possibilities. I happen to think that the latter is more probable.
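The point can be made concrete with a toy Bayes update. The priors and likelihoods below are invented purely for illustration; the function name is mine, not anything from the thread:

```python
# Toy Bayes update illustrating the point above: if deletion is about equally
# likely whether the idea is truly dangerous or merely believed dangerous,
# observing the deletion barely shifts the odds between those two hypotheses.
# All numbers are hypothetical placeholders.

def posterior(prior_dangerous, p_delete_if_dangerous, p_delete_if_false_belief):
    """P(dangerous | deleted), comparing only the two hypotheses above."""
    prior_false_belief = 1 - prior_dangerous
    joint_dangerous = prior_dangerous * p_delete_if_dangerous
    joint_false = prior_false_belief * p_delete_if_false_belief
    return joint_dangerous / (joint_dangerous + joint_false)

# Equal likelihoods: the deletion is evidence of *belief*, not of danger,
# so the posterior stays at (approximately) the prior of 0.1.
print(posterior(0.1, 0.9, 0.9))
```

Only if deletion were much more likely under genuine danger than under false belief would the deletion itself favor the danger hypothesis.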

However, of course I grant that there is some possibility that I'm wrong, so I assign some weight to this alleged danger. The important point is that that is not enough, because the value of free expression and debate weighs on the other side.

Even if I grant "full" weight to the alleged danger, I'm not sure it beats free expression. There are a lot of dangerous ideas - for example, dispensationalist Christianity - and, while I'd probably be willing to suppress them if I had the power to do so cleanly, I think any real-world efforts of mine to do so would be a net negative because I'd harm free debate and lower my own credibility while failing to suppress the idea. Since the forbidden idea, insofar as I know what it is, seems far more likely than something like dispensationalism to occur to people independently, while the idea of suppressing it is less likely to do so, I think the argument against suppression is even stronger in this case.

Comment author: EStokes 01 September 2010 09:19:42PM 0 points [-]

Well, I figure that if people who have proven rational in the past see something as potentially dangerous, that's not proof, but it does lend the claim more weight. Basically, the idea that there is something dangerous there should be taken seriously.

Hmm, what I meant was that it being deleted isn't evidence of foul play, since it'd happen in both instances.

I don't see any arguments against except for surface implausibility?

Free expression doesn't trump everything. For example, in the Riddle Theory story, the spread of the riddle would be a bad idea. It might occur to people independently, but they might not take it seriously, and at least the spread will be lessened.

I'm not sure if it turned out for the better, deleting it, because people only wanted to know more after its deletion. But who knows.

Comment author: homunq 01 September 2010 09:53:14PM 4 points [-]

I have several reasons, not just surface implausibility, for believing what I do. There's little point in further discussion until the ground rules are cleared up.

Comment author: timtyler 02 September 2010 08:15:27AM *  0 points [-]

Riddle theory is fiction.

In real life, humans are not truth-proving machines. If confronted with their Gödel sentences, they will just shrug - and say "you expect me to do what?"

Fiction isn't evidence. If anything, it shows that there is so little real evidence of ideas harmful enough to deserve censorship that people have to make things up in order to prove their point.

Comment author: timtyler 01 September 2010 03:32:35PM 3 points [-]

Also, this incident seems like a net loss for PR, so it being a strategy for more donations doesn't really seem to make sense.

There are PR upsides: the shepherd protects his flock from the unspeakable danger; it makes for good drama and folklore; there's opportunity for further drama caused by leaks. Also, it shows everyone who's the boss.

A popular motto claims that there is no such thing as bad publicity.

Comment author: EStokes 01 September 2010 09:28:48PM *  1 point [-]

Firstly, if there's an unspeakable danger, surely it'd be best to try not to let others be exposed, so this is really a question of whether it's dangerous, not an argument in itself. It's only a PR stunt if it's not dangerous; if it's dangerous, good PR would merely be a side effect.

The drama was bad IMO. Looks like bad publicity to me.

I discredit the PR stunt idea because I don't think SIAI would've been dumb enough to pull something like this as a stunt. If we were being modeled as ones who'd simply go along with a lie - well, there's no way we'd be modeled as such fools. If we were modeled as ones who would look at a lie carefully, a PR stunt wouldn't work anyway.

There's also the fact that people who have read the post and are unaffiliated with the SIAI are taking it seriously. That says something, too.

Comment author: wnoise 01 September 2010 09:51:10PM 2 points [-]

There's also the fact that people who have read the post and are unaffiliated with the SIAI are taking it seriously. That says something, too.

Well, many are only taking it seriously under pain of censorship.

Comment author: jimrandomh 02 September 2010 03:38:26AM *  2 points [-]

Firstly, if there's an unspeakable danger, surely it'd be best to try not to let others be exposed, so this is really a question of whether it's dangerous

Not quite. It's a question of what the probability of danger is, what the magnitude of the effect is if it's real, what the costs (including goodwill and credibility) of suppressing it are, and what the costs (including psychological harm to third parties) of not suppressing it are. To make a proper judgement, you must determine all four of these separately and perform the expected utility computation (probability * effect-if-dangerous + effect-if-not-dangerous vs. cost). A sufficiently large magnitude of effect is sufficient to outweigh both a small probability and a large cost.
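The computation described here can be sketched as follows. The function name and every number are made up for illustration; they are not estimates of the actual case:

```python
# A minimal sketch of the expected-utility test described above.
# All values are hypothetical placeholders, not estimates of anything real.

def net_utility_of_discussion(p_dangerous, effect_if_dangerous,
                              effect_if_not_dangerous, cost_of_suppression):
    """Expected utility of allowing discussion minus that of suppressing it.

    effect_if_dangerous is a (large, negative) harm; effect_if_not_dangerous
    is the (modest, positive) value of open discussion. Suppression carries
    its own cost in goodwill and credibility.
    """
    eu_allow = (p_dangerous * effect_if_dangerous
                + (1 - p_dangerous) * effect_if_not_dangerous)
    eu_suppress = -cost_of_suppression
    return eu_allow - eu_suppress  # positive => allow, negative => suppress

# A small probability does not round to zero when the stakes are large enough:
print(net_utility_of_discussion(0.001, -1e9, 10, 100))  # hugely negative
print(net_utility_of_discussion(0.001, -1e3, 10, 100))  # positive
```

With a large enough effect-if-dangerous, the verdict flips to suppression even at a 0.1% probability; with modest stakes, the same probability leaves discussion clearly worthwhile.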

That's the problem here. Some people see a small probability, round it off to 0, and see that the effect-if-not-dangerous isn't huge, and conclude that it's ok to talk about it, without computing the expected utility.

I tell you that I have done the computation, and that the utilities of hearing, discussing, and allowing discussion of the banned topic are all negative. Furthermore, they are negative by enough orders of magnitude that I believe anyone who concludes otherwise must either be missing a piece of information vital to the computation or have made an error in their reasoning. They remain negative even if one of the probability or the effect-if-not-dangerous is set to zero. Both missing information and miscalculation are especially likely - the former because information is not readily shared on this topic, and the latter because it is inherently confusing.

Comment author: timtyler 02 September 2010 07:42:59AM 1 point [-]

I discredit the PR stunt idea because I don't think SIAI would've been dumb enough to pull something like this as a stunt. If we were being modeled as ones who'd simply go along with a lie - well, there's no way we'd be modeled as such fools. If we were modeled as ones who would look at a lie carefully, a PR stunt wouldn't work anyway.

Well, it doesn't really matter what the people involved were thinking; the issue is whether all the associated drama eventually has a net positive or negative effect. It evidently drives some people away - but it may increase engagement and interest among those who remain. I can see how it contributes to the site's mythology and mystique - even if to me it looks more like a car crash that I can't help looking at.

It may not be over yet - we may see more drama around the forbidden topic in the future - with the possibility of leaks, and further transgressions. After all, if this is really such a terrible risk, shouldn't other people be aware of it - so they can avoid thinking about it for themselves?

Comment author: khafra 01 September 2010 04:35:58PM 3 points [-]

I really don't have a handle on the situation, but the censored material has allegedly caused serious and lasting psychological stress to at least one person, and could easily be interpreted as an attempt to get gullible people to donate more to SIAI. I don't see any way out for an administrator of human-level intelligence.

Comment author: timtyler 01 September 2010 07:07:48PM 0 points [-]

AFAICT, the stresses seem to be largely confined to those in the close orbit of the Singularity Institute. Eliezer once said: "Beware lest Friendliness eat your soul". So: perhaps the associated pathology could be christened Singularity Fever - or something.