linkhyrule5 comments on The genie knows, but doesn't care - Less Wrong

54 Post author: RobbBB 06 September 2013 06:42AM


Comment author: linkhyrule5 10 September 2013 01:06:36AM 7 points [-]

But at least Eliezer has done several impossible things in the last decade or so,

Name three? If only so I can cite them to Eliezer-is-a-crank people.

Comment author: shminux 10 September 2013 07:13:42AM *  0 points [-]

If only so I can cite them to Eliezer-is-a-crank people.

I advise against doing that. It is unlikely to change anyone's mind.

By impossible feats I mean feats that a regular person would not be able to reproduce except by chance: winning a lottery, starting Google, founding a successful religion, or becoming President.

He started out as a high-school dropout without any formal education, and look at what he has achieved so far, professionally and personally. Look at the organizations he founded and inspired. Look at the high-status experts in various fields (business, comp sci, programming, philosophy, math and physics) who take him seriously (some even give him loads of money). Heck, how many people manage to have multiple simultaneous long-term partners who are all highly intelligent and apparently get along well?

Comment author: Peterdjones 10 September 2013 10:19:48AM *  5 points [-]

He's achieved about what Ayn Rand achieved, and almost everyone thinks she was a crank.

Comment author: linkhyrule5 10 September 2013 04:52:25PM 3 points [-]

Basically this. As Eliezer himself points out, humans aren't terribly rational on average, and our judgements of each other's rationality aren't great either. Large amounts of support imply charisma, not intelligence.

TDT is closer to what I'm looking for, though it's a ... tad long.

Comment author: linkhyrule5 10 September 2013 04:54:17PM *  4 points [-]

I advise against doing that. It is unlikely to change anyone's mind.

Point, but there's also the middle ground "I'm not sure if he's a crank or not, but I'm busy so I won't look unless there's some evidence he's not."

The big two I've come up with are a) he actually changes his mind about important things (though I need to find an actual post I can cite - didn't he reopen the question of the possibility of a hard takeoff, or something?) and b) TDT.

Comment author: Gurkenglas 10 September 2013 03:13:34AM *  0 points [-]

Won some AI box experiments as the AI.

Comment author: linkhyrule5 10 September 2013 05:42:34AM 4 points [-]

Sure, but that's hard to prove: given "Eliezer is a crank," the probability of "Eliezer is lying about his AI-box prowess" is much higher than "Eliezer actually pulled that off."

The latest success by a non-Eliezer person helps, but I'd still like something I can literally cite.

Comment author: private_messaging 10 September 2013 10:35:05PM 1 point [-]

Eliezer is lying about his AI-box prowess

I don't see why anyone would think that. Plenty of people in the anti-vaccination crowd managed to convince parents to mortally endanger their children.

Comment author: linkhyrule5 10 September 2013 10:52:27PM 2 points [-]

Yes, but that's really not that hard. For starters, you can do a better job of picking your targets.

The AI-box experiment is often run with intelligent, rational people with money on the line and an obvious right answer; it's a whole lot more impossible than picking the right uneducated family to sell your snake oil to.

Comment author: private_messaging 10 September 2013 10:58:50PM *  0 points [-]

Ohh, come on. Cyclical reasoning here. You think Yudkowsky is not a crank, so you think the folks who play that silly game with him are intelligent and rational (by the way, plenty of people who get duped by anti-vaxxers are of above average IQ), and so you get more evidence that Yudkowsky is not a crank. Cyclical reasoning doesn't persuade anyone who isn't already a believer.

You need non-cyclical reasoning. Which would generally be something where you aren't the one having to explain to people that the achievement in question is profound.

Comment author: linkhyrule5 10 September 2013 11:04:30PM 1 point [-]

You need non-cyclical reasoning. Which would generally be something where you aren't the one having to explain to people that the achievement in question is profound.

This bit confuses me.

That aside:

You think Yudkowsky is not a crank, so you think the folks that play that silly game with him are intelligent and rational

Non sequitur. From the posts they make, everyone on this site seems to me to be sufficiently intelligent as to make "selling snake oil" impossible, in a cut-and-dry case like the AI box. Yudkowsky's own credibility doesn't enter into it.

Comment author: private_messaging 10 September 2013 11:41:14PM *  1 point [-]

Non sequitur.

I thought you wanted to persuade others.

From the posts they make, everyone on this site seems to me to be sufficiently intelligent as to make "selling snake oil" impossible, in a cut-and-dry case like the AI box.

So what do you think even happened, anyway, if you think the obvious explanation is impossible?

Comment author: linkhyrule5 10 September 2013 11:54:46PM 2 points [-]

I thought you wanted to persuade others.

Yes, but I don't see why this is relevant.

So what do you think even happened, anyway, if you think the obvious explanation is impossible?

Ah, sorry. This brand of impossible.

Comment author: private_messaging 11 September 2013 09:02:26AM *  3 points [-]

Yes, but I don't see why this is relevant.

Originally, you were hypothesising that the problem with persuading the others would be the possibility that Yudkowsky lied about AI box powers. I pointed out the possibility that this experiment is far less profound than you think it is. (Albeit frankly I do not know why you think it is so profound).

Ah, sorry. This brand of impossible.

Whatever the brand, any "impossibilities" that happen should lower your confidence in the reasoning that deemed them "impossibilities" in the first place. I don't think IQ is so strongly protective against deception, for example, and I do not think that you can assess something based on how the postings look to you with sufficient reliability as to overcome Gaussian priors very far from the mean.

edit: example. I would deem it quite unlikely that Yudkowsky could, for example, score highly on a programming contest with competent participants, or on any other conventional, validated, reliable metric of technical expertise and ability, under good contest rules (i.e. excluding the possibility of external assistance). So if he did something like that, I'd be quite surprised, and would lower my confidence in whatever models deemed it impossible; good old Bayes. I'm far more confident in the validity of those conventional metrics (and in the lack of alternate modes of passing, such as persuasion) than in my assessment, so my assessment would change the most. Meanwhile, when it's some unconventional game, well, even if I thought that this game is difficult, I'd be much less confident in the reasoning "it looks hard so it must be hard" than I am in the low prior of exceptional performance.

Comment author: Juno_Watt 12 September 2013 06:53:17AM 0 points [-]

Some folks on this site have accidentally bought unintentional snake oil in The Big Hoo Hah That Shall Not Be Mentioned. Only an intelligent person could have bought that particular puppy.

Comment author: linkhyrule5 12 September 2013 07:28:57AM 0 points [-]

Granted. And it may be that additional knowledge/intelligence makes you more vulnerable as a Gatekeeper.

Comment author: Peterdjones 12 September 2013 08:24:20AM 0 points [-]

Trying to think this out in terms of levels of smartness alone is very unlikely to be helpful.

Comment author: MugaSofer 11 September 2013 05:00:32PM 0 points [-]

plenty of people who get duped by anti-vaxxers are of above average IQ

But less than half of them, I'll wager. This is clearly an abuse of averages.

Comment author: private_messaging 11 September 2013 05:34:30PM *  6 points [-]

I wouldn't wager too much money on that one. http://pediatrics.aappublications.org/content/114/1/187.abstract .

Results. Undervaccinated children tended to be black, to have a younger mother who was not married and did not have a college degree, to live in a household near the poverty level, and to live in a central city. Unvaccinated children tended to be white, to have a mother who was married and had a college degree, to live in a household with an annual income exceeding $75 000, and to have parents who expressed concerns regarding the safety of vaccines and indicated that medical doctors have little influence over vaccination decisions for their children.

And in any case the point is that any correlation between IQ and not being prone to getting duped like this is not perfect enough to deem anything particularly unlikely.

Comment author: MugaSofer 12 September 2013 03:40:32PM *  1 point [-]

Hmm. Yeah, that's hardly conclusive, but I think I was actually failing to update there. Now that you mention it, I seem to recall that both conspiracy theorists and cult victims skew toward higher IQ. I was clearly quite overconfident there.

And in any case the point is that any correlation between IQ and not being prone to getting duped like this is not perfect enough to deem anything particularly unlikely.

Wasn't the point that

intelligent, rational people with money on the line and an obvious right answer

wasn't enough, actually? That seems like a much stronger claim than "it's really hard to fool high-IQ people".

Comment author: Nornagest 11 September 2013 05:45:55PM *  1 point [-]

I imagine that says more about the demographics of the general New Age belief cluster than it does about any special IQ-based appeal of vaccination skepticism.

There probably are some scams or virulent memes that prey on insecurities strongly correlated with high IQ, though. I can't think of anything specific offhand, but the fringes of geek culture are probably one of the better places to start looking.

Comment author: private_messaging 11 September 2013 05:50:28PM *  2 points [-]

Well, the way I see it, outside of very high IQ in combination with education in multiple relevant topics (e.g. biochemistry), effects of intelligence are small and are easily dwarfed by things like those demographic correlations.

There probably are some scams or virulent memes that prey on insecurities specific to high-IQ people, though. I can't think of anything specific offhand

Free energy scams. Hydrinos, cold fusion, magnetic generators, perpetual motion, you name it. edit: or, in medicine, counterintuitive stuff like sitting in an old uranium mine inhaling radon, then having so much radon progeny plate-out that it sets off nuclear material smuggling alarms. Naturalistic fallacy stuff in general.

Comment author: Gurkenglas 11 September 2013 07:21:37PM *  0 points [-]

Cryonics. ducks and runs

Edit: It was a joke. Sorryyyyyy

Comment author: shminux 10 September 2013 11:30:27PM 0 points [-]

Cyclical reasoning here.

You probably mean "circular".

Comment author: EHeller 10 September 2013 06:15:02AM 1 point [-]

Also, maybe it's a matter of semantics, but winning a game that you created isn't really 'doing the impossible' in the sense I took the phrasing.

Comment author: Luke_A_Somers 30 September 2013 07:54:45PM *  1 point [-]

Winning a game you created... that sounds as impossible to win as that?