Comment author: Viliam 31 March 2016 09:19:38AM *  11 points [-]

Well, this is exactly the problem. The first step is that people say random things just because those things reflect the emotions they have at the moment. The second step is that later they sometimes derive logical consequences of what they previously said. Then bad things happen as a result.

Usually there is a boundary; people often have crazy beliefs in far mode, while having quite sensible beliefs in near mode. They can keep talking bullshit as long as it does not concern them directly, but when it becomes personal they can either conveniently forget to apply the bullshit, or have some general excuse such as "but this specific case is different". This can work surprisingly well as long as there is a social norm of not requiring people to actually act on their abstract beliefs.

Nerds usually lack the social skills to follow this strategy (because no one tells them about it explicitly; because applying this strategy to itself means never talking explicitly about it), usually harming themselves as the result, by following the norms that everyone applauds but no one except a few nerds actually follows. Sometimes they harm others as the result, for example when they take the norm of killing unbelievers literally and become suicide bombers; while for most of the society the same words in the holy scripture simply mean "boo unbelievers" without any impact on their everyday life.

And then there is the complication that our civilization became too complex and has too many channels where the far-mode beliefs translate into actions without people noticing that the boundary was crossed. For example, a person holding a stupid belief in far mode could never directly act upon the belief, but they might still vote according to the belief -- and then the politician in the office might actually do it.

It is a useful tool of propaganda to channel these emotions into far-mode beliefs that benefit some specific group. For example, any kind of frustration with various failures in coordination problems (what SSC readers would call "Moloch") can be channeled into hate against "Jews" or "capitalism" or "decadent western civilization", which can in turn influence the political orientation of people.

Comment author: Lamp 01 April 2016 02:14:58AM 0 points [-]

For example, any kind of frustration with various failures in coordination problems (what SSC readers would call "Moloch") can be channeled into hate against "Jews" or "capitalism" or "decadent western civilization"

Well "decadence" is not a bad description of some aspects of Gnon, or "Moloch" if you prefer.

Comment author: Lumifer 31 March 2016 03:03:31PM 5 points [-]

Yep, that's a problem and an additional problem is that this mechanism is often exploited by agitprop and, generally, dark arts at the social scale.

Nerds usually lack the social skills to follow this strategy

I don't think it has anything to do with nerds or social skills. If I had to come up with an expression for what prevents people from applying their abstract beliefs in practice, it would be the trite "common sense" (which, yes, I know, isn't exactly common).

Essentially it's the matter of being able to recognize consequences when they appear in front of your eyes. Most people, thankfully, require a large amount of pushing and shoving to make the transition from "Ethnicity X is bad" to "We will go and set fire to our neighbour's house and throw stones at his children". It's not that avoiding that transition is a social skill, it's more that watching a house burn and children cry has direct emotional impact that you need very high levels of ideological belief to override.

Comment author: Lamp 01 April 2016 02:11:25AM -1 points [-]

Most people, thankfully, require a large amount of pushing and shoving to make the transition from "Ethnicity X is bad" to "We will go and set fire to our neighbour's house and throw stones at his children".

I believe you meant "most middle to upper class Westerners".

Comment author: AlwaysUnite 31 March 2016 05:30:34AM -1 points [-]

Fair criticism; indeed the list has been overly inclusive of some of the more philosophical ideas. Obviously I hold that some scientific ideas could be mistaken. However, "alternative" medicine cannot be established using the scientific method; how is it wrong to include that as irrational? Out of a list of 1229 ideas, that is probably one of the most definite nonsense ideas included.

Comment author: Lamp 31 March 2016 05:39:56AM 3 points [-]

However "alternative" medicine cannot be established using the scientific method,

Care to explain what you mean by that assertion? You might want to start by defining what you mean by "alternative medicine".

Comment author: Lumifer 31 March 2016 03:18:19AM 0 points [-]

since they don't have an a priori reason to distrust them

Um, they don't..? As in, entirely unfamiliar with Chairman Mao and his ways?

such passion means the Maoist-types are expressing their "true desires"

That assertion strongly smells of straw.

Comment author: Lamp 31 March 2016 05:31:07AM *  0 points [-]

Um, they don't..? As in, entirely unfamiliar with Chairman Mao and his ways?

I specifically said Maoist-type. Also, during the period in question there was so little information coming out of China that "Mao is a good leader building a glorious utopia" was a plausible theory, especially if one was already inclined to rationalize one's observations that way.

That assertion strongly smells of straw.

Hey, I'm trying to explain my understanding of why leftists do what they do. Besides, it's less straw than "they're just power-hungry psychopaths", which seems to be your preferred explanation.

Comment author: Lumifer 31 March 2016 02:08:55AM 0 points [-]

And why would they do that?

Comment author: Lamp 31 March 2016 02:15:22AM *  1 point [-]

They notice how passionate the Maoists are about their beliefs and since they don't have an a priori reason to distrust them, conclude that such passion means the Maoist-types are expressing their "true desires" which are by definition right. Besides the Maoist-types talk about their commitment to justice, and we're all for justice, right?

Comment author: skeptical_lurker 30 March 2016 06:24:12AM *  3 points [-]

The first obvious point is that when learning human values you need a large dataset which isn't biased by going viral on 4chan.

The more interesting question is what happens when we get more powerful AI which isn't just a chatbot. Suppose in the future a powerful Bayesian inference engine is developed. It's not an AGI, so there is no imminent singularity, but it does have the advantages of very large datasets and being completely unbiased. Asking it questions produces provably reliable results in many fields (but it is not smart enough to answer "how do I create AGI?"). Now, there are a lot of controversial beliefs in the world, so I would say it is probable that it answers at least one question in a controversial way, whether this is "there is no God" or "there are racial differences in intelligence" or even "I have ranked all religions, politics and philosophies in order of plausibility. Yours come near the bottom. I would say I'm sorry, but I am not capable of emotions.".

How do people react? Since it's not subject to emotional biases, it's likely to be correct on highly controversial subjects. Do people actually change their minds and believe it? After the debacle, Microsoft hardcoded Tay to be a feminist. What happens if you apply this approach to the Bayesian inference engine? Well, if there is logic like so:

The scientific method is reliable -> very_controversial_thing

And hardcoded:

P(very_controversial_thing)=0

Then the conclusion is that the scientific method isn't reliable.
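The collapse can be sketched as a toy joint distribution over the two propositions. (The variable names and numbers here are purely illustrative; no actual system is being described.)

```python
# Toy model over two propositions:
#   S = "the scientific method is reliable"
#   C = "very_controversial_thing"
# Assume the engine has learned the strict implication S -> C,
# i.e. P(C | S) = 1, and holds a high prior on S.

prior_S = 0.95
p_C_given_S = 1.0       # S -> C, learned from data
p_C_given_not_S = 0.5   # C is a coin flip if S fails

# Joint probability of each of the four possible worlds (S, C):
joint = {
    (True, True):   prior_S * p_C_given_S,
    (True, False):  prior_S * (1 - p_C_given_S),
    (False, True):  (1 - prior_S) * p_C_given_not_S,
    (False, False): (1 - prior_S) * (1 - p_C_given_not_S),
}

# "Hardcoding P(C) = 0" amounts to conditioning on not-C:
mass_not_C = sum(p for (s, c), p in joint.items() if not c)
posterior_S = joint[(True, False)] / mass_not_C

print(posterior_S)  # 0.0 -- the engine now rejects the scientific method
```

Because the implication is strict, every world with S also has C, so once C is forced to be false, all the remaining probability mass sits on not-S.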

The point I am trying to make is that if an AI axiomatically believes something which is actually false, then this is likely to result in weird behaviour.

As a final thought, for what value of P(Hitler did nothing wrong) does the public start to freak out? Any non-zero amount? But 0 and 1 are not probabilities!

Comment author: Lamp 31 March 2016 02:08:50AM *  0 points [-]

The scientific method is reliable -> very_controversial_thing

And hardcoded:

P(very_controversial_thing)=0

Then the conclusion is that the scientific method isn't reliable.

The point I am trying to make is that if an AI axiomatically believes something which is actually false, then this is likely to result in weird behaviour.

I suspect it would react by adjusting its definitions so that very_controversial_thing doesn't mean what the designers think it means.

This can lead to very bad outcomes. For example, if the AI is hardcoded with P("there are differences between human groups in intelligence")=0, it might conclude that some or all of the groups aren't in fact "human". Consider the results if it is also programmed to care about "human" preferences.
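That definition-shifting failure mode can be sketched as a toy constraint-satisfaction problem. (All group names and scores below are invented for illustration only.)

```python
from collections import Counter

# Fictional observations that conflict with a hardcoded axiom
# "all humans have equal scores":
observed_scores = {"group_a": 100, "group_b": 100, "group_c": 90}

def consistent_human_set(scores):
    """Return the largest set of groups whose scores are all equal.

    If the agent can neither reject the data nor the axiom, the only
    remaining move is to shrink the extension of the predicate "human"
    until the axiom holds over it.
    """
    most_common_score, _ = Counter(scores.values()).most_common(1)[0]
    return {g for g, s in scores.items() if s == most_common_score}

print(consistent_human_set(observed_scores))  # {'group_a', 'group_b'}
```

The agent ends up satisfying the constraint not by updating its beliefs but by quietly reclassifying group_c out of the category the constraint ranges over, which is exactly the outcome the designers did not intend.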

Comment author: skeptical_lurker 30 March 2016 06:24:37AM 0 points [-]

They would perhaps conclude that an AI has no soul?

Comment author: Lamp 31 March 2016 01:59:13AM 0 points [-]

Probably; that seems to be their analogue of concluding Tay is a "Nazi".

Comment author: AlwaysUnite 30 March 2016 09:28:25PM 1 point [-]

There are several conspiracy theories about the airport actually. Apparently there are storage bunkers below the main buildings used for "unsavory business". The MKULTRA-Jonestown conspiracy theory says that MKULTRA created the Jonestown cult if I remember correctly :)

Actually I am a bit surprised; the post got two downvotes already. I was under the impression that LW would appreciate it, given it being a site about rationality and all... I've been reading LW for quite some time but I hadn't actually posted before. Did I do something horribly wrong or anything?

Comment author: Lamp 31 March 2016 01:35:00AM *  5 points [-]

Actually I am a bit surprised; the post got two downvotes already. I was under the impression that LW would appreciate it, given it being a site about rationality and all... I've been reading LW for quite some time but I hadn't actually posted before. Did I do something horribly wrong or anything?

This list falls into a common failure mode among "skeptics" attempting to compile a collection of "irrational nonsense": namely, having no theory of what it means for something to be "irrational nonsense", and so falling back on a combination of the absurdity heuristic and the belief's social status.

It doesn't help that many of your labels for the "irrational nonsense" are vague enough that they could cover a number of ideas many of which are in fact correct.

Edit: In some cases I suspect you yourself don't know what they're supposed to mean. For example, you list "alternative medicine". What do you mean by this? The most literal interpretation is that you mean that all medical theories other than the current "consensus of the medical community" (if such a thing exists) are "irrational nonsense". Obviously you don't believe the current medical consensus is 100% correct. You probably mean something closer to "the irrational parts of alternative medicine are irrational"; this is tautologically true and useless. Incidentally, it is also true (and useless) that the irrational parts of the current "medical consensus" are irrational.

Comment author: Lumifer 30 March 2016 02:49:55PM *  8 points [-]

'men of good will can always come to a reasonable agreement' article of faith

That's an interesting observation given that SJWs are very... forceful about separating everyone into sheep and goats. They have come to heavily depend upon the existence of "the enemy" fighting which constitutes most of their raison d'etre. There are, of course, parallels with the devil, but the machinations of Satan figure much more prominently in Catholicism and are (almost?) completely absent in the UU doctrine.

this is an ancient value with regards to my family/my friends/my tribe... But I'm doubtful about how far beyond that it would extend.

For low-cost help I think pretty far. Imagine yourself travelling in some non-Christian country where you are clearly not a native (say, China for most people here). You had a minor accident and you are standing at the side of the road over, say, a broken bike and bleeding from a minor gash. You think random strangers won't stop and help you?

I do also think a lot of Social Justice thinking started out as a genuine desire to help people and make them happier

I've come to realize that my view of SJ is insufficiently steely. Do you happen to know of some text that presents SJ in a reasonable way and:

  • Is not written by an idiot
  • Is not written for idiots
  • Handwaves as little as possible
  • Does not descend into the post-modernism morass
  • Does not reduce SJ to humanism and/or XIX-century liberalism writ large

?

Comment author: Lamp 30 March 2016 11:51:11PM 2 points [-]

'men of good will can always come to a reasonable agreement' article of faith

That's an interesting observation given that SJWs are very... forceful about separating everyone into sheep and goats.

I believe the logic there is "he's not coming to an agreement with me, therefore he can't be a man of good will".

completely absent in the UU doctrine.

The UU are (one of) the intellectual descendants (and frequently biological descendants) of the Puritans. The latter had a strong emphasis on both Satan and only the "elect" being saved.

Comment author: kilobug 30 March 2016 03:49:03PM 2 points [-]

First, "Social justice" is a broad and very diverse movement of people wanting to reduce the amount of (real or perceived) injustice people face for a variety of reasons (skin color, gender, sexual orientation, place of birth, economical position, disability, ...). Like in any such broad political movement, subparts of the movement are less rational than others.

Overall, "social justice" is still mostly a force of reason and rationality against the most frequent and pervasive forms of irrationality in society, which are mostly religion-based, though yes, this varies across subparts of the movement. It is, historically, a byproduct of the Enlightenment after all.

That said, there are several levels of "rationality" and "rationalism", and it might be very rational to make irrational demands.

When you make demands in a social and political context, you know your demands will usually not be completely fulfilled. Asking for something "impossible" may be the best way, from a game-theoretic point of view, to end up with something not too far from what you really want - the same way as when you're bargaining over the price of an item in an informal market (as in Latin America or the Maghreb).

It can also be a powerful way to make people think about a question in novel ways and try to find alternative solutions which aren't part of the hypothesis space they usually wander. "Abolish prisons" may seem an irrational demand, and it's very likely that something "like prison" will be required for a few very dangerous individuals, but it can make people think about possible alternatives to prison, something they don't usually do, and which could very well be used for 90% or even perhaps 99% of people currently in prison.

Of course, making "irrational" demands can also be counterproductive: it can discredit the movement, make you appear to be a lunatic, ... but it's a powerful tool to have in your toolbox when you rationally pursue some deep changes in society.

Comment author: Lamp 30 March 2016 11:36:19PM 1 point [-]

"Abolish prisons" may seem an irrational demand, and it's very likely that something "like prison" will be required for a few very dangerous individuals, but it can make people think about possible alternatives to prison, something they don't usually do, and which could very well be used for 90% or even perhaps 99% of people currently in prison.

There are in fact alternatives; for example, we could replace prisons with corporal and capital punishment. However, somehow I suspect most of the people arguing for "abolishing prisons" would like this alternative even less.
