All of StanR's Comments + Replies

StanR10

I hope to comment on that post shortly, after giving it some thought.

Gris, while a bias against violence may be the reason it's hardly ever considered, nonviolence may be not only a rational position but a strategically sensible one. Please consider looking at the literature on strategic nonviolence. The substantial literature at the Albert Einstein Institution is good for understanding nonviolent strategy and tactics against regimes, and its insights translate into courses of action in other conflicts as well.

StanR20

The general premise in the mind sciences is that there are different selves, somehow coordinated through the cortical midline structures. Plenty of different terms have been used, and hypotheses suggested, but the two "selves" I use as shorthand come from Daniel Gilbert: Socrates and the dog. Socrates is the narrative self; the dog is the experiencing self. If you want something a bit more technical, I suggest the lectures on well-being (lecture 3) here, and to get really technical, this cognitive science paper exploring the self.

StanR10

Once I realized that achieving anything, no matter what, required my being rational, I quickly bumped "being rational" to the top of my to-do list.

Voted down because your realization is flawed. Achieving anything does not require you to be rational, as evidenced by this post.

The Master of the Way treats people as straw dogs.

Your strategy for dealing with people is also flawed: does the Master of the Way always defect? If you were a skilled exploiter, you wouldn't give obvious signals that you are an exploiter. Instead, you seem to be sig... (read more)
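(For intuition: in an iterated Prisoner's Dilemma, an unconditional defector broadcasts exactly the signal a skilled exploiter would hide, and a conditional cooperator punishes it immediately. A minimal sketch in Python, assuming the standard Axelrod payoffs T=5, R=3, P=1, S=0 and two illustrative strategy functions of my own naming:)

    # Iterated Prisoner's Dilemma: payoffs for (player A, player B) given each side's move.
    PAYOFF = {('C', 'C'): (3, 3), ('C', 'D'): (0, 5),
              ('D', 'C'): (5, 0), ('D', 'D'): (1, 1)}

    def always_defect(opponent_moves):
        # The "obvious exploiter": defects no matter what.
        return 'D'

    def tit_for_tat(opponent_moves):
        # Cooperates first, then mirrors the opponent's previous move.
        return opponent_moves[-1] if opponent_moves else 'C'

    def play(strategy_a, strategy_b, rounds=100):
        seen_by_a, seen_by_b = [], []  # each side's record of the opponent's moves
        score_a = score_b = 0
        for _ in range(rounds):
            move_a = strategy_a(seen_by_a)
            move_b = strategy_b(seen_by_b)
            pay_a, pay_b = PAYOFF[(move_a, move_b)]
            score_a += pay_a
            score_b += pay_b
            seen_by_a.append(move_b)
            seen_by_b.append(move_a)
        return score_a, score_b

    print(play(always_defect, tit_for_tat))  # (104, 99): one round of exploitation, then mutual punishment
    print(play(tit_for_tat, tit_for_tat))    # (300, 300): sustained cooperation earns far more

The defector nets 104 against tit-for-tat's 99, while two cooperators each earn 300, so always defecting is a self-defeating way to exploit anyone who conditions on your behavior.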

-1Annoyance
Wanting to accomplish thing X, and being able to expect it to occur as a result of actions I take, requires rationality. Your objection is incorrect. Your understanding of my strategy is incorrect, as evidenced by your question.
StanR30

I don't believe we are, because I know of no evidence of the following:

evolutionarily speaking, a big function of system 2 is to function as a decoy/shield mechanism for keeping ideas out of a person. And increasing a person's skill at system 2 reasoning just increases their resistance to ideas.

Perhaps one or both of us misunderstands the model. Here is a better description of the two.

Originally, I was making a case that attempting to reason was the wrong strategy. Given your interpretation, it looks like pjeby didn't understand I was suggesting that... (read more)

0AdeleneDawner
This. Exactly this. YES.
StanR10

pjeby, sorry I wasn't clear; I should have given some context. I'm referencing systems 1 and 2 as simplified categories of thinking used in cognitive science, particularly in behavioral economics. Here's Daniel Kahneman discussing them. I'm not sure what you're referring to with decoys and shields, so I'll just leave it at that.

To add to my quoted statement, workarounds are incredibly hard, and focusing on reasoning (system 2) about an issue or belief leaves few cycles for receiving and sending social cues and signals. While reasoning, we can pick up... (read more)

0pjeby
My hypothesis is that reasoning as we know it evolved as a mechanism both to persuade others and to defend against being persuaded by others. Consider priming, which works as long as you're not aware of it and therefore not defending against it. But it makes no sense to evolve a mechanism to avoid being primed unless the priming mechanism were being exploited by our tribe-mates. (After all, they're the only ones besides us with the language skill to trigger it.) In other words, once we evolved language, we became more gullible, because we were now verbally suggestible. This would then have resulted in an arms race of intelligence to both persuade and defend against persuasion, with tribal status and resources as the prize. And once we evolved to the point of being able to defend ourselves against any belief-change we're determined to avoid, the prize would've become being able to convince neutral bystanders who didn't already have something at stake.

The system 1/2 distinctions cataloged by Stanovich & West don't quite match my own observation, in that I consider any abstract processing to be system 2, whether it's good reasoning or fallacious, and whether it's cached or a work-in-progress. (Cached S2 reasoning isn't demanding of brainpower, and in fact can easily be parroted back in many forms once an appropriate argument has been heard, without the user ever needing to figure it out for themselves.)

In my view, the primary functional purpose of human reasoning is to persuade or prevent persuasion, with other uses being an extra bonus. So in this view, using system 2 for truly rational thought is actually an abuse of the system... which would explain why it's so demanding of cognitive capacity, compared to using it as a generator of confabulation and rhetoric. And it also explains why it requires so much learning to use properly: it's not what the hardware was put there for. The S&W model is IMO a bit biased by the desire to find "normative" reasoning (i.e.
0jimrandomh
I feel I should jump in here, as you appear to be talking past each other. There is no confusion in the system 1/system 2 distinction; you're both using the same definition, but the bit about decoys and shields was actually the core of PJ's post, and of the difference between your positions. PJ holds that to change someone's mind you must focus on their S1 response, because if they engage S2, it will just rationalize and confabulate to defend whatever position their S1 holds. Now, I have no idea how one would go about altering the S1 response of someone who didn't want their response altered, but I do know that many people respond very badly to rational arguments that go against their intuition, increasing their own irrationality as much as necessary to avoid admitting their mistake.
StanR30

It's not just nontrivial; it's incredibly hard. Engaging "system 2" reasoning takes a lot of effort, lowering both sensitivity to and acute awareness of social cues and signals.

The mindset of "let's analyze arguments to find weaknesses," aka Annoyance's "rational paths," is a completely different ballgame from the one most people are willing to play. Rationalists may opt for that game, but they can't win, and may be reinforcing illogical behavior. Such a rationalist is focused on whether arguments about a particular topic are valid a... (read more)

0pjeby
Engaging system 2 is precisely what you don't want to do, since evolutionarily speaking, a big function of system 2 is to function as a decoy/shield mechanism for keeping ideas out of a person. And increasing a person's skill at system 2 reasoning just increases their resistance to ideas. To actually change attitudes and beliefs requires the engagement of system 1. Otherwise, even if you convince someone that something is logical, they'll stick with their emotional belief and just avoid you so they don't have to deal with the cognitive dissonance. (Note that this principle also applies to changing your own beliefs and attitudes - it's not your logical mind that needs convincing. See Eliezer's story about overcoming a fear of lurking serial killers for an example of mapping System 2 thinking to System 1 thinking to change an emotional-level belief.)
StanR70

I was part of a meetup on "alternative energy" (to see if actual engineers went to the things--I didn't want to date a solar cell) when I got an all-group email from the group founder about an "event" concerning The Secret and a great opportunity to make money. Turned out it was a "green" multi-level marketing scam he was deep in, and they were combining it with The Secret. Being naive, at first I said I didn't think the event was appropriate, assuming it might lead to some discussion. He immediately slandered me to th... (read more)

4Nick_Tarleton
It's not evidence, but it is a pointer to an argument from existing knowledge: "you know, X probably would actually result from Y." (Well, to a bounded rationalist that is an example of evidence, but of a kind that's not nearly as problematic to get from fiction.)
StanR10

And I said rational atheism, not atheism.

Granted, I didn't express my thoughts on that clearly. I think there is a fundamental difference between attempting to get someone to agree with your opinions and helping them develop rationality--likely through both similar and different opinions. I think the latter is a better moral play, and it's genuine.

Which is the highest-priority result for a rational atheist targeting a theist:

  • a more rational theist
  • a not-any-more-rational-than-before atheist
  • an irrational agnostic

I think the biggest "win" of t... (read more)

StanR20

So, according to Freedom House, countries with nonviolent revolutions since the late 1990s are improving. There's not a lot of data beforehand. You named the exception: Georgia's gotten a little worse since the overthrow of the "rigged" election there. Look at the data: http://www.freedomhouse.org/template.cfm?page=42&year=2008

I'm willing to admit I might have some Western bias, but I try to catch it. The general consensus does seem to be that the elections were rigged, but I don't know enough to say with much confidence either way.

In my... (read more)

StanR-20

dclayh, Yes, that came to mind for me too. The small-town Gandhian libertarianism of Russell's story is entertaining, and just as silly. Yet, you didn't receive any karma points, and Eliezer received several, so either someone out there thinks a fictional short story is a reasonable rebuttal, or people are scoring for support of a side or entertainment.

Eliezer, I don't see how Russell or Turtledove even belong as anything more than footnotes, unless the discussion is about fiction writers creating alternate-universe just-so stories that tend to align wit... (read more)

4prase
It is far from clear that the "colour revolutions" resulted in more democracy in the respective countries. See e.g. http://en.wikipedia.org/wiki/Saakashvili#Criticism. Being an ally of the West is not the same as being a democratic country. Similarly, elections are not automatically rigged if communists win.
1Eliezer Yudkowsky
http://en.wikipedia.org/wiki/The_Last_Article
0[anonymous]
I think E. was trying to swat the old idea "smart people lose to unreasonable people" specifically in war, and perhaps also metaphorically in general competition.
StanR00

I think your secondary purpose is actually the primary purpose, excluding sponsors, who, I agree, usually set up the debate for entertainment.

Even if both sides claim that changing minds is the purpose, their actions show otherwise. The "change minds" or "reveal the truth" framing is a convenient lie, and one that's actually believed. Plus, it would be tacky and uncivilized to state the real reason for the debate; better to claim a more noble imperative--and believe it.

Depending on how polarized the sides are, the audience is either mostly, or com... (read more)

0PhilGoetz
Like I said, entertainment. Theory 2 proposes a way that rallying the base spreads atheism.
StanR20

Absolutely, linking really improves the resource.

A link for each major claim or background topic would be much appreciated. Sometimes I wonder if there shouldn't be an original-post layer, containing the comments, and a wiki-ish layer that could provide more links and annotations. That way, a newcomer could dig deeper, and regulars could help bridge those gaps while still participating in the original-post layer without the wiki-ish clutter.

Learning through participation is a problem when a post generates 30+ comments, some of which are as... (read more)