StanR
StanR has not written any posts yet.

The general premise in the mind sciences is that there are different selves, somehow coordinated through the cortical midline structures. Plenty of different terms have been used and hypotheses proposed, but the two "selves" I use as shorthand come from Daniel Gilbert: Socrates and the dog. Socrates is the narrative self; the dog is the experiencing self. If you want something a bit more technical, I suggest the lectures on well-being (lecture 3) here, and, to get really technical, this paper on cognitive science exploring the self.
Once I realized that achieving anything, no matter what, required my being rational, I quickly bumped "being rational" to the top of my to-do list.
Voted down because your realization is flawed. Achieving anything does not require you to be rational, as evidenced by this post.
The Master of the Way treats people as straw dogs.
Your strategy for dealing with people is also flawed: does the Master of the Way always defect? If you were a skilled exploiter, you wouldn't give obvious signals that you are an exploiter. Instead, you seem to be signaling "Vote me off the island!" to society and to this community. You may want to reconsider that position.
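A toy sketch of that point, on the reading that "defect" is meant in the iterated prisoner's dilemma sense: an exploiter who defects openly every round does worse against reciprocators than the reciprocators do against each other. This is only an illustration with the standard textbook payoffs (T=5, R=3, P=1, S=0); the strategy names and code are mine, not anything from the original exchange.

```python
# Minimal iterated prisoner's dilemma, illustrating why an obvious
# "always defect" strategy loses out against reciprocators.
# Standard payoffs: T=5, R=3, P=1, S=0. 'C' = cooperate, 'D' = defect.

PAYOFF = {
    ('C', 'C'): 3, ('C', 'D'): 0,
    ('D', 'C'): 5, ('D', 'D'): 1,
}  # (my move, their move) -> my payoff

def always_defect(opponent_history):
    return 'D'

def tit_for_tat(opponent_history):
    # Cooperate first, then mirror the opponent's previous move.
    return opponent_history[-1] if opponent_history else 'C'

def play(strat_a, strat_b, rounds=100):
    seen_by_a, seen_by_b = [], []  # each side's record of the opponent's moves
    score_a = score_b = 0
    for _ in range(rounds):
        a = strat_a(seen_by_a)
        b = strat_b(seen_by_b)
        score_a += PAYOFF[(a, b)]
        score_b += PAYOFF[(b, a)]
        seen_by_a.append(b)
        seen_by_b.append(a)
    return score_a, score_b

print(play(always_defect, tit_for_tat))  # (104, 99): one exploited round, 99 punished
print(play(tit_for_tat, tit_for_tat))    # (300, 300): mutual cooperation throughout
```

The exploiter wins its one free round and then gets punished for the other 99, while the reciprocators earn far more from each other, which is why openly signaling "I always defect" is a losing position.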
I don't believe we are, because I know of no evidence of the following:
evolutionarily speaking, a big function of system 2 is to function as a decoy/shield mechanism for keeping ideas out of a person. And increasing a person's skill at system 2 reasoning just increases their resistance to ideas.
Perhaps one or both of us misunderstands the model. Here is a better description of the two systems.
Originally, I was making the case that attempting to reason was the wrong strategy. Given your interpretation, it looks like pjeby didn't realize I was suggesting that, and then suggested essentially the same thing.
My experience, across various believers (Christian, Jehovah's Witness, New Age woo-de-doo) is...
pjeby, sorry I wasn't clear; I should have given some context. I'm referencing systems 1 and 2 as simplified categories of thinking used in cognitive science, particularly behavioral economics. Here's Daniel Kahneman discussing them. I'm not sure what you're referring to with decoys and shields, and I'll just leave it at that.
To add to my quoted statement: workarounds are incredibly hard, and focusing on reasoning (system 2) about an issue or belief leaves few cycles for receiving and sending social cues and signals. While reasoning, we can still pick up those cues and signals, but they break our concentration, so we tend to ignore them when reasoning carefully....
It's not just nontrivial, it's incredibly hard. Engaging "system 2" reasoning takes a lot of effort, lowering both sensitivity to and awareness of social cues and signals.
The mindset of "let's analyze arguments to find weaknesses," aka Annoynance's "rational paths," is a completely different game from the one most people are willing to play. Rationalists may opt for that game, but they can't win it, and they may be reinforcing illogical behavior. Such a rationalist is focused on whether arguments about a particular topic are valid and sound, not on the other person's rational development. If the topic is a belief, attempting to reason it out with the person is counterproductive. Gaining no ground when engaging with people on a topic should be a red flag: "maybe I'm doing the wrong thing."
Does anyone care enough for me to make a post about workarounds? Maybe we can collaborate somehow, Adelene; I have a little experience in this area.
I was part of a meetup on "alternative energy" (to see if actual engineers went to these things--I didn't want to date a solar cell) when I got an all-group email from the group founder about an "event" concerning The Secret and a great opportunity to make money. It turned out to be a "green" multi-level marketing scam he was deep in, and they were combining it with The Secret. Being naive, I said at first that I didn't think the event was appropriate, assuming that might lead to some discussion. He immediately slandered me to the group, but I managed to send out an email detailing his connections to...
Having had to explain to other sci-fi lovers in the past why using fiction as a counterargument is silly, I googled to see if anyone had written about why. GUESS WHAT I FOUND?
The Logical Fallacy of Generalization from Fictional Evidence, by Eliezer Yudkowsky: http://www.overcomingbias.com/2007/10/fictional-evide.html
Yeah, my jaw dropped when I found that. I'm sure you won't respond to this, as the mass of LW moves on to the most current post, but really? Was this a self-aware joke? Is Eliezer 2009 that much less rational than Eliezer 2007?
And I said rational atheism, not atheism.
Granted, I didn't express my thoughts on that clearly. I think there is a fundamental difference between attempting to get someone to agree with your opinions and helping them develop rationality--likely through opinions both similar to and different from your own. I think the latter is the better moral play, and it's genuine.
What is the higher-priority result for a rational atheist targeting a theist:
I think the biggest "win" of those results is the first. But even groups claiming to favor rationality above all still get stuck on opinions all the time (cryonics comes to mind). Groups tend to congregate around...
So, according to Freedom House, countries that had nonviolent revolutions since the late 1990s have been improving. There isn't a lot of data from before. You named the exception: Georgia has gotten a little worse since the overthrow of the "rigged" election there. Look at the data: http://www.freedomhouse.org/template.cfm?page=42&year=2008
I'm willing to admit I might have some Western bias, but I try to catch it. The general consensus does seem to be that the elections were rigged, but I don't know enough to say with much confidence either way.
In my original post, I was referring to the period of actual revolution, not everything since. I know it's not all sunshine and rainbows....
I hope to respond to that post shortly, after giving it some thought.
Gris, just as bias against violence may be the reason it's hardly ever considered, it may also be that nonviolence is not only a rational position but a strategically sensible one. Please consider looking at the literature on strategic nonviolence. The substantial literature at the Albert Einstein Institution is good for understanding nonviolent strategy and tactics against regimes, and the insights it provides translate into courses of action in other conflicts as well.