Your PS for the newly imported post is out of date; the colored text doesn't seem to have come with it.
These are good reads; I was going to post these links later today, after I had time to write up a summary, but you have saved me the trouble.
Notice that concepts make more sense when you revisit a topic, and note which topics provide keys to many others.
I realized when reading this that I have largely been following this method for computer science. Even without any obvious gears clicking into place, I understand talk about, e.g., binary trees or closures that would have baffled me a year ago.
According to CIA 2008 estimates, life expectancy in North Korea is 71.92 years, which is far higher than the global average and even slightly higher than that of EU members Romania and Latvia. HDI for North Korea is only available for 1995, when it was 0.766. Both figures show that life in North Korea is somewhere in the middle of the modern world, far better than in the real third world, and vastly better than the historical average.
It seems to me that North Korea was used as an example for its connotations, not denotations - that's a cheap trick that we should be avoiding.
If you look at NK in conjunction with South Korea, it begins to look a lot worse.
(At the same time, you can look at Kenya relative to Somalia and it is just as unflattering.)
"The Internet" is probably an interesting case study. It has grown from a very small niche product into a "fundamental right" in a relatively short time. One of the things that probably helped this shift is showing people what the internet could do for them - it became useful. This is understandably a difficult point on which to sell FAI.
Now that that surface analogy is over, how about the teleological analogy? In a way, environmentalism assumes the same mantle as FAI - "do it for the children". Environmentalism has plenty of advantages over FAI - it has fuzzier mascots and more imminent problems - Terminators aren't attacking, but more and more species are becoming extinct.
Environmentalism is still of interest here through the subtopic of climate change. Climate change already deals with some of the problems existential risk at large deals with - its veracity is argued, its importance is disputed, and the math is poorly understood. The next generation serves as a nice fuzzy mascot, and the danger is of the dramatic, ever-inexorably-closer variety. Each day you don't recycle, the earth is in more danger, &c. (The greater benefit of a creeping-death, "zombie" danger may be that it negates the need for a mathematical understanding of the problem. It becomes "obvious" that the danger is real if it gets closer every day.)
How can you convince people to solve a harder problem once, rather than every problem that crops up?
“Whether and when law is more effective than code is an empirical matter — something to be studied, and considered, not dismissed by banalities spruced up with italics.” - Lawrence Lessig
Interesting in a similar way is the article "How To Make Your Own Luck".
We asked subjects to flip through a newspaper that had photographs in it. All they had to do was count the number of photographs. That's it. Luck wasn't on their minds, just some silly task. They'd go through, and after about three pages, there'd be a massive half-page advert saying, STOP COUNTING. THERE ARE 43 PHOTOGRAPHS IN THIS NEWSPAPER. It was next to a photo, so we knew they were looking at that area. A few pages later, there was another massive advert -- I mean, we're talking big -- that said, STOP COUNTING. TELL THE EXPERIMENTER YOU'VE SEEN THIS AND WIN 150 POUNDS [about $235].
For the most part, the unlucky would just flip past these things. Lucky people would flip through and laugh and say, "There are 43 photos. That's what it says. Do you want me to bother counting?" We'd say, "Yeah, carry on." They'd flip some more and say, "Do I get my 150 pounds?" Most of the unlucky people didn't notice.
"...then there's the idea that rationalists should be able to (a) solve group coordination problems, (b) care a lot about other people and (c) win..."
Why should rationalists necessarily care a lot about other people? If we are to avoid circular altruism and the nefarious effects of other-optimizing, the best amount of caring might be less than "a lot."
Additionally, caring about other people in the sense of seeking emotional gratification primarily in tribe-like social rituals may be truly inimical to dedicating one's life to theoretical physics, math, or any other far-thinking discipline.
Caring about other people may entail involvement in politics, and local politics can be just as mind-killing as national politics.
I don't think that b is necessarily an immediate entailment of rationality, but a condition that can be met simultaneously with a and c. The post presents a situation where c is satisficed only through a and b. (It does not take much finagling to suppose that a lonesome mountain-man existence in a world ruled by barbarians is inferior in fuzzies and utilons relative to the expectation of the world where a, b, and c are held to be true.)
Do whatever you may find worth doing.
Does anyone know about any Chicago area Singularity-esque groups at which doing might be done? I am interested in volunteering amateur labor in the hopes of progressing toward volunteering professional specialized labor.
Authors whose work reveals a deep enough understanding of their characters that you would say of them, "This goes beyond what I thought a man (woman) could understand of women (men)," are exceedingly rare. I'm not sure who the male counterpart of Jacqueline Carey might be.
There are a fair number of Revolutionary War reenactments - it's a pretty spirited community, from what I've heard. They also seem to evade some of the corniness criticism Renaissance Fairs seem to garner. Chess and go may not count as "fandom", but they are reasonably popular.
I don't think it's the /badness/ that is required to have a fandom, but a constant stream of discussion. Without badness, it's harder to sustain the discussion. If everyone agreed pirates would beat ninjas or that longswords were better than katana, eventually conversation dries up. Badness spurs arguments that allow adherents to share their beliefs and signal their devotion.