Psychedelics? Nootropics? I guess they are also a big part connecting lots of those subcultures.
Agreed. I might add them to a future version of this map.
This time around I held off mainly because I was confounded by how to add them; drugs really do pervade so many of these groups, in different variants: psychedelics are strong among the counterculture and New Age culture, nootropics are more popular among rationalists and biohackers/Quantified Self, and both are popular among transhumanists. (See this H+ article for a discussion of psychedelic transhumanists.)
Can I summarise that as saying that CFAR takes account of what we are, while LW generally does not?
Well, I'd say that LW does take account of who we are. They just haven't had the impetus to do so quite as thoroughly as CFAR has. As a result there are aspects of applied rationality, or "rationality for humans" as I sometimes call it, that CFAR has developed and LW hasn't.
At $3900, it's an investment, but a low-risk one, since we have a money-back guarantee. If you don't feel like what you got out of it was worth it, we'll refund your money without hesitation or complaint.
This is still not low-risk. I would hesitate to ask for a refund even if an event like this was below my expectations, as long as it's not a total flop or a con, which it surely isn't. Low-risk (for the participant) would be dividing the camp into billable events with a price tag on each, and refunding a portion of the price of each event based on the post-event evaluations. This is probably unworkable in practice, but at least it would not be misleading. On the other hand, "full refund no questions asked" is a useful marketing strategy, if a bit dark-artsy.
If it makes you feel less hesitant, we've given refunds twice. One person at a workshop last year said he'd expected polish and suits, and another said he enjoyed it but wasn't sure it would help enough with his current life situation to be worth it.
I also made a simplified map of some of our classes, so you can see how I think of them fitting into the bigger picture of rationality (click to enlarge):
I've clicked everything I can think of and it's not enlarging.
Fixed now, sorry!
Hi, the "apply here" link is not working for me.
Thanks!
Fixed! Thanks, I apparently didn't understand how links worked in this system.
I'm a little skeptical about this: "Attendees will be surrounded by other ambitious, successful, practically-minded folks"
Can I get some evidence?
Not sure what kind of evidence you're looking for here; that's just a description of our selection criteria for attendees.
Presumably that depends on how we came to think we held that moral theory in the first place.
If I assert moral theory X because it does the best job of reflecting my moral intuitions, for example, then when I discover that my moral intuitions in a particular case contradict X, it makes sense to amend X to better reflect my moral intuitions.
That said, I certainly agree that if I assert X for some reason unrelated to my moral intuitions, then modifying X based on my moral intuitions is a very questionable move.
It sounds like you're presuming that the latter is generally the case when people assert utilitarianism?
Preferring utilitarianism is a moral intuition, just like preferring Life Extension. The former's a general intuition, the latter's an intuition about a specific case.
So it's not a priori clear which intuition to modify (general or specific) when the two conflict.
Well, if you view moral theories as if they were scientific hypotheses, you could reason in the following way: if a moral theory/hypothesis makes a counterintuitive prediction, you could 1) reject your intuition, 2) reject the hypothesis ("I want to"), or 3) revise your hypothesis.
It would be practical if one could actually try out a moral theory, but I don't see how one could go about doing that...
Right -- I don't claim any of my moral intuitions to be true or correct; I'm an error theorist, when it comes down to it.
But I do want my intuitions to be consistent with each other. So if I have the intuition that utility is the only thing I value for its own sake, and I have the intuition that Life Extension is better than Replacement, then something's gotta give.
Here's his 2008 paper, "Life Extension versus Replacement," which explores an amendment to utilitarianism that would allow us to prefer Life Extension.
I feel like the thing that should allow us to prefer life extension is the thing that makes people search for amendments to utilitarianism that would allow us to prefer life extension.
When our intuitions in a particular case contradict the moral theory we thought we held, we need some justification for amending the moral theory other than "I want to."
Another meme that arguably reached the Bay Area via the 1960s/1970s counterculture, but predates it, is "intentional community". This influences startup culture and hacker culture (specifically hackerspaces), and to some (lesser?) extent seasteaders, the back-to-the-land movement, and rationalists as well.
Great one, thanks!