As a rationalist (not a technophile, libertarian, etc., but one who seeks to be more rational), do you seriously believe what Buddhism preaches? All of it?
The fact that you ask this question is strong evidence you are being careless. You assume stupidity and are self-satisfied. You will never be a strong rationalist this way. You need to cultivate a sense that much more is possible.
If you're going to cherry-pick, then why call it Buddhism and praise it so?
Did not praise. I know that you know that your assumptions are mostly rhetorical. Still dangerous. Carelessness. Not moving in harmony with the Bayes. Begging for confirmation, is this disposition of assumption. You will be pulled off course by this. These simple skills of rationality must be perfected if one is to build very strong rationality, with very complex skills. Necessary if you are to use all of your cognitive aspects and limitations to achieve all that is possible. Only possible to use limitation of affective thoughts for good after one is very consistently strong rationalist. Must be able to hold a very steady course in mindspace, in conceptspace, in identityspace, before one can try to use powerful attractors like affect to accelerate along that course. Less Wrong folk cannot do this consistently. Almost no one can. Enlightened people, mostly; maybe others from other disciplines that I know not. I cannot yet do so. Perhaps not far, though.
I would greatly appreciate a simple and rational explanation of why I (or Eliezer, or anyone else) should take "good" Buddhism seriously in our pursuit of rationality.
Less Wrong is not really worth my time, except as providing a motivation to write. The epistemological gap between Less Wrong and me is growing too wide. Eliezer I may talk to next time he's around, I guess. The epistemological gap between Eliezer and me is growing narrower. Still many levels above me is Eliezer, but I think only 2.2 levels or so. Easily surmountable with recursive self-improvement.
My problem is mainly the attachment of Buddhist teaching to 'meditation', which seems to be universal and not only a Buddhist practice: of some value, but not more than, say, studying human bias or generally reading the average LW top post.
We do not live in Gautama's time. Almost all of Theravada is true, but most is not relevant for rationalists of my caliber.
- Virtues that he preached, we mostly have now. Smart people are cultured enough to have these virtues and understand their motivations. Evolutionary psychology and cultivated compassion. So the virtue part of Buddhism, not as important, I think.
- Community part of Buddhism, sangha, very important; but having a peer group of strong rationalists intent on leveling up, is its own sangha, and one better than any that could have arisen almost anywhere in the past.
- Mindfulness, insight, concentration, self-control; these are the third branch of Buddhism, and the part Less Wrong needs. Thus I focus on that part. I know about human bias. I have read nearly every Less Wrong post. But understanding the algorithms behind human bias from the inside, feeling the qualia of cognitive subsystems, building those qualia, being mindful of attractors in mindspace; these are important skills for a rationalist. Knowledge of a bias is a knowledge. Important one, but so very limited. Having the disposition of feeling biases as pulls on cognition, at all levels of complexity of cognitive algorithms: this is much stronger skill. Necessary to become superintelligence. Which is everyone's desire, no? It should be, I think. Would be their desire if they knew more, thought faster, were wiser and more compassionate. Meditation, not only way of building this skill. Just oldest and most studied one. Many have tread this path and passed on knowledge. The skills of good computational cognitive scientist, also strong. But no one writes about these skills. I think becoming superintelligent probably not possible without them. But meditation builds similar skills, and is close. Both are metaskills. Epistemological bootstrapping mechanisms. There are meta-metaskills for this. Well, there aren't yet, but I am building one, and others are building some. We sense that more is possible. Buddhism is silly. Meditation, less silly, but not important of itself. Just one metaskill. Soon more will be possible. Not sure if it will trickle down to Less Wrong. Probably not. The gap is too wide.
I never founded a group myself, but maybe I can give a few pointers. First, make sure you actually want to do that. Going to other groups as a guest is free and easy, and you can check whether the structure of the organization is something you like. Then you need a few like-minded people; there is a minimum number for new groups. Keep in mind that it is a training group for speaking in general. One does not practice to convince on a specific topic, but the speaking itself, which would classify it as a dark art, and may not even be what you're looking for. For the actual founding, the easiest way is to contact another local group and ask for assistance. There is some paperwork, and then holding the regular meetings. Having a few experienced members is a really good way to get a new group going.
Rationality might at some point have evolved from the dark arts themselves, i.e., the human propensity to make up reasons on the fly might have led to there being minds better at making up reasons and arguments. I read that in an article somewhere but can't remember where exactly.
The dark arts get too much mud slung at them and IMO warrant further study; careful dissection by wise men wearing all the necessary charms and offering appropriate sacrifices should be sufficient. :)