siodine comments on Thoughts on the Singularity Institute (SI) - Less Wrong

256 Post author: HoldenKarnofsky 11 May 2012 04:31AM




Comment author: siodine 12 May 2012 01:52:51AM 1 point [-]

Right. Exercise the neglected virtue of scholarship and all that.

It's not that easy to dismiss; if it's as poorly leveraged as it looks relative to other approaches then you have little reason to be spreading and teaching SI's brand of specialized rationality (except for perhaps income).

Comment author: lukeprog 12 May 2012 01:55:17AM 3 points [-]

I'm not dismissing it, I'm endorsing it and agreeing with you that it has been my approach ever since my first post on LW.

Comment author: siodine 12 May 2012 02:10:31AM -1 points [-]

Weird, I have this perception of SI being heavily invested in overcoming biases and epistemic rationality training to the detriment of relevant domain specific knowledge, but I guess that's wrong?

Comment author: lukeprog 12 May 2012 02:25:08AM 2 points [-]

I'm lost again; I don't know what you're saying.

Comment author: siodine 12 May 2012 03:58:27PM *  0 points [-]

> I'm not dismissing it, I'm endorsing it and agreeing with you that it has been my approach ever since my first post on LW.

I wasn't talking about you; I was talking about SI's approach to spreading and training rationality. You (SI) have Yudkowsky writing books, you have rationality minicamps, you have Less Wrong, you and others are writing rationality articles and researching the rationality literature, and so on.

That kind of rationality training, research, and message looks poorly leveraged in achieving your goals, is what I'm saying. Poorly leveraged for anyone trying to achieve goals. And at its most abstract, that's what rationality is, right? Achieving your goals.

So, I don't care if your approach was to acquire as much relevant knowledge as possible before dabbling in debiasing, Bayes, and whatnot (i.e., prioritizing the most leveraged approach). I'm wondering why your approach doesn't seem to be SI's approach. I'm wondering why SI doesn't prioritize rationality training, research, and messaging by whatever is most leveraged in achieving SI's goals. I'm wondering why SI doesn't spread the virtue of scholarship instead of training debiasing and so on.

SI wants to raise the sanity waterline; is what SI is doing even near optimal for that? Everything SIers knew and trained for couldn't even get them to see an opportunity for trading on opportunity cost for years; that is sad.