robertskmiles comments on Welcome to Less Wrong! (July 2012) - Less Wrong

Post author: ciphergoth 18 July 2012 05:24PM


Comment author: skeptical_lurker 28 July 2012 06:51:25PM  9 points

Hello everyone! Like many people, I came to this site via an interest in transhumanism, although it seems unlikely to me that an FAI implementing CEV can actually be designed before the singularity. (I can explain why, and possibly even what could be done instead, but it suddenly occurred to me that it seems presumptuous to criticize a theory put forward by very smart people when I only have 1 karma...)

Oddly enough, I am not interested in improving my epistemic rationality right now, partly because I am already quite good at it. But more than that, I am trying to switch it off when talking to other people, for the simple reason (and I'm sure this has been pointed out before) that if you compare three people, one who estimates the probability of an event at 110%, one who estimates it at 90%, and one who compensates for overconfidence bias and estimates it at 65%, the first two will win friends and influence people, while the third will seem indecisive (unless they are talking to other rationalists). I think I am borderline Asperger's (again, like many people here), and optimizing social skills probably takes precedence over most other things.

I am currently doing a PhD in "absurdly simplistic computational modeling of the blatantly obvious", which had damn well better have some signaling value. In my spare time, to stop my brain turning to mush, I am (among other things) writing a story which is sort of rationalist, in that some of the characters keep using science effectively even when the world is going crazy and the laws of physics seem to change depending on whether you believe in them. On the other hand, some of the characters are (a) heroes/heroines, (b) awesomely successful, and (c) hippies on acid who do not believe in objective reality (not that I am implying that all hippies, or all people who use LSD, are irrational). Maybe the point of the story is that you need more than just rationality? Or that some people are powerful because of rationality, while others have imagination, and that friendship combines their powers in a My Little Pony-like fashion? Or maybe it's all just an excuse for pretentious philosophy and psychic battles?

Comment author: robertskmiles 28 July 2012 08:23:15PM  6 points

I am not interested in improving epistemic rationality right now, partially because I am already quite good at it.

But remember that it's not just your own rationality that benefits you.

it seems presumptuous of me to criticize a theory put forward by very smart people when I only have 1 karma

Presume away. Karma doesn't win arguments, arguments win karma.

Comment author: skeptical_lurker 29 July 2012 08:21:02PM  0 points

But remember that it's not just your own rationality that benefits you.

Are you saying that improving epistemic rationality is important because it benefits others as well as myself? This is true, but there are many other forms of self-improvement that would also have knock-on effects that benefit others.

I have actually read most of the relevant sequences, so epistemic rationality really isn't low-hanging fruit for me anymore, although I wish I had known about cognitive biases years ago.

Comment author: robertskmiles 30 July 2012 11:18:04AM  1 point

Are you saying that improving epistemic rationality is important because it benefits others as well as myself?

No, I'm saying that improving the epistemic rationality of others benefits everyone, including yourself. It's not just about improving our own rationality as individuals; it's about trying to improve the rationality of people in general: 'raising the sanity waterline'.

Comment author: skeptical_lurker 31 July 2012 01:17:06PM  1 point

OK, I see what you mean now. Yes, this is often true, but again, I am trying to be less preachy about rationality (at least IRL). If someone believes in astrology, or faith healing, or reincarnation, then: (a) their beliefs probably bring them comfort; (b) trying to persuade them is often like banging my head against a brick wall; and (c) even the notion that there can be such a thing as a correct fact, independent of subjective mental states, is very threatening to some people, and I don't want to start pointless arguments.

So unless they are acting irrationally in a way which harms other people, or they seem capable of having a sensible discussion, or I am drunk, I tend to leave them be.