XiXiDu comments on What makes Less Wrong awesome? - Less Wrong

Post author: Will_Newsome | 23 May 2011 06:27AM | 11 points


Comment author: XiXiDu 23 May 2011 01:58:55PM 10 points

Where else on the internet are people willing to change their minds?

Many scientists are willing to change their minds. Even normal people change their minds often. People become atheists or start voting for a different party. How many members here can you actually name who changed their mind about something dramatic?

Someone who is rather cynical about Less Wrong could go a step further and conclude that Less Wrong appears to be about changing your mind, but that it mainly attracts people who already tend to agree with ideas put forth on Less Wrong, who take ideas seriously. Everyone else turns his back on it or gets filtered out quickly. And those that already agree are not going to change their minds again, because they are not entitled to the particular proof necessary to change their minds: most of the controversial ideas are framed as predictions or logical implications that are not subject to empirical criticism. What is left over is too vague or insubstantial to change your mind about one way or the other.

Comment author: MinibearRex 23 May 2011 10:34:37PM 3 points

When I discovered Less Wrong, there were things I disagreed with. There are actually still things discussed here where I disagree with the apparent consensus. But I've changed my mind on a large number of things. When I joined Less Wrong, my understanding of cryonics was that it was a scam for new-agers. I had heard of concepts such as transhumanism and singularitarianism, but had no exposure to individuals who actually held such beliefs. After reading a few of the sequences, I went to EY's website, and found this. I finished that article, thought about it for approximately a minute, and said "Yep. That makes sense." Fast forward one week, and I'm persuading other people to sign up for cryonics. That was a pretty dramatic shift for me.

Comment author: wallowinmaya 03 July 2011 02:14:54PM 2 points

it mainly attracts people who already tend to agree with ideas put forth on Less Wrong

Hm, in my case I have to say that reading Less Wrong changed almost all my beliefs: roughly 9 months ago I was a socialist, an anti-reductionist, an agnostic leaning towards deism, and a new-age-minded guy who loved psychedelic drugs and was probably addicted to marijuana. I was proud of my existential angst and read like-minded philosophy and literature. I had no idea of transhumanism, the singularity, existential risks or the FAI problem.

I don't believe that I'm the only one who changed his mind after reading the sequences; I'm not that special!

Comment author: XiXiDu 03 July 2011 02:45:07PM 2 points

That's interesting; I didn't expect that, since I thought that most people who could benefit a lot from LW are most likely not going to read it or understand it. But maybe I am wrong; I seldom encounter stories like yours.

But that you went all the way from new-age to the Singularity and FAI troubles me a bit. Not that it isn't better than new-age stuff, but can you tell me what exactly convinced you of risks from AI?

Comment author: wallowinmaya 03 July 2011 04:47:55PM 0 points

Well, to be clear, I didn't believe in homeopathy or astrology or other obviously false crackpot theories. In fact, some of my heroes were skeptics like Bertrand Russell and Richard Dawkins. But I also believed in some objective, transcendental morality stuff, and I tried to combine mystic, mysterious interpretations of quantum physics with Buddhist philosophy (you know, the Atman is the Brahman, etc.), just like Schrödinger, Bohm and so on. And I (wanted to) believe in the sort of free will proposed by Kant. I didn't understand what I was thinking, and I had the feeling that something was wrong or inconsistent with my beliefs. When I was younger I was much more confident in materialism and atheism, but some drug experiences disturbed me and I began to question my worldviews. Anyway, let's say I believed in enlightened, deeply wise-sounding new-age gibberish. I know, I know, it's embarrassing, but hopefully not that embarrassing.

Well, some essays by Bostrom and mainly the sequences, in particular the AI-Foom debate, convinced me of the risks of AI. I'm not as sure about it as e.g. Yudkowsky (in fact I think it is more likely than not that his scenario is false), but if we assign a 25% probability to the Yudkowskian AI-Foom scenario, it still seems absurdly important, right? And Yudkowsky makes more sense to me than Hanson or Goertzel, and folks like Kurzweil, and especially de Garis, seem to be off base.

Comment author: Curiouskid 12 November 2011 12:19:15PM 1 point

I am just starting out here, but I feel as if I'm about to change my mind in the same way you did. I was interested in Utopia (ending suffering), and that got me pulled into Buddhism and all the other parapsychological weirdness.

Comment author: PlaidX 23 May 2011 10:24:05PM 4 points

Someone even more cynical might say that Less Wrong only departs from mainstream skeptical scientific consensus in ways that coincidentally line up exactly with the views of Eliezer Yudkowsky, and that it's basically an echo chamber.

That said, rational thinking is a great ideal, and I think it's awesome that Less Wrong even TRIES to live up to it.

Comment author: jsalvatier 24 May 2011 04:45:42AM 1 point

I'm fairly enthusiastic about LW, but I think that

it mainly attracts people who already tend to agree with ideas put forth on Less Wrong, who take ideas seriously. Everyone else turns his back on it or gets filtered out quickly.

has a big effect.