To quote the front page:

> Less Wrong users aim to develop accurate predictive models of the world, and change their mind when they find evidence disconfirming those models, instead of being able to explain anything.

So, by that logic, one interesting metric of forum quality would be how often what is posted here makes people change their minds. Of course, most of us change our minds almost all the time, but mostly on mundane topics and in very small increments, probably too small to pay attention to. But if something comes to mind, feel free to link a thread or two. Depending on the response, we could even try to measure how influential newer posts are compared to older ones.

EDIT: Feel free to mention Sequence posts as well; they could be a useful benchmark.

EDIT2: Why specifically changing your mind and not just learning something new? Because unlearning is much harder than initial learning, and we, to generalize from one example, tend to forget what we unlearned and relapse into old ways of thinking and doing. (Links welcome.) Probably this is because the patterns etched into System 1 are not easily erased, and knowing something intellectually does not remove the old habits. So, successfully unlearning something and internalizing a different view, concept, or way of doing things indicates a much more significant impact than "just" learning something for the first time.


Digging through history, here are some LW posts that made me change my mind:

Yvain's Are Wireheads Happy?

Nornagest's comment about the breadwinner model

Wei Dai's comment saying uploads aren't necessarily conscious

pragmatist's comment about special relativity

Stuart Armstrong's Mahatma Armstrong: CEVed to death

Stuart Armstrong's Risk aversion does not explain people's betting behaviours

MileyCyrus's comment about tax incidence

> pragmatist's comment about special relativity (the linked diagram is now gone; it's an interesting exercise to reconstruct it)

The link to the diagram still works for me. And thanks for the shout-out!

Looks like I was clicking wrong; now it works for me too.

> the linked diagram is now gone

The link still works fine for me...

I have changed my mind about lots of things since my introduction to less wrong (I'm much less political for instance), but I can't think of any specific posts right now. Mostly I've changed my mind through discussing things with rationalist friends.

Also, I'm much less social-justice-y since reading slatestarcodex and Yvain's previous blog.

Edit: Not a Less Wrong post, but Yvain's meditations on superweapons changed my mind about various social justice feminist things.

Shmi:

> Yvain's meditations on superweapons changed my mind about various social justice feminist things.

Scott is pretty amazing at getting people to change their minds in a non-pushy way, with almost no advocacy. I wonder if this skill can be taught.

Also, feel free to elaborate on which feminist social justice concepts you retained and which you had to revise, and maybe quote the relevant post, as a bonus.

A basic one: before I started reading Eliezer, I used to think strong AI wouldn't be dangerous by default, and had argued as much online; i.e., I thought that AI would systematically do bad things only if explicitly programmed to do so. Now I think strong AI would be dangerous by default, and that Friendliness matters.

Edit: I think the relevant posts for this were first published on Overcoming Bias, but presumably they still count.

I went from ardently two-boxing to ardently one-boxing when I read that you shouldn't envy someone's choices. It was more general than that, actually; I had a habit of thinking "alas, if only I could choose otherwise!" about aspects of my identity and personality, and reading that post changed my mind on those aspects pretty rapidly.
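For anyone who wants the numbers behind that switch, here is a minimal sketch of the expected-value comparison, assuming the standard $1,000 / $1,000,000 payoffs and a predictor of accuracy p; the function names and sampled values of p are mine, purely for illustration:

```python
# Expected payoffs in Newcomb's problem as a function of predictor accuracy p,
# assuming the standard $1,000 (box A) / $1,000,000 (box B) payoffs.

def one_box_ev(p):
    # With probability p the predictor foresaw one-boxing and filled box B.
    return p * 1_000_000

def two_box_ev(p):
    # With probability p the predictor foresaw two-boxing and left box B empty;
    # with probability 1 - p the two-boxer walks away with both prizes.
    return p * 1_000 + (1 - p) * 1_001_000

for p in (0.5, 0.51, 0.9, 0.99):
    print(f"p={p}: one-box {one_box_ev(p):,.0f}, two-box {two_box_ev(p):,.0f}")

# One-boxing pulls ahead once p > 1_001_000 / 2_000_000, i.e. p > 0.5005.
```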

I used to be a Popperian falsificationist. I changed my mind after reading about Solomonoff induction. I don't remember exactly which post it was, but it made me accept that even if two theories cannot be differentiated by falsifiable predictions, we have good reason to prefer the simpler theory.
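To make "prefer the simpler theory" concrete, here is a toy sketch of a simplicity prior in the spirit of Solomonoff induction, which weights each hypothesis by roughly 2^-K for description length K in bits. The bit counts below are invented for illustration; the real formalism sums over all programs on a universal machine:

```python
# Toy simplicity prior: two hypotheses that make identical falsifiable
# predictions, weighted by 2**(-description_length_in_bits).
# The bit counts below are invented for illustration.

hypotheses = {
    "simple theory": 100,            # hypothetical program length in bits
    "theory with extra gears": 120,  # same predictions, 20 extra bits
}

weights = {name: 2.0 ** -bits for name, bits in hypotheses.items()}
total = sum(weights.values())

for name, w in weights.items():
    print(f"{name}: relative weight {w / total:.7f}")

# The simpler theory comes out 2**20 (about a million) times more probable,
# even though no falsifiable prediction distinguishes the two.
```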

In Popper's wider philosophy of science it is perfectly possible to criticise a theory for being too complicated. The main point is against justification and positive support, not in favour of falsification as the only possible epistemic procedure.

Shmi:

Interesting. I read all the same posts and they didn't sway me much.

Well, this site doesn't have many posts that say 'this is a modern controversial issue where lots of people have different opinions, but here is my argument for saying x is the correct one', and I imagine that this is a result of the general distrust of politicised discussions (that is, political discussions are not necessarily stupid, nor does LessWrong prohibit them, but they are generally avoided here because they're unproductive).

LW posts normally deal with higher-level reasoning processes that can be generalised across topics. So a better question might be 'what major topics do you now think very differently about since reading LW?', and then try to find a control group to compare against.

I've become a lot less political since reading LW (I used to identify strongly with one of the sides, but no longer). I used to explicitly think every argument should be given the best defense possible; now I think you need to just update on evidence and maximise expected utility, rather than holding on to a single idea. I've changed my mind about my life goals (from 'be happy' to 'save the world'). I could probably think of more examples. I used to have doubts about whether theologians were talking sense; now I never worry that I haven't thought about their arguments long enough, and happily disregard them.

> Well, this site doesn't have many posts that say 'this is a modern controversial issue where lots of people have different opinions, but here is my argument for saying x is the correct one'

There are posts on this site about decision theory, quantum mechanics, time, statistics/probability theory, charity, artificial intelligence, meta-ethics and cryonics that all seem to fit this bill. I'm sure I'm missing other topics. I do agree that most of the directly rationality-related posts aren't presenting particularly controversial ideas.

> now I think you need to just update on evidence and maximise expected utility, rather than holding on to a single idea. I've changed my mind about my life goals (from 'be happy' to 'save the world').

Is this change in your life goals merely a consequence of updating on evidence and maximizing expected utility? It sounds to me more like a change in your utility function itself.

I didn't hold any beliefs (never mind strong ones) about decision theory, quantum mechanics, time, probability theory, AI or meta-ethics before I came here. I think I disapproved of cryonics as much as the next person, although now I think it looks a lot better. And I imagine I'm typical: these topics aren't especially controversial in the public sphere, so I don't think many people hold strong views on them.

> Is this change in your life goals merely a consequence of updating on evidence and maximizing expected utility? It sounds to me more like a change in your utility function itself.

This looks like a confusion of words. I mean, my utility function didn't actually change; no one performed surgery on my brain. I learned about what was important to me, and changed my mind about what I value, from arguments at LW. But this should be seen as me coming to better understand what I actually value (or would value if I could understand myself better (cf. CEV)).

> I mean, my utility function didn't actually change; no one performed surgery on my brain.

Maybe this is a terminological confusion, but often clearing up terminological confusion matters, especially if it involves terminology that has widespread scientific use.

I use "utility function" in its usual decision-theoretic sense. A utility function is a theoretical entity that is part of a model (the VNM decision model) that can be used to (imperfectly) predict and prescribe a partially rational agent's behavior. On this conception, a utility function is just an encapsulation of an agent's preferences, idealized in certain ways, as revealed by their choice behavior. There's no commitment to the utility function corresponding to some specific psychological entity, like some sort of script in the brain that determines the agent's choice behavior. This seems to be different from the way you're using the phrase, but it's worth pointing out that in economics and decision theory "utility function" is just used in this minimal sense. Utilities and utility functions are not supposed to be psychological causes of behavior; they are merely idealized mathematical formalizations of the behavior. Decision theorists avoid the causal notion of utility (Ken Binmore calls it the "causal utility fallacy") because it makes substantive assumptions about how our brains works, assumptions that have not yet been borne out by psychology.

So on this view, changing one's utility function does not require brain surgery. It happens all the time naturally. If your choice behavior changes enough, then your utility function has changed. Also, it's not clear what it would mean to "discover" some part of your pre-existing utility function that you had not known about before. If your choice behavior prior to this "discovery" was different, then the "discovery" is actually just a change in the utility function, not a realization that you weren't actually adhering to your pre-existing utility function.
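For reference, the formal result behind this usage (standard textbook material, stated here for convenience rather than quoted from the comment): the VNM theorem says that if a preference relation $\succeq$ over lotteries satisfies completeness, transitivity, continuity, and independence, then there exists a utility function $u$ such that

$$L_1 \succeq L_2 \iff \mathbb{E}[u(L_1)] \ge \mathbb{E}[u(L_2)],$$

and $u$ is unique only up to positive affine transformation $u' = au + b$ with $a > 0$. The utility function is pinned down by (idealized) choice behavior alone; nothing in the theorem refers to a psychological mechanism.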

Alright - I'm only just starting my decision theory textbook, so I'll readily admit I probably used the word incorrectly. But I thought the point you were making was that LessWrong hadn't changed my thinking on a major topic if it was my utility function that had changed. Under this definition, though, of a utility function as just the most compact description of what your acts imply you value, changing my beliefs (in the informal sense) has caused my utility function to change. LW has changed my thinking on the important topic of life goals, and that's the point I was making.

Shmi:

That seems more like learning something new than changing your mind about something you believed. Of course, there is no changing one's mind without learning, but I'm more interested in cases where you thought about a topic and had what you thought was an informed opinion, and then you read a post and it made you reevaluate everything you thought about the issue.

For example, Yvain's Parable On Obsolete Ideologies apparently had a profound effect on Nisan.

I was in debate club and considered politics important and worth my time. Within months of reading the mind-killer sequence I became so frustrated with that whole ordeal that I quit, and I haven't looked back since.

Other than that, yes, you're right.

What's the purpose of asking specifically about changing an opinion?

Changing one's mind is a big applause light here (and the name of the upcoming Sequences ebook). So it is nice to know whether we are actually doing it, or just believing that we do.

I don't remember myself changing opinions dramatically after reading the Sequences. I was already non-religious; I believed that truth matters and self-deception is unreliable; I was kinda aware that politics makes otherwise reasonable people say crazy things; I noticed that many intelligent people use their intelligence to justify their sometimes completely crazy beliefs in sophisticated ways instead of checking whether their beliefs actually are correct. My biggest impression from the Sequences was the relief of realizing that I am not the only person on this planet who believes this (I'm still not sure how much this was because the mindset is actually rare, and how much because of my inability to find and notice such people unless they are very explicit about it).

Maybe I generally don't hold strong opinions, and an update from a hypothetical 40% to a hypothetical 70%, especially if it happens slowly, feels in hindsight like nothing happened. Maybe the paradigm-changing updates (like deconverting from a religion) are actually rare, and these silent updates are the norm. Changing your mind shouldn't become a lost purpose. But maybe I am just making excuses.
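For concreteness, that hypothetical 40% to 70% shift is bigger than it feels. In odds form,

$$\frac{P(H)}{P(\neg H)}: \quad \frac{0.4}{0.6} = \frac{2}{3} \;\longrightarrow\; \frac{0.7}{0.3} = \frac{7}{3},$$

so the evidence had to carry a likelihood ratio of $(7/3)/(2/3) = 3.5$, which is a substantial update even when it arrives in unnoticeable increments.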

Anyway, if you can't give an example of yourself changing your mind, it seems silly to have it as an applause light, unless you suggest that other people should change their minds to think more like you do.

(For the record: We do have here people who changed their minds after reading LW.)

I can't immediately find it, but a post by lukeprog convinced me to start paying attention to how I dress.

It was a top-level post. The argument that mostly stuck with me was along the lines of: even stipulating that fashion is dumb, a lot of people care about it, and dismissing all of those people offhand is probably not clever.

I used to be adamant that free will was real, without knowing precisely what I meant by that. Between some posts and comments, I became convinced that what I was thinking of as free will was somewhat trivial, and that what I might call "correct decisions" was closer to what I believed in. But correct decisions might be quite repeatable; certainly they would tend that way. So if the relevant decisions made by my mind are predictable and calculable, and only trivial decisions like "pick a number from 1 to 100" are difficult to predict, then free will is not the thing I had so adamantly believed it to be.

I think the change came about more through discussions in comments than from posts themselves.

My intuition that uploading is not a form of immortality for me (I am in a body) was and is very strong. To me it is clear that "me," in this body, dies; whether or not a copy of me is created somewhere else, which would then have its own life it would want to preserve, seemed irrelevant to me, in this body. That the new being would think it had been me is no reason for the current me, who knows he is going to die in the process of being uploaded, to be fooled.

But discussions in comments about the discontinuity even of my life in my own body leave me open to wonder. Instead of thinking of myself as a person who doesn't want to die, I think of myself in each instant as a machine which, if asked whether it wants to die, accesses memories as part of its mechanism and concludes that it doesn't want to die.

This change came about through comment discussions.

Does learning and implementing mind hacks count as changing one's mind?

Seems to me that if you do so, your mind literally changes. And it does so in a way which is probably much more important to your life than taking a different view of Newcomb's Paradox or the Fermi Paradox.

Besides, the line between changing one's opinion and learning something new can be rather blurry. For example, I have learned through this discussion board, as well as a couple of other blogs, that humans can and do hold contradictory beliefs and models in their minds. Before learning that, I had not given the issue much thought and pretty much assumed that if a person believes X, they believe X, and that's pretty much it.

Shmi:

> Does learning and implementing mind hacks count as changing one's mind?

I edited the article to clarify what I mean.

Some things I have changed my mind about as a result of reading lesswrong:

  • The "zombies" section of the sequences convinced me that some form of dualism must be correct.
  • No individual post, but the cumulative effect of reading a lot of posts has made me a lot more sceptical of the validity of the claims made by much of medical science.
  • This post made me a lot more sceptical of spaced repetition.
Shmi:

The "zombies" section of the sequences convinced me that some form of dualism must be correct.

So... it had the opposite effect of what was intended? Interesting. Any specific part that stood out?

I remember being particularly impressed by statements such as:

> This certainly seems like the inner listener is being caught in the act of listening by whatever part of you writes the internal narrative and flaps your tongue. [emphasis in original]

I started looking through http://lesswrong.com/user/John_Maxwell_IV/liked/ and came across this post. However, although this post had information that caused me to update and changed the way I explain things, "changed your mind" has a bit of a connotation of having believed one explanation really strongly and then abandoning it. If you tell me 8679 * 579825 is 5032301175, that's new information to me and I will have learned something, but I haven't necessarily "changed my mind" in the colloquial sense of the phrase. Thoughts?
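(For what it's worth, the arithmetic in the example does check out; a one-line verification:)

```python
print(8679 * 579825)  # 5032301175
```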

Shmi:

I was looking more for a case where you held something as true, but a post made you doubt it and eventually see that you were sorely mistaken.

Most of the discussion posts I've written have led to me changing my mind, but that's pretty me-specific :P

Nearly every time someone has given me a book recommendation. For example, I asked a question in an open thread that led to me reading "An introduction to tonal theory," which then proceeded to change my mind.

Discussion of habit formation here has been really useful to me; I'm not sure how much of this is "changing my mind" vs. "learning new stuff." I guess I'm pretty bad at making that distinction in general.

[anonymous]:

The post about 'why people don't help others more'. I sort of knew that before, but I was somehow sure that it only happened in certain areas of life, like signing a petition against spring hunting if they had already expressed their opinions on other 'hot' environmental topics. It had been just an isolated, unfortunate but reliable tendency that others had. After I read the post, I wondered, for the first time, whether I myself am generous.

I like reading SSC in that, though I have to identify myself as weakly feminist (as in, assertive on my own behalf and on behalf of other girls who challenge the archetype of the predominantly male environmentalist; I just don't get why people would prefer to be fined by men and not women), I don't have to share every feminist idea without consideration. It feels so liberating (until you go out among real people).