There will be a Correct Contrarian panel at next year's Galactic Crackpot Conference in Schenectady; you're very welcome to attend.
I read some of your comment history, and usually your sarcastic comments are actually funny and contain implicit but substantial criticisms. Sometimes they suck. As far as I can tell, although I'm subject to obvious bias, this one sucks: it's not really funny and it's not clear that there's a substantial criticism. I'm interested in what your criticism actually is, if there is one.
I loved your post, but I also think metatroll's comment is funny. I don't think it's anything but a joke.
(the following isn't off-topic, I promise:)
Attention, people who have a lot of free time and want to found the next reddit:
When a site user upvotes and downvotes things, you use that data to categorize that user's preferences (you'll be doing a very sparse SVD sort of operation under the hood). Their subsequent votes can be decomposed into expressions of the most common preference vectors, and their browsing can then be sorted by decomposed-votes-with-personalized-weightings.
This will make you a lot of friends (people who want to read ramblings about philosophy won't be inundated with cute kitten pictures and vice versa, even if they use the same site), make you a lot of money (better-targeted advertising pays better), solve the problem above (people who like and people who hate trollish jokes won't need to come to a consensus), and solve the problem way above ("predisposition towards rationalism" will probably be one of the top ten or twenty principal components to fall out of your SVD).
It will also create new problems (how much easier will it be to hide in a bubble of people who share your political opinions? how do you filter out redundancy?) but those can be fixed in subsequent steps.
For now it's just embarrassing that modern forums have neither the fine-grained vote categories you could find on Slashdot 15 years ago ("Funny" vs. "Informative", etc.) nor the killfile capabilities you could find in Usenet readers 25 years ago.
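For the curious, here is a minimal sketch of what the vote-decomposition idea above might look like, assuming SciPy's sparse truncated SVD; the function names and the toy vote matrix are mine, purely for illustration, not anyone's production pipeline.

```python
# Minimal sketch: factor a sparse user-by-post vote matrix into latent preference vectors.
# Rows are users, columns are posts; entries are +1 (upvote), -1 (downvote), 0 (no vote).
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.linalg import svds

def build_preference_model(votes, n_components=2):
    """Truncated SVD of the vote matrix; returns per-user and per-post factor vectors."""
    u, s, vt = svds(csr_matrix(votes, dtype=float), k=n_components)
    user_factors = u * s   # each row: one user's weights on the latent preference components
    item_factors = vt.T    # each row: one post's loadings on those same components
    return user_factors, item_factors

def rank_posts_for_user(user_factors, item_factors, user_id):
    """Sort post indices by predicted affinity for one user, highest first."""
    scores = item_factors @ user_factors[user_id]
    return np.argsort(-scores)

# Toy example: 4 users voting on 5 posts.
votes = np.array([
    [ 1,  1,  0, -1,  0],   # likes philosophy rambles, dislikes kitten pictures
    [ 1,  0,  1, -1, -1],
    [-1,  0, -1,  1,  1],   # the opposite taste cluster
    [ 0, -1, -1,  1,  0],
])
user_f, item_f = build_preference_model(votes, n_components=2)
print(rank_posts_for_user(user_f, item_f, user_id=0))
```

In practice you would refit periodically as new votes arrive and choose the number of components empirically rather than hard-coding it.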
Actually, the prophet tells us to 'notice when we are surprised'. There are those who say that funniness has a lot to do with being surprised.
Prediction: English Omnipresent Humour (we make jokes at funerals, for God's sake), English Eccentricity, and (historical) English Scientific Prowess are related in some way.
Great post! I would think finding someone with similar interests who often agrees with you is easy. But you'll get the most value from talking to correct people with very different interests. That's much harder. They'll make mistakes you wouldn't, but also get stuff right you wouldn't think of.
Thank you.
I would think finding someone with similar interests who often agrees with you is easy.
I saw the person's correct answers before I saw their interests; it was an interesting coincidence, but not what I used to evaluate the person's correctness factor.
A possibility worth considering is that correct contrarians share interests by some mechanism related to the general factor of correctness. It also seems easier to work your way from the properties of correct contrarians that you already know about to the general factor of correctness, as opposed to trying to step outside of yourself and imagine what it would be like if you agreed with people that you don't. It's true that everyone thinks they're mostly right, and that it's hard to distinguish between feeling right when you're wrong and feeling right when you're right without the benefit of hindsight, but I don't think you should just give up your beliefs because there's some possibility that you're wrong. It seems easier and more responsible to focus on arriving at agreement with others by posing arguments and observing evidence than to second-guess yourself all of the time.
On the other hand, I agree with you in a sense and do very much value people whose thoughts simply fall into completely different grooves than mine. Massimo Pigliucci is my textbook example. A lot of the time when I hear him say things, I think he's wrong. A lot of the time I settle on the belief that he is; but he also, weirdly, surprises me more often than anyone who thinks like him. Also, people like that offer a useful sample of the sort of mistakes that you should expect if you want to persuade others who make them.
OkCupid also succeeds in matching based on interests and I think there are more people signed up for it.
On OkCupid it would be weird if you said that your primary purpose in registering was to find people to whom you could recommend books.
So everyone's clear, this service Alikewise does actually have a link that just says "recommend a book to this person." It would probably be a good place to do a one-shot, ad hoc analysis of members' book preferences and recommend the right books to the people you consider most likely to appreciate them, and maybe return every year or so, or more often if userbase growth spikes; but OkCupid and non-dating sites almost certainly have more relevant data, if only you had a socially acceptable mechanism for suggesting books that didn't involve anything bad, like fooling people into thinking you're looking for a mate or sending unsolicited messages on social media platforms. Those are not elegant solutions. As it is, you could only use the other data to refine your estimates of Alikewise's users' preferences, and probably not much else.
Thanks. I'm curious about why you preferred to write this comment anonymously, but I also won't insist on the matter any further if you prefer not to divulge that information.
On a somewhat tangential topic:
They acknowledged individual and average IQ differences and realized the correct policy implications.
Where could I find some links about the perspective you mention here? About IQ as a trait, and the related superior policies.
(It sounds like it's very different from my worldview, but I have a personal policy of trying to understand unfamiliar worldviews.)
The Bell Curve might be a good start. A just-published book is Hive Mind: How Your Nation’s IQ Matters So Much More Than Your Own.
The Greek myth of Cassandra ("A common version of her story is that, in an effort to seduce her, Apollo gave her the power of prophecy—but when she refused him, he spat into her mouth to inflict a curse that nobody would ever believe her prophecies.") shows that the ancient Greeks probably thought that people who knew the truth often had difficulty being believed, and that this imposed great costs on communities. (The Greek myths can be thought of as guides to how you are supposed to live.)
This reminds me of how I met Nate Soares. He came to a few LessWrong meetups (his first ones), and I dismissed him because he was talking about a bunch of technical things that didn't seem very interesting to me. (I was much more interested in finding flaws in my own emotional thinking than in discussing things like many-worlds quantum mechanics or decision theory.)
I wrote him off as not a very interesting person. Some of it was his interests; I was also a little put off by his intensity and took it as a sign of bad social skills. These days I read and re-read his blog and have gotten enormous gains from doing so, and he's off doing wonderful things.
Very interesting myth! Fortunately Apollo only spat in her mouth and not everyone's (ew!), so that Cassandra will believe someone who arrives at the contents of her prophecies independently. (That's really the sort of thing I'm getting at.)
Relevant, but: what do y'all think about the Trump-supporter attitude toward sexism? Does it need to be contrarian, or is it just part of a different subculture?
Any true belief that people generally cannot arrive at by copying the beliefs of the people around them. The less likely it is that any given person can arrive at the true belief by conforming with someone else's opinion, the more confident we can be that the people who hold the belief have a high correctness factor. It's a cluster, so there can be more or less typical members, and the cluster can change as various beliefs change in frequency within the population. Atheism would have been a correct contrarian belief a long time ago, but not anymore, or at least not contrarian enough to be useful for our purposes.
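To put that in rough Bayesian terms (the numbers below are invented purely to illustrate the direction of the effect): the less likely it is that someone holds the belief through conformity alone, the bigger the update toward a high correctness factor when they do hold it.

```python
# Toy illustration of the point above, using the odds form of Bayes' rule.
# All numbers are made up for the example; only the direction of the effect matters.

def posterior_odds(prior_odds, p_belief_given_high, p_belief_given_low):
    """Update the odds that someone has a high correctness factor,
    given that they hold a particular true contrarian belief."""
    likelihood_ratio = p_belief_given_high / p_belief_given_low
    return prior_odds * likelihood_ratio

prior = 0.05 / 0.95  # assume, for illustration, 1 in 20 people have a high correctness factor

# A belief that's easy to pick up by copying others barely moves the odds...
print(posterior_odds(prior, p_belief_given_high=0.9, p_belief_given_low=0.6))
# ...while a belief almost nobody reaches by conformity moves them a lot.
print(posterior_odds(prior, p_belief_given_high=0.5, p_belief_given_low=0.02))
```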
Related to: The Correct Contrarian Cluster, The General Factor of Correctness
(Content note: Explicitly about spreading rationalist memes, increasing the size of the rationalist movement, and proselytizing. I also regularly use the word 'we' to refer to the rationalist community/subculture. You might prefer not to read this if you don't like that sort of thing and/or you don't think I'm qualified to write about that sort of thing and/or you're not interested in providing constructive criticism.)
I've tried to introduce a number of people to this culture and the ideas within it, but it takes some finesse to get a random individual from the world population to keep thinking about these things and apply them. My personal efforts have been very hit-or-miss. Others have told me that they've been more successful. But I think there are many people that share my experience. This is unfortunate: we want people to be more rational and we want more rational people.
At any rate, this is not about the art of raising the sanity waterline, but the more general task of spreading rationalist memes. Some people naturally arrive at these ideas, but they usually have to find them through other people first. This is really about all of the people in the world who are like you probably were before you found this culture; the people who would care about it, and invest in it, as it is right now, if only they knew it existed.
I'm going to be vague for the sake of anonymity, but here goes:
I was reading a book review on Amazon, and I really liked it. The writer felt like a kindred spirit. I immediately saw that they were capable of coming to non-obvious conclusions, so I kept reading. Then I checked their review history in the hope that I would find other good books and reviews. And it was very strange.
They did a bunch of stuff that very few humans do. They realized that nuclear power has risks but that the benefits heavily outweigh the risks given the appropriate alternative, and they realized that humans overestimate the risks of nuclear power for silly reasons. They noticed when people were getting confused about labels and pointed out the general mistake, as well as pointing out what everyone should really be talking about. They acknowledged individual and average IQ differences and realized the correct policy implications. They really understood evolution, they took evolutionary psychology seriously, and they didn't care if it was labeled as sociobiology. They used the word 'numerate.'
And the reviews ranged over more than a decade. These were persistent interests.
I don't know what other people do when they discover that a stranger like this exists, but the first thing that I try to do is talk to them. It's not like I'm going to run into them on the sidewalk.
Amazon had no messaging feature that I could find, so I looked for a website, and I found one. I found even more evidence, and evidence is certainly what it was. They were interested in altruism, including how it goes wrong; computer science; statistics; psychology; ethics; coordination failures; failures of academic and scientific institutions; educational reform; cryptocurrency; etc. At this point I considered it more likely than not that they already knew everything I wanted to tell them, and that they already self-identified as a rationalist, or that they had a contrarian reason for not identifying as such.
So I found their email address. I told them that they were a great reviewer, that I was surprised that they had come to so many correct contrarian conclusions, and that, if they didn't already know, there was a whole culture of people like them.
They replied in ten minutes. They were busy, but they liked what I had to say, and as a matter of fact, a friend had already convinced them to buy Rationality: From AI to Zombies. They said they hadn't read much of it yet because it's so large, but they loved it so far and they wanted to keep reading.
(You might postulate that I only found a review by a user like this because Amazon recommended the book to me on the strength of both of us being interested in Rationality: From AI to Zombies. However, the first review I read by this user was for a book on unusual gardening methods, which I found through a search for books about gardening methods. For the sake of anonymity, however, my unusual gardening methods must remain a secret. It is reasonable to postulate some sort of sampling bias like the one I have described, but given what I know, it is likely that this is not that. You could still postulate a correlation by way of books about unusual gardening methods, however.)
Maybe that extra push made the difference. Maybe if there hadn't been a friend, I would've made the difference.
Who knew that's how my morning would turn out?
As I've said in some of my other posts, but not in so many words, maybe we should start doing this accidentally effective thing deliberately!
I know there's probably controversy about whether or not rationalists should proselytize, but I've been in favor of it for a while. And if you're like me, then I don't think this is a very special effort to make. I'm sure sometimes you see a little thread, and you think, "Wow, they're a lot like me; they're a lot like us, in fact; I wonder if there are other things too. I wonder if they would care about this."
Don't just move on! That's Bayesian evidence!
I dare you to follow that path to its destination. I dare you to reach out. It doesn't cost much.
And obviously there are ways to make yourself look creepy or weird or crazy. But I said to reach out, not to reach out badly. If you can figure out how to do it right, it could have a large impact. And these people are likely to be pretty reasonable. You should keep a lookout in the future.
Speaking of the future, it's worth noting that I ended up reading the first review because of an automated Amazon book recommendation and subsequent curiosity. You know we're in the data. We are out there and there are ways to find us. In a sense, we aren't exactly low-hanging fruit. But in another sense, we are.
I've never read a word of the Methods of Rationality, but I have to shoehorn this in: we need to write the program that sends a Hogwarts acceptance letter to witches and wizards on their eleventh birthday.