LessWrong 2016 Survey
It’s time for a new survey!
The details of the last survey can be found here. And the results can be found here.
I posted a few weeks back asking for suggestions for questions to include on the survey. As much as we’d like to include more of them, we all know what happens when we have too many questions. The following graph is from the last survey.
http://i.imgur.com/KFTn2Bt.png
(Source: JD’s analysis of 2014 survey data)
Two factors seem to predict whether a question will get an answer:

- Its position in the survey.
- Whether people want to answer it. (Obviously.)
People answer fewer questions as the survey approaches its end. They also skip tricky questions. The least answered question on the last survey was "What is your favourite LW post? Provide a link", which I assume was mostly skipped because of the effort required either in choosing a favourite or in finding a link to it. The second most skipped questions were the digit-ratio questions, which require more work (getting out a ruler and measuring) compared to the others. This is unsurprising.
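The position effect can be checked directly against raw response data. A minimal sketch, assuming the responses are available as a list of question-to-answer dicts (the data model and names here are hypothetical, not part of the actual survey tooling):

```python
def response_rates(rows, questions):
    """Fraction of respondents giving a non-blank answer to each question,
    listed in survey order so any positional drop-off is visible.

    rows: list of dicts mapping question name -> free-text answer.
    questions: question names in the order they appear on the survey.
    """
    return [
        (q, sum(1 for r in rows if str(r.get(q, "")).strip()) / len(rows))
        for q in questions
    ]
```

Plotting these rates against question position should reproduce the downward slope in the graph linked above.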
This year’s survey is almost the same size as the last one (though just a wee bit smaller). Preliminary estimates suggest you should put aside 25 minutes to take the survey, however you can pause at any time and come back to the survey when you have more time. If you’re interested in helping process the survey data please speak up either in a comment or a PM.
We're focusing this year particularly on getting a glimpse of the size and shape of the LessWrong diaspora. With that in mind, if possible, please make sure that your friends (who might be less connected but still hang around in associated circles) get a chance to see that the survey exists, and if you're up to it, encourage them to fill out a copy of the survey.
The survey is hosted and managed by the team at FortForecast; you'll be hearing more from them soon. The survey can be accessed through http://lesswrong.com/2016survey.
Survey responses are anonymous in that you're not asked for your name. At the end we plan to do an opt-in public dump of the data. Before publication the row order will be scrambled; timestamps, IP addresses, and any other non-survey information will be stripped; and certain questions marked private, such as the (optional) sign-up for our mailing list, will not be included. It helps the most if you say yes to the public dump, but we can understand if you don't.
Thanks to Namespace (JD) and the FortForecast team, the Slack, the #lesswrong IRC on freenode, and everyone else who offered help in putting the survey together. Special thanks to Scott Alexander, whose 2014 survey was the foundation for this one.
When answering the survey, please be mindful of the format of your answers if you want them to be useful. For example, if a question asks for a number, reply with "4", not "four". Going by the last survey we may well get thousands of responses, and cleaning them all by hand would cost a fortune on Mechanical Turk. (And that's just for the ones we can put on Mechanical Turk!) Thanks for your consideration.
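To see why "four" instead of "4" is costly, here is the sort of cleanup pass that free-text numeric answers force on whoever processes the data; a minimal sketch (the function and word table are my own illustration, not part of the survey tooling):

```python
NUMBER_WORDS = {
    "zero": 0, "one": 1, "two": 2, "three": 3, "four": 4,
    "five": 5, "six": 6, "seven": 7, "eight": 8, "nine": 9, "ten": 10,
}

def parse_count(answer):
    """Coerce a free-text answer to an integer if possible.

    Returns None for anything unrecognized -- those are the answers that
    end up needing a human (or Mechanical Turk) to interpret.
    """
    cleaned = answer.strip().lower().rstrip(".")
    if cleaned in NUMBER_WORDS:
        return NUMBER_WORDS[cleaned]
    try:
        return int(cleaned)
    except ValueError:
        return None
```

Every answer that comes back as "4" skips all of this; every "four-ish, I guess" falls through to None and costs someone's time.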
The survey will be open until the 1st of May 2016.
Addendum from JD at FortForecast: During user testing we’ve encountered reports of an error some users get when they try to take the survey which erroneously reports that our database is down. We think we’ve finally stamped it out but this particular bug has proven resilient. If you get this error and still want to take the survey here are the steps to mitigate it:
- Refresh the survey; it will still be broken. You should see a screen with question titles but no questions.
- Press the "Exit and clear survey" button; this will reset your survey responses and allow you to try again fresh.
- Rinse and repeat until you manage to successfully answer the first two questions and move on. It usually doesn't take more than one or two tries. We haven't received reports of the bug occurring past this stage.
If you encounter this please mail jd@fortforecast.com with details. Screenshots would be appreciated but if you don’t have the time just copy and paste the error message you get into the email.
Meta - this took 2 hours to write and was reviewed by the Slack.
My Table of contents can be found here.
How I infiltrated the Raëlians (and was hugged by their leader)
I was invited by a stranger I met on a plane and actually went to a meeting of Raëlians (known in some LW circles as "the flying saucer cult") in Okinawa, Japan. It was right next to Claude Vorilhon's home, and he came himself for the "ceremony" (?) dressed in a theatrical space-y white uniform, complete with a Jewish-style white cap on his head. When giving his "sermon" (?) he spoke in English, and his words were translated into Japanese for the benefit of those who didn't understand. And yes, it's true he talked with me briefly and then hugged me (I understand he does this with all newcomers, and it felt 100% fake to me). I then went on to eat lunch in an izakaya (a Japanese pub) with a group of around 15 members, who were all really friendly and pleasant people. I was actually treated to lunch by them, and afterwards someone gave me a ~20 minute ride to the town I wanted to be in, despite knowing they would never see me again.
If you have ever wondered how it is possible that a flying saucer cult has more members than EA, now is your chance to learn something.
Note: I hope it's clear that I do not endorse creating cults, nor do I proclaim the EA community's inferiority. It hadn't even crossed my mind when I wrote the above line that any LWer would take it as a jab they needed to defend against. I'm merely pointing out that we can learn from anything, whether it's good or bad, and encouraging a fresh discussion on this now that I've gathered some new data.
Let's do this as a Q&A session (I'm at work now so I can't write a long post).
Please ask questions in comments.
Attention! Financial scam targeting Less Wrong users
Recently, multiple suspicious user accounts were created on Less Wrong. These accounts don't post any content in the forum. Instead, they are used only to send private messages to the existing users.
Many users have received a copy of the same message, but different variants exist, too. Here are the examples I know about. If you have received a different variant, please post it in a comment below this article:
Hi good day. My boss is interested on donating to MIRI's project and he is wondering if he could send money through you and you donate to miri through your company and thus accelertaing the value created. He wants to use "match donations" as a way of donating thats why he is looking for people in companies like you. I want to discuss more about this so if you could see this message please give me a reply. Thank you!
I don't know yet about anyone who replied and got scammed, so this is all based on indirect evidence. If you got scammed, please tell me. If you are ashamed, I can publish your story anonymously. Your story could help other potential victims.
Most likely, the scheme is the following:
1. The scammer sends you money.
2. Then they ask for some of the money back, because they changed their mind, or they mistakenly sent you more than they wanted, or their financial situation suddenly changed, or whatever.
3. After receiving the money from you, they flag the original transaction as fraud, so they get back the money they originally sent you, plus the money you sent them back. Then they disappear, or it turns out they used a stolen identity, etc.
(Thanks to
If you replied to the original message and are now in the middle of the process, please inform your bank as soon as possible! Even if step 2 hasn't happened yet, so you can still get out without losing money, warning your bank about the scammer could help other potential victims.
Warning: If you have already received a check or a payment confirmation, and someone is asking you to send the overpayment back quickly, do not send anything. The check or the payment confirmation is fake, and the goal is to make you send money before you find out. (Thanks to
[moderator action] The_Lion and The_Lion2 are banned
Accounts "The_Lion" and "The_Lion2" are banned now. Here is some background, mostly for the users who weren't here two years ago:
User "Eugine_Nier" was banned for retributive downvoting in July 2014. He keeps returning to the website using new accounts, such as "Azathoth123", "Voiceofra", "The_Lion", and he keeps repeating the behavior that got him banned originally.
The original ban was permanent. It will be enforced on all future known accounts of Eugine. (At random moments, because moderators sometimes feel too tired to play whack-a-mole.) This decision is not open to discussion.
Please note that the moderators of LW are the opposite of trigger-happy. Not counting spam, fewer than one account per year is banned, on average. I am writing this explicitly to avoid possible misunderstanding among new users: just because you have read about someone being banned, it doesn't mean that you are now at risk.
Most of the time, LW discourse is regulated by the community voting on articles and comments. Stupid or offensive comments get downvoted; you lose some karma, then everyone moves on. In rare cases, moderators may remove specific content that goes against the rules. The account ban is only used in the extreme cases (plus for obvious spam accounts). Specifically, on LW people don't get banned for merely not understanding something or disagreeing with someone.
What does "retributive downvoting" mean? Imagine that in a discussion you write a comment that someone disagrees with. Then, a few hours later, you find that your karma has dropped by hundreds of points, because someone went through your entire comment history and downvoted every comment you ever wrote on LW, most of them completely unrelated to the debate that "triggered" the downvoter.
Such behavior damages the debate and the community. Unlike downvoting a specific comment, this kind of mass downvoting isn't used to correct a faux pas, but to drive a person away from the website. It has an especially strong impact on new users, who don't know what is going on and may mistake it for a reaction of the whole community. Even in experienced users it creates an "ugh field" around certain topics known to provoke the reaction. Thus a single user achieves disproportionate control over the content and the user base of the website. This is not desired, and will be punished by the site owners and the moderators.
To avoid rules lawyering, there is no exact definition of how much downvoting breaks the rules. The rule of thumb is that you should upvote or downvote each comment based on the value of that specific comment. You shouldn't vote on the comments regardless of their content merely because they were written by a specific user.
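There is deliberately no exact definition, but purely as an illustration, a moderator script in the spirit of the rule above might flag voters whose downvotes are both numerous and heavily concentrated on a single author. Everything here (the data model, the thresholds, the function name) is made up for the sketch:

```python
from collections import Counter

def flag_suspect_voters(votes, min_downvotes=30, concentration=0.8):
    """Flag voters whose downvotes exceed a volume threshold and fall
    overwhelmingly on one author.

    votes: iterable of (voter, comment_author, direction) tuples,
    where direction is +1 for an upvote and -1 for a downvote.
    """
    by_voter = {}
    for voter, author, direction in votes:
        if direction == -1:
            by_voter.setdefault(voter, Counter())[author] += 1
    flagged = []
    for voter, targets in by_voter.items():
        total = sum(targets.values())
        top_author, top_count = targets.most_common(1)[0]
        if total >= min_downvotes and top_count / total >= concentration:
            flagged.append((voter, top_author))
    return flagged
```

A real moderation tool would also need a time window and human review; the point is only that concentration on one author, not downvote volume alone, is the signal.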
Humans are utility monsters
When someone complains that utilitarianism leads to the dust speck paradox or the trolley-car problem, I tell them that's a feature, not a bug. I'm not ready to say that respecting the utility monster is also a feature of utilitarianism, but it is what most people everywhere have always done. A model that doesn't allow for utility monsters can't model human behavior, and certainly shouldn't provoke indignant responses from philosophers who keep right on respecting their own utility monsters.
Making My Peace with Belief
I grew up in an atheistic household.
Almost needless to say, I was relatively hostile towards religion for most of my early life. A few things changed that.
First, the apology of a pastor. A friend of mine was proselytizing at me, and apparently discussed it with his pastor; the pastor apologized to my parents, and explained to my friend he shouldn't be trying to convert people. My friend apologized to me after considering the matter. We stayed friends for a little while afterwards, although I left that school, and we lost contact.
I think that was around the time that I realized that religion is, in addition to being a belief system, a way of life, and not necessarily a bad one.
The next was actually South Park's Mormonism episode, which pointed out that a belief system could be desirable on the merits of the way of life it represented, even if the beliefs themselves are stupid. This tied into Douglas Adams's comment on Feng Shui, that "...if you disregard for a moment the explanation that's actually offered for it, it may be there is something interesting going on" - which is to say, the explanation for the belief is not necessarily the -reason- for the belief, and that stupid beliefs may actually have something useful to offer - which then requires us to ask whether the beliefs are, in fact, stupid.
Which is to say, beliefs may be epistemically irrational while being instrumentally rational.
The next peace I made with belief actually came from quantum physics, and reading about how there were several disparate and apparently contradictory mathematical systems, which all predicted the same thing. It later transpired that they could all be generalized into the same mathematical system, but I hadn't read that far before the isomorphic nature of truth occurred to me: you can have multiple contradictory interpretations of the same evidence that all predict the same thing.
Up to this point, however, I still regarded beliefs as irrational, at least on an epistemological basis.
The next peace came from experiences living in a house that would have convinced most people that ghosts are real, which I have previously written about here. I think there are probably good explanations for every individual experience even if I don't know them, but am still somewhat flummoxed by the fact that almost all the bizarre experiences of my life all revolve around the same physical location. I don't know if I would accept money to live in that house again, which I guess means that I wouldn't put money on the bet that there wasn't something fundamentally odd about the house itself - a quality of the house which I think the term "haunted" accurately conveys, even if its implications are incorrect.
If an AI in a first person shooter dies every time it walks into a green room, and experiences great disutility for death, how many times must it walk into a green room before it decides not to do that anymore? I'm reasonably confident on a rational level that there was nothing inherently unnatural about that house, nothing beyond explanation, but I still won't "walk into the green room."
That was the point at which I concluded that beliefs can be -rational-. Disregard for a moment the explanation that's actually offered for them, and just accept the notion that there may be something interesting going on underneath the surface.
If we were to hold scientific beliefs to the same standard we hold religious beliefs - holding the explanation responsible rather than the predictions - scientific beliefs really don't come off looking that good. The sun isn't the center of the universe; some have called this theory "less wrong" than an earth-centric model of the universe, but that's because the -predictions- are better; the explanation itself is still completely, 100% wrong.
Likewise, if we hold religious beliefs to the same standard we hold scientific beliefs - holding the predictions responsible rather than the explanations - religious beliefs might just come off better than we'd expect.
LessWrong 2.0
Alternate titles: What Comes Next?, LessWrong is Dead, Long Live LessWrong!
You've seen the articles and comments about the decline of LessWrong. Why pay attention to this one? Because this time, I've talked to Nate at MIRI and Matt at Trike Apps about development for LW, and they're willing to make changes and fund them. (I've even found a developer willing to work on the LW codebase.) I've also talked to many of the prominent posters who've left about the decline of LW, and pointed out that the coordination problem could be deliberately solved if everyone decided to come back at once. Everyone who responded expressed displeasure that LW had faded and interest in a coordinated return, and often had some material they thought they could prepare and have ready.
But before we leap into action, let's review the problem.
Weirdness at the wiki
Richard Kennaway has posted about an edit war on the wiki. Richard, thank you.
Unfortunately, I've only used the wiki a little, and don't have a feeling for why the edit history for an article is inaccessible. Is the wiki broken or has someone found a way to hack it? Let it be known that hacking the wiki is something I'll ban for.
VoiceofRa, I'd like to know why you deleted Gleb's article. Presumably you have some reason for why you think it was unsatisfactory.
I'm also notifying tech in the hope of finding out what happened to the edit history.
Take the EA survey, help the EA movement grow and potentially win $250 to your favorite charity
This year's EA Survey is now ready to be shared! This is a survey of all EAs to learn about the movement and how it can improve. The data collected in the survey is used to help EA groups improve and grow EA. Data is also used to populate the map of EAs, create new EA meetup groups, and create EA Profiles and the EA Donation Registry.
If you are an EA or otherwise familiar with the community, we hope you will take it using this link. All results will be anonymised and made publicly available to members of the EA community. As an added bonus, one random survey taker will be selected to win a $250 donation to their favorite charity.
Please share the survey with others who might be interested using this link rather than the one above: http://bit.ly/1OqsVWo
The Winding Path
The First Step
The first step on the path to truth is superstition. We all start there, and should acknowledge that we start there.
Superstition is, contrary to our immediate feelings about the word, the first stage of understanding. Superstition is the attribution of unrelated events to a common (generally unknown or unspecified) cause - it could be called pattern recognition. The "supernatural" component generally included in the definition is superfluous, because "supernatural" merely refers to that which isn't part of nature (which is to say, reality), which is an elaborate way of saying something whose relationship to nature is not yet understood, or else nonexistent. If we discovered that ghosts are real, and identified an explanation - overlapping entities in a many-worlds universe, say - they'd cease to be supernatural and would merely be natural.
Just as the supernatural refers to unexplained or imaginary phenomena, superstition refers to unexplained or imaginary relationships, without the necessity of cause. If you designed an AI in a game which, after five rounds of being killed whenever it went into rooms with green-colored walls, started avoiding rooms with green-colored walls, you've developed a good AI. It is engaging in superstition; it has developed an incorrect understanding of the issue. But it hasn't gone down the wrong path - there is no wrong path in understanding; there is only the mistake of stopping. Superstition, like all belief, is only useful if you're willing to discard it.
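The green-walled-room learner described above fits in a few lines; a minimal sketch, where the class name and threshold are illustrative:

```python
class SuperstitiousAgent:
    """Avoids a room color once deaths there reach a threshold --
    pattern recognition without any model of the underlying cause."""

    def __init__(self, threshold=5):
        self.deaths = {}  # room color -> number of deaths recorded there
        self.threshold = threshold

    def record_death(self, color):
        self.deaths[color] = self.deaths.get(color, 0) + 1

    def will_enter(self, color):
        return self.deaths.get(color, 0) < self.threshold
```

The agent's "belief" is nothing but a counter, and it is useful precisely because it can be discarded: clear the counter and the avoidance disappears.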
The Next Step
Incorrect understanding is the first - and necessary - step to correct understanding. It is, indeed, every step towards correct understanding. Correct understanding is a path, not an achievement, and it is pursued, not by arriving at the correct conclusion in the first place, but by testing your ideas and discarding those which are incorrect.
No matter how intelligent you are, you cannot skip the "incorrect understanding" step of knowledge, because that is every step of knowledge. You must come up with wrong ideas in order to get at the right ones - which will always be one step further. You must test your ideas. And again, the only mistake is stopping, in assuming that you have it right now.
Intelligence is never your bottleneck. The ability to think faster isn't necessarily the ability to arrive at the right answer faster, because reaching the right answer requires many wrong ones - and, more importantly, identifying which answers are indeed wrong, which is the slow part of the process.
Better answers are arrived at by the process of invalidating wrong answers.
The Winding Path
The process of becoming Less Wrong is the process of being, in the first place, wrong. It is the state of realizing that you're almost certainly incorrect about everything - but working on getting incrementally closer to an unachievable "correct". It is a state of anti-hubris, and requires a delicate balance between the idea that one can be closer to the truth, and the idea that one cannot actually achieve it.
The art of rationality is the art of walking this narrow path. If ever you think you have the truth - discard that hubris, for three steps from here you'll see it for superstition, and if you cannot see that, you cannot progress, and there your search for truth will end. That is the path of the faithful.
But worse, the path is not merely narrow, but winding, with frequent dead ends requiring frequent backtracking. If ever you think you're closer to the truth - discard that hubris, for it may inhibit you from leaving a dead end, and there your search for truth will end. That is the path of the crank.
The path of rationality is winding and directionless. It may head towards beauty, then towards ugliness; towards simplicity, then complexity. The correct direction isn't the aesthetic one; those who head towards beauty may create great art, but do not find truth. Those who head towards simplicity might open new mathematical doors and find great and useful things inside - but they don't find truth, either. Truth is its own path, found only by discarding what is wrong. It passes through simplicity, it passes through ugliness; it passes through complexity, and also beauty. It doesn't belong to any one of these things.
The path of rationality is a path without destination.
Written as an experiment in the aesthetic of Less Wrong. I'd appreciate feedback into the aesthetic interpretation of Less Wrong, rather than the sense of deep wisdom emanating from it (unless the deep wisdom damages the aesthetic).