Less Wrong is a community blog devoted to refining the art of human rationality.

Comment author: Raemon 08 October 2017 05:59:40AM 7 points [-]

Thanks for doing this. I think the survey is pretty valuable and would definitely like to see more people involved to make it happen. If money turned out to be an obstacle* I'd be willing to pay for crowdfunding stuff to make it happen.

(* I don't actually think money will turn out to easily translate into ability-to-do-it, for reasons.)

My tentative thought/vote is "make it easier to take parts of the survey and auto-save as you go".

Comment author: ingres 08 October 2017 07:00:24AM 5 points [-]

Also I forgot to acknowledge your incredibly generous offer of money.

The good news for your bank account is that money is not a bottleneck, the kind of work that needs to be done is simply not on the table for the amount of money we could conceivably raise to try and have it done.

I used to be a really big fan of EY's The Unit Of Caring essay (and still am), but I've since come to understand that its seemingly bullet-tight reasoning does in fact have some leaky holes. One of them is that the amount of stuff people are willing to give you as in-kind gifts, through free professional contributions and the like, simply dwarfs the funds that might be donated for the vast majority of small causes. For example, employing the services of Said Achmiz would normally cost a staggering amount of money, but as a 'loyal troop' he's willing to put organic effort into the project that I'd be hard-pressed to pay for at any price.

Comment author: Raemon 08 October 2017 06:20:31AM 1 point [-]

Oh, I was referring to this which had sounded like an actionable plan you were considering:

Right now I'm exploring the possibility of setting up a site similar to yourmorals so that the survey can be effectively broken up and hosted in a way where users can sign in and take different portions of it at their leisure. Further gamification could be added to help make it a little more fun for people. Which leads into...

Comment author: ingres 08 October 2017 06:50:33AM 0 points [-]

Ah yes, it very much is, and both things (easy-to-take-parts and auto-save) are definitely must-have features for that project.


Comment author: ingres 08 October 2017 06:06:25AM 0 points [-]

My tentative thought/vote is "make it easier to take parts of the survey and auto-save as you go".

That is not actionable, unless you have a magic way for me to make that happen.

HOWTO: Screw Up The LessWrong Survey and Bring Great Shame To Your Family

25 ingres 08 October 2017 03:43AM

Let's talk about the LessWrong Survey.

First and foremost, if you took the survey and hit 'submit', your information was saved and you don't have to take it again.

Your data is safe; nobody took it or anything, it's not like that. If you took the survey and hit the submit button, this post isn't for you.

For the rest of you, I'll put it plainly: I screwed up.

This LessWrong Survey had the lowest turnout since Scott's original survey in 2009. I'll admit I'm not entirely sure why that is, but I have a hunch and most of the footprints lead back to me. The causes I can finger seem to be the diaspora, poor software, poor advertising, and excessive length.

The Diaspora

As it stands, this year's LessWrong Survey got about 300 completed responses, compared with over 1600 for the previous one in 2016. I think one critical difference between this survey and the last was its name. Last year the survey focused on figuring out where the 'Diaspora' was and what venues had gotten users now that LessWrong was sort of the walking dead. It accomplished that well, I think, and part of the reason why is that I titled it the LessWrong Diaspora Survey. That magic word got far-off venues to promote it even when I hadn't asked them to. The survey was posted by Scott Alexander, Ozy Frantz, and others to their respective blogs, and pretty much everyone 'involved in LessWrong' to one degree or another felt like it was meant for them to take. By contrast, this survey was focused on LessWrong's recovery and revitalization, so I dropped the word Diaspora from the title, and this seems to have caused a ton of confusion. Many people I interviewed to ask why they hadn't taken the survey flat-out told me that even though they were sitting in a chatroom dedicated to SSC, and they'd read the Sequences, the survey wasn't about them because they had no affiliation with LessWrong. Certainly that wasn't the intent I was trying to communicate.

Poor Software

When I first did the survey in 2016, taking over from Scott, I faced a fairly simple problem: how did I want to host the survey? I could do it the way Scott had done it, using Google Forms as a survey engine, but this made me wary for a few reasons. One was that I didn't really have a Google account set up that I'd feel comfortable hosting the survey from; another was that I had been unimpressed with what I'd seen of the Google Forms software up to that point in terms of keeping data sanitized on entry. More importantly, it bothered me that I'd basically be handing your data over to Google. This dataset includes a large number of personal questions that I'm not sure most people want Google to have definitive answers on. Moreover, I figured: why the heck do I need Google for this anyway? This is essentially just a webform backed by a datastore, i.e. some of the simplest networking technology known to man in 2016. But I didn't want to write it myself, and I shouldn't have needed to: this is the sort of thing there should be a dozen good self-hosted solutions for.

There should be, but there's really only LimeSurvey. If I had to give this post an alternate title, it would be "LimeSurvey: An Anti-Endorsement".

I could go on for pages about what's wrong with LimeSurvey, but it can probably be summed up as "the software is bloated and resists customization". It's slow, it uses slick graphics but fails to fully deliver on functionality, and its inner workings are kind of baroque; it's the sort of thing I probably should have rejected on principle and replaced with something I wrote myself. However, at that time the survey was incredibly overdue, so I felt it would be better to ship something expedient since everyone was already waiting for it anyway. And the thing is, in 2016 it went well. We got over 3000 responses, including both partial and complete. So walking away from that victory and going into 2017, I didn't think too hard about the choice to continue using it.
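For a sense of scale, the "webform backed by a datastore" described above really is tiny. As a rough sketch of the storage half, every name here is hypothetical and it uses only Python's standard library; it's an illustration of how simple self-hosting could be, not anyone's actual implementation:

```python
import json
import sqlite3

# Hypothetical sketch of the storage half of a self-hosted survey:
# one table mapping a respondent's token to a JSON blob of answers.
def init_db(conn):
    conn.execute(
        "CREATE TABLE IF NOT EXISTS responses ("
        "  token TEXT PRIMARY KEY,"
        "  answers TEXT NOT NULL"
        ")"
    )

def save_response(conn, token, answers):
    # INSERT OR REPLACE lets a respondent re-submit (or auto-save) freely.
    conn.execute(
        "INSERT OR REPLACE INTO responses (token, answers) VALUES (?, ?)",
        (token, json.dumps(answers)),
    )

def load_response(conn, token):
    row = conn.execute(
        "SELECT answers FROM responses WHERE token = ?", (token,)
    ).fetchone()
    return json.loads(row[0]) if row else {}

conn = sqlite3.connect(":memory:")
init_db(conn)
save_response(conn, "abc123", {"age": "30", "profession": "Computers"})
print(load_response(conn, "abc123"))  # → {'age': '30', 'profession': 'Computers'}
```

The hard parts of a real survey engine (question rendering, validation, access control) live on top of this, but the core really is a form posting to a few dozen lines of code.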

A couple of things changed between 2016 and our running the survey in 2017:

Hosting - My hosting provider, a single individual who sets up strong networking architectures in his basement, had gotten a lot busier since 2016 and wasn't immediately available to handle any issues. The 2016 survey had a number of birthing pains, and his dedicated attention was part of the reason why we were able to make it go at all. Since he wasn't here this time, I was more on my own in fixing things.

Myself - I had also gotten a lot busier since 2016. I didn't have nearly as much slack as I did the last time I did it. So I was sort of relying on having done the whole process in 2016 to insulate me from opening the thing up to a bunch of problems.

Both of these would prove disastrous. When I started the survey this time it was slow, it had a variety of bugs and issues I had only limited time to fix, and the issues just kept coming, even more than in 2016, as if the software had decided that now, when I truly didn't have the energy to spare, was when things should break down. These mostly weren't show-stopping bugs; they were minor annoyances. But every minor annoyance reduced turnout, and by leaving them unfixed I was slowly bleeding through the pool of potential respondents.

The straw that finally broke the camel's back for me was when I woke up to find that this message was being shown to most users coming to take the survey:

[Screenshot: message shown to survey respondents telling them their responses 'cannot be saved'.]

"Your responses cannot be saved"? This error, meant for when someone had messed up their cookies, was telling users a vicious lie: that the survey wasn't working right now and there was no point in taking it.

Looking at this in horror and outrage, after encountering problem after problem mixed with low turnout, I finally pulled the plug.

Poor Advertising

As one email to me mentioned, the 2017 survey didn't even get promoted to the Main section of the LessWrong website. This time there were no links from Scott Alexander, nor from the myriad small stakeholders that made it work last time. I'm not blaming them or anything, but as a consequence many of the people I interviewed about why they hadn't taken the survey had not even heard it existed. Certainly this had to have been significantly responsible for the reduced turnout compared to last time.

Excessive Length

Of all the things people complained about when I interviewed them on why they hadn't taken the survey, this was easily the most common response. "It's too long."

This year I made the mistake of moving back to a single-page format. The problem with a single-page format is that it makes clear to respondents just how long the survey really is. It's simply too long to expect most people to complete. And before I start getting suggestions in the comments: the problem isn't actually that it needs to be shortened, per se. The problem is that to investigate every question we might want to know about the community, it really needs to be broken into more than one survey, especially when there are stakeholders involved who would like to see a particular section added to answer questions they have.
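To be concrete about what "broken into more than one survey" might mean mechanically, here is a rough sketch of per-section progress tracking. Everything here (section names, class, methods) is hypothetical, an illustration rather than a design anyone has committed to:

```python
# Hypothetical sketch (not an existing design): the survey split into named
# sections that a signed-in user can complete independently, in any order.
SECTIONS = ["demographics", "lesswrong_use", "politics", "calibration"]

class SurveyProgress:
    def __init__(self):
        # user -> {section -> answers}; a real site would persist this.
        self.saved = {}

    def save_section(self, user, section, answers):
        if section not in SECTIONS:
            raise ValueError("unknown section: " + section)
        self.saved.setdefault(user, {})[section] = answers

    def remaining(self, user):
        done = self.saved.get(user, {})
        return [s for s in SECTIONS if s not in done]

    def completion(self, user):
        # Fraction of sections finished, usable for a progress bar.
        return 1 - len(self.remaining(user)) / len(SECTIONS)

progress = SurveyProgress()
progress.save_section("alice", "demographics", {"age": 30})
print(progress.remaining("alice"))  # → ['lesswrong_use', 'politics', 'calibration']
print(progress.completion("alice"))  # → 0.25
```

The appeal of this shape is that stakeholders can add new sections without lengthening anyone's single sitting, and a visible completion fraction is the hook for the gamification mentioned below.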

Right now I'm exploring the possibility of setting up a site similar to yourmorals so that the survey can be effectively broken up and hosted in a way where users can sign in and take different portions of it at their leisure. Further gamification could be added to help make it a little more fun for people. Which leads into...

The Survey Is Too Much Work For One Person

What we need isn't a guardian of the survey; it's really more like a survey committee. I would be perfectly willing to chair such a committee (and plan to), but I frankly need help. Writing the survey, hosting it without flaws, theming it so that it looks nice, writing any new code or web infrastructure so that we can host it without bugs, comprehensively analyzing the thing: it's a damn lot of work to do it right, and so far I've kind of been relying on the generosity of my friends for it. If there are other people who really care about the survey and my ability to do it, consider this my recruiting call for you to come and help. You can mail me here on LessWrong, post in the comments, or email me at jd@fortforecast.com. If that's something you would be interested in, I could really use the assistance.

What Now?

Honestly? I'm not sure. The way I see it my options look something like:

Call It A Day And Analyze What I've Got - N=300 is nothing to sneeze at; theoretically I could just call this whole thing a wash and move on to analysis.

Try And Perform An Emergency Migration - For example, I could try to set this up again on Google Forms. Having investigated that option, there's no 'import' button on Google Forms, so the survey would need to be re-entered manually for all hundred-and-a-half questions.

Fix Some Of The Errors In LimeSurvey And Try Again On Different Hosting - I considered doing this too, but it seemed to me like the software was so clunky that there was simply no reasonable expectation this wouldn't happen again. LimeSurvey also has poor separation between being able to edit the survey and being able to view the survey results, so I couldn't delegate the work to someone else without theoretically violating users' privacy.

These seem to me like the only things that are possible for this survey cycle; at any rate, an extension of time would be required for another round. In the long run I would like to organize a project to write new survey software from scratch that fixes these issues and gives us a site to which multiple stakeholders can submit surveys that might be too niche to include in the current LessWrong Survey format.

I welcome other suggestions in the comments; consider this my SOS.

 

Comment author: berekuk 05 October 2017 01:11:08PM 1 point [-]

So, what happened?

This post is hidden from Main and the survey "is expired and no longer available", even though the post mentions that it should run for 10 more days. I wanted to share it with the Russian LW community; will it be back in some form later?

Comment author: ingres 07 October 2017 02:06:34AM 0 points [-]

Right sorry, I got distracted by life a bit there. I'll write up a post explaining what happened to the LW Survey soon and where I'm planning to go from here.

In response to Feedback on LW 2.0
Comment author: ingres 01 October 2017 07:23:49PM 6 points [-]

Hi, over here at the LessWrong Survey team we've also been collecting reactions to LW 2.0:

https://lwsurvey.obormot.net/Reports/2017-EarlyReport1

https://lwsurvey.obormot.net/Main/ResponsesFromSSC

Comment author: Benito 17 September 2017 07:06:45PM 5 points [-]

FYI R:AZ is shorter than The Sequences by a factor of 2, which I think is a substantial improvement. Not that it couldn't be shorter still ;-)

Comment author: ingres 17 September 2017 08:41:54PM 1 point [-]

Oh huh, TIL. Thanks!

Comment author: DragonGod 17 September 2017 01:37:12AM 1 point [-]

I expect that for most domains (possibly all), LessWrong consensus is more likely to be right than wrong. I haven't yet seen reason to believe otherwise (it seems you have?).

Comment author: ingres 17 September 2017 03:42:37PM *  1 point [-]

Just so we're clear here:

Profession (Results from 2016 LessWrong Survey)

Profession                                      Change      N    Share
Art                                             +0.800%    51    2.300%
Biology                                         +0.300%    49    2.200%
Business                                        -0.800%    72    3.200%
Computers (AI)                                  +0.700%    79    3.500%
Computers (other academic, computer science)    -0.100%   156    7.000%
Computers (practical)                           -1.200%   681   30.500%
Engineering                                     +0.600%   150    6.700%
Finance / Economics                             +0.500%   116    5.200%
Law                                             -0.300%    50    2.200%
Mathematics                                     -1.500%   147    6.600%
Medicine                                        +0.100%    49    2.200%
Neuroscience                                    +0.100%    28    1.300%
Philosophy                                       0.000%    54    2.400%
Physics                                         -0.200%    91    4.100%
Psychology                                       0.000%    48    2.100%
Other                                           +2.199%   277   12.399%
Other "hard science"                            -0.500%    26    1.200%
Other "social science"                          -0.200%    48    2.100%

The LessWrong consensus is massively overweighted in one particular field of expertise (computing) with some marginal commentators who happen to do other things.

As for evidence to believe otherwise, how about all of recorded human history? When has there ever been a group whose consensus was more likely to be right than wrong in all domains of human endeavor? What ludicrous hubris; the sheer arrogance on display in this comment cowed me, and I briefly considered whether I'm hanging out in the right place by posting here.

Comment author: DragonGod 17 September 2017 09:32:00AM 2 points [-]

Pie in the sky: the Yudkowsky sequences edited, condensed, and put into an Aristotelian/Thomistic/Scholastic order. (Not that Aristotle or Thomas Aquinas ever did this, but the tradition of the scholastics was always to get this pie in the sky.) It might be interesting to see what an experienced book editor would advise doing with this material.

Doesn't Rationality: From AI to Zombies achieve this already?

Comment author: ingres 17 September 2017 01:46:37PM 0 points [-]

Rat:A-Z is like... a slight improvement over EY's first draft of the Sequences. I think when Craig says condensed he has much more substantial editing in mind.

Comment author: Habryka 16 September 2017 11:26:35PM 1 point [-]

A wiki feels like too high a barrier to entry to me, though maybe there's some cool new wiki software that's better than what I remember.

For now I feel like having an about page on LessWrong that links to all the posts, and tries to summarize the state of discussion and information, is the better choice, until we reach the stage where LW gets a lot more open-source engagement and is owned more by a large community again.

Comment author: ingres 17 September 2017 12:32:21AM 3 points [-]

Seconding SaidAchmiz on pmwiki, it's what we use for our research project on effective online organizing and it works wonders. It's also how I plan to host and edit the 2017 survey results.

As far as the high barrier to entry goes, I'll repeat here my previous offer to set up a high quality instance of pmwiki and populate it with a reasonable set of initial content - for free. I believe this is sufficiently important that if the issue is you just don't have the capacity to get things started I'm fully willing to help on that front.
