Let's talk about the LessWrong Survey.

First and foremost, if you took the survey and hit 'submit', your information was saved and you don't have to take it again.

Your data is safe; nobody stole it or anything like that. If you took the survey and hit the submit button, this post isn't for you.

For the rest of you, I'll put it plainly: I screwed up.

This LessWrong Survey had the lowest turnout since Scott's original survey in 2009. I'll admit I'm not entirely sure why that is, but I have a hunch, and most of the footprints lead back to me. The causes I can point to seem to be the diaspora, poor software, poor advertising, and excessive length.

The Diaspora

As it stands, this year's LessWrong survey got about 300 completed responses, compared with over 1600 for the previous one in 2016. I think one critical difference between this survey and the last was its name. Last year's survey focused on figuring out where the 'Diaspora' was and which venues had gotten users now that LessWrong was sort of the walking dead. It accomplished that well, I think, and part of the reason why is that I titled it the LessWrong Diaspora Survey. That magic word got far-off venues to promote it even when I hadn't asked them to. The survey was posted by Scott Alexander, Ozy Frantz, and others to their respective blogs, and pretty much everyone 'involved in LessWrong' to one degree or another felt like it was meant for them to take.

By contrast, this survey was focused on LessWrong's recovery and revitalization, so I dropped the word Diaspora from the title, and this seems to have caused a ton of confusion. Many people I interviewed to ask why they hadn't taken the survey flat out told me that even though they were sitting in a chatroom dedicated to SSC, and they'd read the Sequences, the survey wasn't about them because they had no affiliation with LessWrong. That certainly wasn't the intent I was trying to communicate.

Poor Software

When I first did the survey in 2016, taking over from Scott, I faced a fairly simple problem: how did I want to host the survey? I could do it the way Scott had, using Google Forms as a survey engine, but this made me wary for a few reasons. One was that I didn't really have a Google account set up that I'd feel comfortable hosting the survey from; another was that I had been unimpressed with what I'd seen of the Google Forms software up to that point in terms of keeping data sanitized on entry. More importantly, it bothered me that I'd basically be handing your data over to Google. The dataset includes a large number of personal questions that I'm not sure most people want Google to have definitive answers on. Moreover, I figured: why the heck do I need Google for this anyway? This is essentially just a webform backed by a datastore, i.e., some of the simplest networking technology known to man in 2016. But I didn't want to write it myself, and I shouldn't have needed to: this is the sort of thing there should be a dozen good self-hosted solutions for.
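To give a sense of just how simple the core of "a webform backed by a datastore" is, here's a minimal sketch using only the Python standard library. This is hypothetical illustration, not anything the survey actually ran: a one-question form, with a JSON-lines file standing in for the datastore.

```python
# Hypothetical minimal sketch of "a webform backed by a datastore",
# using only the Python standard library. Serves a one-question form
# and appends each submission to a JSON-lines file.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import parse_qs

FORM = b"""<form method="post">
  <label>How did you find LessWrong? <input name="referrer"></label>
  <button>Submit</button>
</form>"""

def parse_submission(body: bytes) -> dict:
    """Decode a urlencoded POST body into a flat dict of answers."""
    return {k: v[0] for k, v in parse_qs(body.decode()).items()}

class SurveyHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Show the form.
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(FORM)

    def do_POST(self):
        # Save the answers, one JSON object per line.
        length = int(self.headers.get("Content-Length", 0))
        answers = parse_submission(self.rfile.read(length))
        with open("responses.jsonl", "a") as f:  # the "datastore"
            f.write(json.dumps(answers) + "\n")
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"Thanks! Your response was saved.")

# To run: HTTPServer(("localhost", 8080), SurveyHandler).serve_forever()
```

Of course, a real survey also needs input validation, resumable sessions, theming, and backups, which is exactly the gap a good self-hosted package is supposed to fill.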

There should be, but there's really only LimeSurvey. If I had to give this post an alternate title, it would be "LimeSurvey: An Anti-Endorsement".

I could go on for pages about what's wrong with LimeSurvey, but it can probably be summed up as "the software is bloated and resists customization". It's slow, it uses slick graphics but fails to entirely deliver on functionality, and its inner workings are kind of baroque; it's the sort of thing I probably should have rejected on principle and replaced with something I wrote myself. However, at the time the survey was incredibly overdue, so I felt it would be better to get something expedient out, since everyone was already waiting for it anyway. And the thing is, in 2016 it went well: we got over 3000 responses, counting both partial and complete. So walking away from that victory and going into 2017, I didn't think too hard about the choice to continue using it.

A couple of things changed between 2016 and our running the survey in 2017:

Hosting - My hosting provider, a single individual who sets up strong networking architectures in his basement, had gotten a lot busier since 2016 and wasn't immediately available to handle any issues. The 2016 survey had a number of birthing pains, and his dedicated attention was part of the reason we were able to make it go at all. Since he wasn't around this time, I was more on my own in fixing things.

Myself - I had also gotten a lot busier since 2016, and didn't have nearly as much slack as the last time I ran the survey. So I was relying on having done the whole process in 2016 to insulate me from problems once I opened the thing up.

Both of these would prove disastrous. When I started the survey this time it was slow, it had a variety of bugs and issues I had only limited time to fix, and the issues just kept coming, even more than in 2016, as if the software had decided that now, when I truly didn't have the energy to spare, was when things should break down. These mostly weren't show-stopping bugs; they were minor annoyances. But every minor annoyance reduced turnout, and by leaving them unfixed I was slowly bleeding through the pool of potential respondents.

The straw that finally broke the camel's back for me was when I woke up to find that this message was being shown to most users coming to take the survey:

[Screenshot: message shown to survey respondents telling them their responses 'cannot be saved'.]

"Your responses cannot be saved"? This error meant for when someone had messed up cookies was telling users a vicious lie: That the survey wasn't working right now and there was no point in them taking it.

Looking at this in horror and outrage, after encountering problem after problem mixed with low turnout, I finally pulled the plug.

Poor Advertising

As one email to me mentioned, the 2017 survey didn't even get promoted to the main section of the LessWrong website. This time there were no links from Scott Alexander, nor from the myriad small stakeholders who made it work last time. I'm not blaming them, but as a consequence many of the people I interviewed about why they hadn't taken the survey had not even heard it existed. That has to have been significantly responsible for the reduced turnout compared to last time.

Excessive Length

Of all the things people complained about when I interviewed them on why they hadn't taken the survey, this was easily the most common response. "It's too long."

This year I made the mistake of moving back to a single-page format. The problem with a single page is that it makes clear to respondents just how long the survey really is, and it's simply too long to expect most people to complete. Before I start getting suggestions in the comments: the problem isn't that it needs to be shortened, per se. The problem is that to investigate every question we might want to ask about the community, it really needs to be broken into more than one survey, especially when there are stakeholders who would like a particular section added to answer questions of their own.

Right now I'm exploring the possibility of setting up a site similar to yourmorals so that the survey can be effectively broken up and hosted in a way where users can sign in and take different portions of it at their leisure. Further gamification could be added to help make it a little more fun for people. Which leads into...

The Survey Is Too Much Work For One Person

What we need isn't a guardian of the survey; it's really more like a survey committee. I would be perfectly willing to chair such a committee (and plan to), but I frankly need help. Writing the survey, hosting it without flaws, theming it so it looks nice, writing any new code so we can run it without bugs, comprehensively analyzing the thing: it's a damn lot of work to do right, and so far I've kind of been relying on the generosity of my friends for it. If there are other people who really care about the survey and my ability to do it, consider this my recruiting call to come and help. You can mail me here on LessWrong, post in the comments, or email me at jd@fortforecast.com. If that's something you would be interested in, I could really use the assistance.

What Now?

Honestly? I'm not sure. The way I see it my options look something like:

Call It A Day And Analyze What I've Got - N=300 is nothing to sneeze at; theoretically I could call this whole thing a wash and move on to analysis.

Try And Perform An Emergency Migration - For example, I could try to set the survey up again on Google Forms. Having investigated that option: there's no 'import' button on Google Forms, so all hundred-and-a-half questions would need to be re-entered manually.

Fix Some Of The Errors In LimeSurvey And Try Again On Different Hosting - I considered this too, but the software seemed so clunky that there was no reasonable expectation this wouldn't happen again. LimeSurvey also has poor separation between editing the survey and viewing the survey results, so I couldn't delegate the work to someone else without theoretically violating users' privacy.

These seem to me like the only things possible for this survey cycle; at any rate, another round would require an extension of time. In the long run I would like to organize a project to write new software from scratch that fixes these issues and gives us a site where multiple stakeholders can submit surveys that might be too niche to include in the current LessWrong Survey format.

I welcome other suggestions in the comments; consider this my SOS.

 

18 comments

Thank you for writing and posting this. It's not easy at all to admit that you screwed up, but very useful for others in the community (what use it is to yourself, of course, only you can say!).

RE: assistance with the survey in the future: I am quite willing to provide hosting (via my excellent and reliable hosting provider, NearlyFreeSpeech.net). And (as you no doubt already know), I'll gladly continue to offer design/theming/etc. assistance.

RE: what to do about this year's survey, going forward: my admittedly limited investigation of LimeSurvey gave me the impression that it is unwieldy and convoluted to deploy, so the "fix and try again with a different host" option seems fraught with risk of more screw-ups. I know little about Google Forms, but suspect that migration with preservation of experimental integrity is impossible. I say: call it a day.

First, thank you for admitting a screwup and therefore making it easier for the next person to do so.

Second, I think the survey is valuable and useful both for the community at large and for a personal project of mine, enough to donate my own time to it. What kind of people would you want to answer your SOS? I can make websites and theme things, I can do basic stats analysis, and I don't mind data entry, so if the best way to import questions into Google Forms is to retype them, I can cheerfully do that. I've got a decent amount of experience with Forms, including a useful trick for really long surveys.

Thanks for doing this. I think the survey is pretty valuable and would definitely like to see more people involved in making it happen. If money turned out to be an obstacle* I'd be willing to put up money or help crowdfund it.

(* I don't actually think money will turn out to easily translate into ability-to-do-it, for reasons )

My tentative thought/vote is "make it easier to take parts of the survey and auto-save as you go".

Also I forgot to acknowledge your incredibly generous offer of money.

The good news for your bank account is that money is not a bottleneck, the kind of work that needs to be done is simply not on the table for the amount of money we could conceivably raise to try and have it done.

I used to be a really big fan of EY's The Unit Of Caring essay (and still am), but I've since come to understand that its seemingly airtight reasoning does in fact have some leaky holes. One of them is that the amount of stuff people are willing to give you as in-kind gifts, through free professional contributions and the like, simply dwarfs the funds that might be donated for the vast majority of small causes. For example, employing the services of Said Achmiz would normally cost a staggering amount of money, but as a 'loyal troop' he's willing to put organic effort into the project that I'd be hard-pressed to pay for at any price.

My tentative thought/vote is "make it easier to take parts of the survey and auto-save as you go".

That is not actionable, unless you have a magic way for me to make that happen.

Oh, I was referring to this which had sounded like an actionable plan you were considering:

Right now I'm exploring the possibility of setting up a site similar to yourmorals so that the survey can be effectively broken up and hosted in a way where users can sign in and take different portions of it at their leisure. Further gamification could be added to help make it a little more fun for people. Which leads into...

Ah yes, that very much is the plan, and both things (easy-to-take-parts and auto-save) are definitely must-have features for that project.

I didn't even know that the survey was happening, sorry.

If you do decide to keep running the survey for a little longer, I'd take it, if that data point helps.

if you took the survey and hit 'submit', your information was saved and you don't have to take it again.

I'm not sure this is true.

I took the survey over two sessions: I filled out most of the multiple-choice questions in the first session and most of the long-form questions in the second. When I did my final submitting, I also downloaded a copy of my answers, and was annoyed to find that it didn't contain my long-form responses. At the time I assumed this was just an export error, but you might want to verify that long-form responses from additional sessions actually get saved.

Right now I'm exploring the possibility of setting up a site similar to yourmorals so that the survey can be effectively broken up and hosted in a way where users can sign in and take different portions of it at their leisure.

It may be worth collaborating with the EA community on this, since there is considerable overlap, both in participants and in the kinds of surveys people may be interested in.

Yeah, this survey was pretty disappointing - I had to stop myself from making a negative comment after I took it (though someone else did). I am glad you realized it too, I guess. Even small things were off, like starting with a bunch of questions about the new LessWrong-inspired site, and the spacing between words, let alone the things you mention.

I am honestly a little sad that someone more competent in matters like these, like gwern, didn't take over (as I always assumed would happen if Yvain gave up on doing it), because half-hearted attempts like this probably hurt a lot more than they help - e.g. someone coming back in 4 months and seeing that we went down to only 300 (!) respondents in the annual survey is going to assume LW is even more dead than it really is. This reasoning goes beyond the survey.

I did intend to take over the survey if Yvain stopped, although I didn't tell him, in the hope he would keep doing it rather than turn it over immediately. I'm not sure I would take it over now: the results seem increasingly irrelevant, as I'm not sure the people taking the survey overlap much anymore with those who took the original LW surveys in 2009.

Note, last year's survey was also run by /u/ingres

This LessWrong Survey had the lowest turnout since Scott's original survey in 2009

What is the average turnout per survey, and what has the turnout been year by year?

I believe the following is a comprehensive list of LW-wide surveys and their turnouts. Months are those when the results were reported.

  1. May 2009, 166
  2. December 2011, 1090
  3. December 2012, 1195
  4. January 2014, 1636
  5. January 2015, 1503
  6. May 2016, 3083

And now in the current case we have "about 300" responses, although results haven't been written up and published. I hope they will be. If the only concern is sample size, well, 300 beats zero!
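To answer the average question directly, the six turnouts listed above can be summed up in a few lines (a quick sketch using only the figures from the list; the cut-short 2017 run is excluded):

```python
# Mean turnout across the six pre-2017 LW-wide surveys listed above.
turnouts = {2009: 166, 2011: 1090, 2012: 1195,
            2014: 1636, 2015: 1503, 2016: 3083}

total = sum(turnouts.values())
average = total / len(turnouts)
print(f"total={total}, average={average:.1f}")
# → total=8673, average=1445.5
```

So "about 300" is roughly a fifth of the historical average, and under a tenth of the 2016 peak.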

For myself, I can't recall if I finished it or not, but I have to admit there are now enough surveys in the world that the LessWrong-specific one is unlikely to hold my attention long enough to put much thought into answering. It was awesome for the first few years, but the definition of the community and the centrality of the site have changed such that I don't see the value as much anymore.

I really appreciate the work that went into this - it was a noble attempt, but I believe the world and this community have changed such that it can't be as good as the early ones.

I think "call it a day" is your best bet for 2017. If you/we can field the resources, finding a way to make it more incremental for 2018+ (like smaller sections throughout the year, remembering my previous responses so I don't have to reconsider things that haven't changed, and possibly some federation with diaspora members who add/suggest questions rather than their own mini-surveys) would be awesome in the future.

I'm one of the ~300 people who took the survey.

I would not have thought the process was screwed up unless you had called it a screwup yourself. In fact, I'd suggest it wasn't screwed up much at all. A much lower turnout doesn't seem very surprising to me, or a sign of personal failure on your part (though it might be data that reveals a truth you are personally sad about).

I took the survey because I thought it was the only place to deliver a systematic and democratic expression of my preferences for whether or how to change the LW website (which lots of other people probably don't care about, but I do). I wanted to say something in my answers along the lines of: "please don't 'fix' LW with no regard for what made LW work as much as it did and thereby make it worse; please change the LEAST amount that can possibly be changed, see how that works for a bit, and only then worry about so-called improvements as second-order steps - like, if possible, have it look exactly the same at the UX level on the first pass, while being backed by a new system".

If you want my free advice about the survey: set up a Google Forms "diaspora survey" after getting buy-in from Scott and others to promote it as "relevant to my blog's audience".

Next year have a diaspora survey for sure, and maybe a narrowly scoped LW survey as extra credit if you have time.

The difference in response rates to the surveys is probably an important number in itself ;-)

The surveys feel like an area where "good enough" is the enemy of "at all", and thus the second-order "good enough" is to do the least-effort thing possible, fix any last-minute things that feel extremely painful, sweep problems under the rug, claim retrospectively to have intended whatever the good parts of the outcome were, and blame everything else on lack of time.

If you don't like Google Forms because of privacy... so what? It isn't like people haven't already had their privacy invaded a lot by Google, even for these same kinds of questions a year or two ago, so what's another year's worth of data?

Then just add a small proviso to summaries of the process along the lines of: "hey, this isn't professional work, it's a hobby, so if you want something more or different done, feel free to email me to volunteer for tasks like X, Y, or Z".

Scott/Yvain regularly added such provisos, and his combination of wry self-deprecation and actually doing something was really, really impressive from a distance :-)

I usually look out for the surveys, but until I opened this article I never even knew there was one this year... so yeah, poor advertising.