
HOWTO: Screw Up The LessWrong Survey and Bring Great Shame To Your Family

25 ingres 08 October 2017 03:43AM

Let's talk about the LessWrong Survey.

First and foremost, if you took the survey and hit 'submit', your information was saved and you don't have to take it again.

Your data is safe; nobody stole it or anything like that. If you took the survey and hit the submit button, this post isn't for you.

For the rest of you, I'll put it plainly: I screwed up.

This LessWrong Survey had the lowest turnout since Scott's original survey in 2009. I'll admit I'm not entirely sure why that is, but I have a hunch, and most of the footprints lead back to me. The causes I can identify seem to be the diaspora, poor software, poor advertising, and excessive length.

The Diaspora

As it stands, this year's LessWrong survey got about 300 completed responses. This can be compared with the previous one in 2016, which got over 1600. I think one critical difference between this survey and the last was its name. Last year the survey focused on figuring out where the 'Diaspora' was and what venues had gotten users now that LessWrong was sort of the walking dead. It accomplished that well, I think, and part of the reason why is that I titled it the LessWrong Diaspora Survey. That magic word got far-off venues to promote it even when I hadn't asked them to. The survey was posted by Scott Alexander, Ozy Frantz, and others to their respective blogs, and pretty much everyone 'involved in LessWrong' to one degree or another felt like it was meant for them to take. By contrast, this survey was focused on LessWrong's recovery and revitalization, so I dropped the word Diaspora from it, and this seems to have caused a ton of confusion. Many of the people I interviewed about why they hadn't taken the survey flat-out told me that even though they were sitting in a chatroom dedicated to SSC, and they'd read the Sequences, the survey wasn't about them because they had no affiliation with LessWrong. That certainly wasn't the intent I was trying to communicate.

Poor Software

When I first did the survey in 2016, taking over from Scott, I faced a fairly simple problem: how do I want to host the survey? I could do it the way Scott had done it, using Google Forms as a survey engine, but this made me wary for a few reasons. One was that I didn't really have a Google account set up that I'd feel comfortable hosting the survey from; another was that I had been unimpressed with what I'd seen from the Google Forms software up to that point in terms of keeping data sanitized on entry. More importantly, it kind of bothered me that I'd basically be handing your data over to Google. This dataset includes a large number of personal questions that I'm not sure most people want Google to have definitive answers on. Moreover I figured: why the heck do I need Google for this anyway? This is essentially just a webform backed by a datastore, i.e., some of the simplest networking technology known to man in 2016. But I didn't want to write it myself, and I shouldn't have needed to; this is the sort of thing there should be a dozen good self-hosted solutions for.
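To underline how simple the core problem is, here is a minimal sketch of "a webform backed by a datastore" in Python, using Flask and SQLite. The route names and schema are invented for illustration; a real survey engine would also need sessions, validation, multi-page flow, and result export:

```python
import sqlite3
from flask import Flask, request

app = Flask(__name__)
DB = "survey.db"

def init_db():
    with sqlite3.connect(DB) as conn:
        conn.execute("CREATE TABLE IF NOT EXISTS responses (answer TEXT)")

@app.route("/")
def form():
    # One free-text question; a real survey would render many fields.
    return ('<form method="post" action="/submit">'
            '<input name="answer"><button>Submit</button></form>')

@app.route("/submit", methods=["POST"])
def submit():
    with sqlite3.connect(DB) as conn:
        # A parameterized query keeps the data sanitized on entry.
        conn.execute("INSERT INTO responses (answer) VALUES (?)",
                     (request.form["answer"],))
    return "Thanks! Your response was saved."

if __name__ == "__main__":
    init_db()
    app.run()
```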

There should be, but there's really only LimeSurvey. If I had to give this post an alternate title, it would be "LimeSurvey: An anti-endorsement".

I could go on for pages about what's wrong with LimeSurvey, but it can probably be summed up as "the software is bloated and resists customization". It's slow, it uses slick graphics but fails to entirely deliver on functionality, and its inner workings are kind of baroque; it's the sort of thing I probably should have rejected on principle in favor of writing my own. However, at that time the survey was incredibly overdue, so I felt it would be better to just get out something expedient, since everyone was already waiting for it anyway. And the thing is, in 2016 it went well. We got over 3000 responses, including both partial and complete. So walking away from that victory and going into 2017, I didn't really think too hard about the choice to continue using it.

A couple of things changed between 2016 and our running the survey in 2017:

Hosting - My hosting provider, a single individual who sets up strong networking architectures in his basement, had gotten a lot busier since 2016 and wasn't immediately available to handle any issues. The 2016 survey had a number of birthing pains, and his dedicated attention was part of the reason we were able to make it go at all. Since he wasn't available this time, I was more on my own in fixing things.

Myself - I had also gotten a lot busier since 2016. I didn't have nearly as much slack as I did the last time. So I was sort of relying on having done the whole process in 2016 to insulate me from problems when I opened the thing up.

Both of these would prove disastrous. When I started the survey this time, it was slow, it had a variety of bugs and issues I had only limited time to fix, and the issues just kept coming, even more than in 2016, as if the software had decided that now, when I truly didn't have the energy to spare, was when things should break down. These mostly weren't show-stopping bugs, though; they were minor annoyances. But every minor annoyance reduced turnout, and by leaving them unfixed I was slowly bleeding through the pool of potential respondents.

The straw that finally broke the camel's back for me was when I woke up to find that this message was being shown to most users coming to take the survey:

[Image: the message shown to survey respondents, telling them their responses 'cannot be saved'.]

"Your responses cannot be saved"? This error meant for when someone had messed up cookies was telling users a vicious lie: That the survey wasn't working right now and there was no point in them taking it.

Looking at this in horror and outrage, after encountering problem after problem mixed with low turnout, I finally pulled the plug.

Poor Advertising

As one email to me mentioned, the 2017 survey didn't even get promoted to the main section of the LessWrong website. This time there were no links from Scott Alexander, nor from the myriad small stakeholders who made it work last time. I'm not blaming them or anything, but as a consequence many of the people I interviewed about why they hadn't taken the survey had not even heard that it existed. Certainly this had to have been significantly responsible for the reduced turnout compared to last time.

Excessive Length

Of all the things people complained about when I interviewed them on why they hadn't taken the survey, this was easily the most common: "It's too long."

This year I made the mistake of moving back to a single-page format. The problem with a single-page format is that it makes it clear to respondents just how long the survey really is. It's simply too long to expect most people to complete it. And before I start getting suggestions in the comments: the problem isn't actually that it needs to be shortened, per se. The problem is that to investigate every question we might want to know about the community, it really needs to be broken into more than one survey. Especially when there are stakeholders involved who would like to see a particular section added to satisfy some questions they have.

Right now I'm exploring the possibility of setting up a site similar to yourmorals so that the survey can be effectively broken up and hosted in a way where users can sign in and take different portions of it at their leisure. Further gamification could be added to help make it a little more fun for people. Which leads into...

The Survey Is Too Much Work For One Person

What we need isn't a guardian of the survey, it's really more like a survey committee. I would be perfectly willing to chair such a committee (and plan to), but I frankly need help. Writing the survey, hosting it without flaws, theming it so that it looks nice, writing any new code or web things so that we can host it without bugs, comprehensively analyzing the thing: it's a damn lot of work to do it right, and so far I've kind of been relying on the generosity of my friends for it. If there are other people who really care about the survey and my ability to do it, consider this my recruiting call for you to come and help. You can mail me here on LessWrong, post in the comments, or email me at jd@fortforecast.com. If that's something you would be interested in, I could really use the assistance.

What Now?

Honestly? I'm not sure. The way I see it my options look something like:

Call It A Day And Analyze What I've Got - N=300 is nothing to sneeze at; theoretically I could just call this round done and move on to analysis.

Try And Perform An Emergency Migration - For example, I could try to set the survey up again on Google Forms. Having investigated that option: there's no 'import' button on Google Forms, so all hundred-and-a-half questions would need to be reentered manually.

Fix Some Of The Errors In LimeSurvey And Try Again On Different Hosting - I considered doing this too, but it seemed to me like the software was so clunky that there was simply no reasonable expectation this wouldn't happen again. LimeSurvey also has poor separation between being able to edit the survey and being able to view the survey results, so I couldn't delegate the work to someone else without theoretically violating users' privacy.

These seem to me like the only things that are possible for this survey cycle; at any rate, an extension of time would be required for another round. In the long run I would like to organize a project to write new software from scratch that fixes these issues and gives us a site where multiple stakeholders can submit surveys that might be too niche to include in the current LessWrong Survey format.

I'm open to other suggestions in the comments; consider this my SOS.

 

2017 LessWrong Survey

21 ingres 13 September 2017 06:26AM

The 2017 LessWrong Survey is here! This year we're interested in community response to the LessWrong 2.0 initiative. I've also gone through and fixed as many bugs as I could find reported on the last survey, and reintroduced items that were missing from the 2016 edition. Furthermore, new items have been introduced in multiple sections, and some cut in others to make room. You can now export your survey results after finishing by choosing the 'print my results' option on the page displayed after submission. The survey will run from today until the 15th of October.

You can take the survey below; thanks for your time. (It's back in single-page format, so please allow a few seconds for it to load.)

Click here to take the survey

Requesting Questions For A 2017 LessWrong Survey

6 ingres 09 April 2017 12:48AM

It's been twelve months since the last LessWrong Survey, which means we're due for a new one. But before I can put out a new survey in earnest, I feel obligated to solicit questions from community members and check in on any ideas that might be floating around for what we should ask.

The basic format of the thread isn't too complex: just pitch questions. For the best chances of inclusion, however, it's best to include:

  • A short cost/benefit analysis of including the question. Keep in mind that some questions are too invasive or embarrassing to be reasonably included. Other questions might leak too many bits. There is limited survey space and some items might be too marginal to include at the cost of others.
  • An example of a useful analysis that could be done with the question(s), especially an interesting analysis in concert with other questions. E.g., it's best to start with a larger question like "how does parental religious denomination affect the cohort's current religion?" and then translate that into concrete questions about religion.
  • Some idea of how the question can be done without using write-ins. Unfortunately write-in questions add massive amounts of man-hours to the total analysis time for a survey and make it harder to get out a final product when all is said and done.

The last survey included 148 questions; some sections will not be repeated in the 2017 survey, which gives us an estimate of our question budget. I would prefer not to go over 150 questions, and if at all possible to come in at many fewer than that. Removed sections are:

  • The Basilisk section on the last survey provided adequate information on the phenomena it was surveying, and I do not currently plan to include it again on the 2017 survey. This frees up six questions.
  • The LessWrong Feedback portion of the last survey also provided adequate information, and I would prefer to replace it on the 2017 survey with a section measuring the site's recovery, if any. This frees up 19 questions.

I also plan to make significant reforms to multiple portions of the survey. I'm particularly interested in making changes to:

  • The politics section. In particular I would like to update the questions about feelings on political issues with new entries and overhaul some of the options on various questions.
  • I handled the calibration section poorly last year, and would like to replace it this year with an easily scored set of questions. To be more specific, good calibration questions should:
    • Be fermi-estimable with no more than a standard 5th grade education. They should not rely on particular hidden knowledge or overly specific information. E.g., "Who wrote the Foundation novels?" is a terrible calibration question and "What is the height of the Eiffel Tower in meters, within a multiple of 1.5?" is decent.
    • Have a measurable distance component, so that even if an answer is wrong (as the vast majority of answers will be) it can still be scored.
    • Use a measure of distance that gets proportionately smaller the closer an answer is to being correct and proportionately larger the further it is from being correct (see the sketch after this list).
    • Be easily (or at least sanely) calculable by programmatic methods.
  • The probabilities section is probably due for some revision; I know that in previous years I haven't even answered it, because I found the wording of some questions too confusing to even consider.
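To make the distance requirement concrete, here is a minimal sketch of how numeric calibration answers could be scored, assuming a log-ratio distance. The function name and the particular scoring choice are illustrative assumptions, not a committed design:

```python
import math

def calibration_distance(estimate, truth):
    """Distance between an estimate and the true value, as a log ratio.

    0.0 means exactly right, and the penalty grows smoothly and
    symmetrically: guessing double the truth scores the same as
    guessing half of it.
    """
    if estimate <= 0 or truth <= 0:
        raise ValueError("log-ratio distance needs positive values")
    return abs(math.log(estimate / truth))

# Example: the Eiffel Tower is roughly 300 m tall. A guess of 400 m
# gives log(400/300) ~= 0.29, inside the factor-of-1.5 band
# (log(1.5) ~= 0.41), so it counts as "close enough" by the standard
# in the Eiffel Tower question above.
print(calibration_distance(400, 300) <= math.log(1.5))  # True
```

This covers the last three criteria above: a score of zero at the true answer, a smoothly growing penalty in both directions, and a one-line computation per response.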

So for maximum chances of inclusion, it would be best to keep these proposed reforms in mind with your suggestions.

(Note: If you have suggestions on questions to eliminate, I'd be glad to hear those too.)

Inbox zero - A guide - v2 (Instrumental behaviour)

2 Elo 11 March 2017 09:34AM

This post is modified from the original.

Original post: Instrumental behaviour: Inbox zero - A guide

This version was first posted at: http://bearlamp.com.au/instrumental-behaviour-inbox-zero-a-guide-v2/


This will be brief.

Inbox zero is a valuable thing to maintain.  Roughly speaking, it means keeping an empty inbox, and it's promoted all around the web.

An email inbox collects a few things:

  • junk
  • automatic mail sent to you
  • personal mail sent to you
  • work sent to you
  • (maybe - work you send to yourself because that's the best way to store information for now)

An inbox is a way to keep track of "how much I have to do yet".  But that's not really what it is.  Somewhere along the line, "I will send via courier a hand-scribed letter to yonder" became newsletters, essays, spam, and many more things mixed together.  Because of this, it's pretty hard to tell how much work is really in an inbox.  Is it 5 minutes to read this one, or do I have to write an essay back?  It's pretty important to understand what volume of work awaits you.  The trick to doing this is the incredibly valuable task of getting to inbox zero.

The basic philosophy is that a full inbox and unread emails are not a good way to keep at bay the unknowns of "how much work I have to do".  Instead, other lists, folders, or organisation systems are better at that.  And if you don't already do it, have ONE list (okay, this advice is complicated: there are different types of lists, but if you have more than one list of the same type, you are bound to muddle up your process and end up doing the things you didn't need to do instead of the ones you did).

This guide is for anyone with bajillions of emails in their inbox, some read, some not.  If you have an email system in place, don't change it.  If not, get one.  (Maybe not this one, but get one.)


0. Decide that this is a good idea (this can be done after), but mostly I want to say: don't half-arse this, or you might end up in a no-man's-land between the old and the new.

1. A program.  

I recommend Thunderbird because it's free.  I used to work in a webmail system, but the speed of webmail is a joke in comparison to local mail.  Offline powers are also handy from time to time.  (Disadvantage: not always having backups for everything.  Alternative: IMAP, which duplicates mail online and offline.)

2. Archive system

This being 2017, we are going to make a few main folders:

  • Old as all hell (or other friendly name)
  • 2015
  • 2016
  • 2017

Anything from 2014 or earlier will probably never get looked at again (just ask any email veteran).  That's okay; that's what archives are for.

3. Old

Put anything old into the old folder.  (A sketch of automating this step follows.)
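A minimal sketch of automating step 3 with Python's standard imaplib, assuming your mail lives on an IMAP server. The host, credentials, and folder name are placeholders, and the archive folder must already exist on the server:

```python
import imaplib

# Placeholder host and credentials; substitute your provider's IMAP
# server and (ideally) an app-specific password.
HOST, USER, PASSWORD = "imap.example.com", "you@example.com", "app-password"

with imaplib.IMAP4_SSL(HOST) as imap:
    imap.login(USER, PASSWORD)
    imap.select("INBOX")
    # Everything received before 1 Jan 2015 goes to the deep archive.
    _, data = imap.search(None, "BEFORE", "01-Jan-2015")
    for num in data[0].split():
        imap.copy(num, "Old-as-all-hell")       # copy to the archive folder
        imap.store(num, "+FLAGS", "\\Deleted")  # flag the inbox original
    imap.expunge()  # remove the flagged originals from INBOX
```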

4. 2015

That was two years ago!  It will eventually go the same way as old-as-all-hell, but for now it can sit in 2015.

5. 2016

Two options here; either:

a. leave them in your inbox and sort them into the 2016 folder through the year, remembering that things that old should go to sleep easily, or

b. put them in 2016 where you can look at them when you need them.

6. 2017

There are a few simple behaviours that make the ongoing use of the system handy.

a. If you read a thing and you have no more to do with it, file it away into 2017.

b. If you read a thing and still have more to do, leave it in the inbox.  (If you can resolve it in under 5 minutes, try to do it now.)

c. If you don't plan to read a thing AND it's not important AND you don't want to delete it, I strongly advise unsubscribing from the source, finding a way to stop it from coming in, or setting up a rule to auto-sort it into a folder; see the sketch after this list.  (Or set up a second email address for signing up to newsletters.)

d.  Every automatically generated email has an unsubscribe button at the bottom.  If you have a one-time unsubscribe policy you will never have to see the same junk twice.

e. Do some work: answer emails, send other emails, etc., and file things as you go.

f. Mammoths - these emails are huge-ass things; each one represents a day's worth of work to do before you can send back the results.  Don't leave them in the inbox.  Something that big belongs on a serious to-do list.  You can create other folders too, including a folder for those juggling balls that are up in the air waiting for replies to come back, as well as for mammoths, and a folder for emails from mum that you can't delete but also can't quite file.
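To make item c concrete, here is a minimal sketch of such an auto-sort rule with Python's standard imaplib, keying on the List-Unsubscribe header that most bulk mail carries. The host, credentials, and "Newsletters" folder are placeholders, and the folder must already exist on the server:

```python
import email
import imaplib

# Placeholder host and credentials, as in the earlier sketch.
HOST, USER, PASSWORD = "imap.example.com", "you@example.com", "app-password"

with imaplib.IMAP4_SSL(HOST) as imap:
    imap.login(USER, PASSWORD)
    imap.select("INBOX")
    _, data = imap.search(None, "ALL")
    for num in data[0].split():
        # PEEK reads headers without marking the message as seen.
        _, msg_data = imap.fetch(num, "(BODY.PEEK[HEADER])")
        headers = email.message_from_bytes(msg_data[0][1])
        if headers["List-Unsubscribe"]:  # most bulk senders set this
            imap.copy(num, "Newsletters")
            imap.store(num, "+FLAGS", "\\Deleted")
    imap.expunge()  # purge the moved originals from INBOX
```

In practice you'd run something like this once to clear the backlog, then let your mail client's filter rules handle new arrivals.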

7. other email folders

Sure, sometimes things need a bit of preserving, and sometimes things need sorting.  Go ahead and do that; don't let me stop you.

Using this fairly ordinary system I can get my total email time down to about half an hour a week.

Don't like it?  Find a better system.  But don't leave them all sitting there.

Final note: I have an email address for things I subscribe to that is separate to the email address I give out or use; this way I can check my subscriptions quickly without mixing them up with work/life/important things.


This post came out of a discussion in the IRC.  It took 30 minutes to write, with no research, and there are likely better systems in existence.  It partially incorporates a "Getting Things Done" attitude, but I might post more about that soon.

Feel free to share your system in the comments, or suggest improvements.