Comment author: Dustin 08 June 2015 08:15:27PM 11 points [-]

A lot of us are on Tumblr now; I've made a few blog posts at the much more open group blog Carcinisation; there's a presence on Twitter; and a lot of us have just made social friendships with enough other rationalists that the urge to post for strangers has a pressure release valve: discussing whatever ideas with whoever is in one's living room, or with one's Facebook friends.

I don't like this.

I do not have the time to engage in the social interactions required to even be aware of where all this posting elsewhere is going on, but I want to read it. I've been regularly reading OB/LW since before LW existed and this diaspora makes me feel left behind.

Comment author: Gondolinian 08 June 2015 08:26:53PM *  2 points [-]

> I do not have the time to engage in the social interactions required to even be aware of where all this posting elsewhere is going on, but I want to read it.

There's a Masterlist for rational Tumblr, but I'm not aware of a complete list of all rationalist blogs across platforms.

Perhaps the Less Wrong community might find it useful to start one? If it were hosted here on LW, it might also reinforce LW's position as a central hub of the rationality community, which is relevant to the OP.

Comment author: [deleted] 08 June 2015 07:44:40PM 8 points [-]

> Within an hour, I have thought of so many potential criticisms or reasons that my post might come across as lame that I am totally demoralized. I save my post as a draft, close the tab, and never return to it.

It doesn't help that even the most offhand posting is generally treated as if it were an academic paper and ~~reviewed~~ skewered accordingly :-p.

Comment author: Gondolinian 08 June 2015 08:22:14PM *  6 points [-]

> It doesn't help that even the most offhand posting is generally treated as if it were an academic paper and ~~reviewed~~ skewered accordingly :-p.

I agree. There are definitely times for unfiltered criticism, but most people require a feeling of security to be their most creative.

Comment author: John_Maxwell_IV 08 June 2015 07:30:07PM *  23 points [-]

I've previously talked about how I think Less Wrong's culture seems to be on a gradual trajectory towards posting less stuff and posting it in less visible places. For example, six years ago a post like this qualified as a featured post in Main. Nowadays it's the sort of thing that would go in an Open Thread. Vaniver's recent discussion post is the kind of thing that would have been a featured Main post in 2010.

Less Wrong is one of the few forums on the internet that actually discourages posting content. This is a feature of the culture that manifests in several ways:

  • One of the first posts on the site explained why it's important to downvote people. The post repeatedly references experiences with Usenet to provide support for this. But I think the internet has evolved a lot since Usenet. Subtle site mechanics have the potential to affect the culture of your community a lot. (I don't think it's a coincidence that Tumblr and 4chan have significantly different site mechanics and also significantly different cultures and even significantly different politics. Tumblr's "replies go to the writer's followers" mechanic leads to a concern with social desirability that 4chan's anonymity totally lacks.)

  • On reddit, if your submission is downvoted, it's downvoted into obscurity. On Less Wrong, downvoted posts remain on the Discussion page, creating a sort of public humiliation for people who are downvoted.

  • The Main/Discussion/Open Thread distinction invites snippy comments about whether your thing would have been more appropriate for some other tier. On most social sites, readers decide how much visibility a post should get (by upvoting, sharing, etc.). Less Wrong is one of the few that leaves it up to the writer. This has advantages and disadvantages. One advantage is that important but boring scholarly work can get visibility more easily.

  • Upvotes substitute for praise: instead of writing "great post" type comments, readers will upvote you, which is less of a motivator.

My experience of sitting down to write a Less Wrong post is as follows:

  1. I have some interesting idea for a Less Wrong post. I sit down and excitedly start writing it out.

  2. A few paragraphs in, I think of some criticism of my post that users are likely to make. I try to persevere for a while anyway.

  3. Within an hour, I have thought of so many potential criticisms or reasons that my post might come across as lame that I am totally demoralized. I save my post as a draft, close the tab, and never return to it.

Contrast the LW model with the "conversational blogging" model where you sit down, scribble some thoughts out, hit post, and see what your readers think. Without worrying excessively about what readers think, you're free to write in open mode and have creative ideas you wouldn't have when you're feeling self-critical.

Anyway, now that I've described the problem, here are some offbeat solution ideas:

  • LW users move away from posting on LW and post on Medium.com instead. There aren't upvotes or downvotes, so there's little fear of being judged. Bad posts are "punished" by being ignored, not downvoted. And Medium.com gives you a built-in audience so you don't need to build up a following the way you would with an independent blog. (I haven't actually used Medium.com that much; maybe it has problems.)

  • The EA community pays broke postdocs to create peer-reviewed, easily understandable blog posts on topics of interest to the EA community at large (e.g. an overview of the literature on how to improve the quality of group discussions, motivation hacking, rationality stuff, whatever). This goes on its own site. After establishing a trusted brand, we could branch out into critiquing science journalism in order to raise the sanity waterline or other cool stuff like that.

  • Someone makes it their business to read everything that gets written on every blog in the EA-sphere and create a "Journal of Effective Altruism" that's a continually updated list of links to the very best writing in the EA-sphere. This gives boring scholarly stuff a chance to get high visibility. This "Editor-in-Chief" figure could also provide commentary, link to related posts that they remember, etc. I'll bet it wouldn't be more than a part-time job. Ideally it would be a high status, widely trusted person in the EA community who has a good memory for related ideas.

Some of these are solutions that make more sense if the EA movement grows significantly beyond its current scope, but it can't hurt to start kicking them around.

> The top tier quality for actually read posting is dominated by one individual (a great one, but still)

Are we talking about LW proper here? Arguably this has been true over a good chunk of the site's history: at one time it was Eliezer, then Yvain, then Lukeprog, etc.

Comment author: Gondolinian 08 June 2015 07:46:55PM 4 points [-]

Is anyone in favor of creating a new upvote-only section of LW?


Comment author: Gondolinian 08 June 2015 07:14:54PM *  3 points [-]

A few tangential ideas off the top of my head:

> If the moderation and self selection of Main was changed into something that attracts those who have been on LW for a long time, and discussion was changed to something like Newcomers discussion, LW could go back to being the main space, with a two tier system (maybe one modulated by karma as well).

  1. People have been proposing for a while that we create a third section of LW for open threads and similar content.

  2. We could have a section without any karma scores for posts/upvote only, though we could still keep the same system for comments.

  3. We could allow Discussion posts to be Promoted while still using the Discussion karma system.

  4. We could have Promotion somehow be based on popular vote (not necessarily karma), instead of a moderator's judgement.

Open Thread, Jun. 8 - Jun. 14, 2015

4 Gondolinian 08 June 2015 12:04AM

If it's worth saying, but not worth its own post (even in Discussion), then it goes here.


Notes for future OT posters:

1. Please add the 'open_thread' tag.

2. Check if there is an active Open Thread before posting a new one. (Immediately before; refresh the list-of-threads page before posting.)

3. Open Threads should be posted in Discussion, and not Main.

4. Open Threads should start on Monday, and end on Sunday.

Comment author: Lumifer 05 June 2015 05:06:19PM *  2 points [-]

> they would be essentially blank slates

I don't think this is how it works with people. Especially smart ones with full 'net access.

Comment author: Gondolinian 05 June 2015 05:22:09PM *  2 points [-]

> I don't think this is how it works with people. Especially smart ones with full 'net access.

You're right; that was poorly phrased. I meant that they would have a lot less tying them down to the mainstream, like heavy schoolwork, expectations to get a good job, etc. Speaking from my own experience, not having those makes a huge difference in what ideas you're able to take seriously.

The Internet exposes one to many ideas, but 99% of them are nonsense, and smart people with the freedom to think about the things they want to think about eventually become pretty good at seeing that (again speaking from personal experience), so I think Internet access helps rather than hurts this "blank slate"-ness.

Comment author: turchin 03 June 2015 12:42:09PM *  3 points [-]

If we have 200-300 years before a well-proven catastrophe, this technique may work. But on a 10-50 year timescale, it is better to search for good, clever students and pay them to work on x-risks.

Comment author: Gondolinian 05 June 2015 04:47:04PM *  0 points [-]

> If we have 200-300 years before a well-proven catastrophe, this technique may work.

If you're talking about significant population changes in IQ, then I agree, it would take a while to make that happen with only reproduction incentives. However, I was thinking more along the lines of just having a few thousand or tens of thousands more >145 IQ people than we would otherwise have, and that could be achieved in as little as one or two generations (< 50 years) if the program were successful enough.

Now for a slightly crazier idea. (Again, I'm just thinking out loud.) You take the children and send them to be unschooled by middle-class foster families, both to save money, and to make sure they are not getting the intellectual stimulation they need from their environment alone, which they might if you sent them to upper-class private schools, for example. But, you make sure they have Internet access, and you gradually introduce them to appropriately challenging MOOCs on math and philosophy specially made for them, designed to teach them a) the ethics of why they should want to save the world (think some of Nate's posts) and b) the skills they would need to do it (e.g., they should be up to speed on what MIRI recommends for aspiring AI researchers before they graduate high school).

The point of separating them from other smart people is that smart people tend to be mostly interested in money, power, status, etc., and that could spread to them if they are immersed in it. If their focus growing up is simply to find intellectual stimulation, then they would be essentially blank slates* and when they're introduced to problems that are very challenging and stimulating, have other smart people working on them, and are really, really important, they might be more likely to take them seriously.

*Please see my clarification below.

Comment author: Drahflow 03 June 2015 09:08:14AM 2 points [-]
  • Install a smoke detector

  • Do martial arts training until you get the falling more or less right. While this might be helpful against muggers, the main benefit is the reduced probability of injury in various unfortunate situations.

Comment author: Gondolinian 05 June 2015 01:46:06AM 0 points [-]

> Do martial arts training until you get the falling more or less right. While this might be helpful against muggers, the main benefit is the reduced probability of injury in various unfortunate situations.

As someone with ~3 years of aikido experience, I second this.

Comment author: Gondolinian 05 June 2015 01:05:34AM 0 points [-]

What's the easiest way to put a poll in a top-level article?

Comment author: Gram_Stone 04 June 2015 05:25:27AM 1 point [-]

To elaborate on existing comments, a fourth alternative to FAI theory, Earning To Give, and popularization is strategy research. (That could include research on other risks besides AI.) I find that the fruit in this area is not merely low-hanging but rotting on the ground. I've read in old comment threads that Eliezer and Carl Shulman in particular have done a lot of thinking about strategy but very little of it has been written down, and they are very busy people. Circumstances may well dictate retracing a lot of their steps.

You've said elsewhere that you have a low estimate of your innate mathematical ability, which would preclude FAI research, but strategy research presumably demands less mathematical aptitude. Things like statistics would be invaluable, but strategy research would also involve a lot of comparatively less technical work, like historical and philosophical analysis, experiments and surveys, literature reviews, lots and lots of reading, etc. Also, you've already done a bit of strategizing; if you are fulfilled by thinking about those things and you think your abilities meet the task, then it might be a good alternative.

Some strategy research resources:

Comment author: Gondolinian 04 June 2015 04:04:04PM 1 point [-]

Thanks for taking the time to put all that together! I'll keep it in mind.
