In response to Proposal: LW courses
Comment author: djm 05 March 2014 10:20:07PM 3 points

I agree - external deadlines help in many cases and it would be interesting to try this out.

In an ideal world I'd love to see a series of MOOCs on areas such as FAI, rationality, etc., though I imagine working out an effective way to test some of the topics would be a challenge. Perhaps as a starting point someone could make a couple of short polls in the guise of a quiz and use voting as the answer?

In response to comment by djm on Proposal: LW courses
Comment author: Slackson 07 March 2014 12:47:36AM 0 points

What are the options for free MOOC platforms these days? Moodle's the only one that comes to mind, and it's not optimized for MOOCs.

Comment author: therufs 06 March 2014 03:39:55PM 0 points

Based on ill-remembered citations of the efficacy of exercise for improving focus and general mental health, and after a lot of angst about body acceptance, I reduced trivial inconveniences to working out below the inertia setpoint and started jogging three days a week. (I settled on running after an extended period of never getting around to signing up for hot yoga, crossfit, or a membership to the Y so I could swim, all of which seem more appealing.)

Good outcomes so far: feeling of accomplishment post-workout; feeling of accomplishment when I put on shoes and leave the house (remembering that not long ago I was basically incapable of making myself do anything I found unsavory); getting a lot less winded by minor physical exertion (e.g. walking briskly up a hill or flight of stairs). Meta-good-outcome: practice at finding and focusing on successes for self-motivation.

Waiting for more data: My focus has not yet improved discernibly. To do: self-test whether focus improves globally if I focus on jogging while I'm doing it.

Comment author: Slackson 06 March 2014 09:23:16PM 1 point

How do you plan to measure focus? Just subjective effects, or are you using QuantifiedMind, or pomodoro success rate, or something?

Comment author: Brillyant 06 March 2014 05:22:06PM 15 points

Of the 12 most recent posts in 'Discussion', nine are 'Meetup' related and one is a meta-level discussion about producing LW courses.

Is LW imploding into some sort of self-impressed death spiral? Where is the new non-LW meta content? Am I way off, or has the quality of posts significantly diminished over time here?

I'm curious to know what others think.

Comment author: Slackson 06 March 2014 08:32:29PM 2 points

More meetup posts clutter Discussion (which is kinda bad) but mean that people are actually going to meetup groups (which is kinda awesome). Maybe frame a meetup post not as a trivial inconvenience, but as evidence that rationalists are meeting in person, having cool discussions, and working on their lives instead of hanging around on Less Wrong.

When there's a lot of interesting content here, sometimes people ask why we're all sticking around talking about talking about rationality instead of doing stuff out in the world.

Meetup : Auckland Preliminary Meetup

1 Slackson 06 March 2014 04:31AM

Discussion article for the meetup : Auckland Preliminary Meetup

WHEN: 08 March 2014 02:00:00PM (+1300)

WHERE: Albert Park, Auckland

I got back from the second Melbourne CFAR workshop recently and it was good. Having a local rationalist community is well worthwhile, and while there are some good thinkers in my immediate circle of friends, meeting more and learning from each other would be awesome.

I'm not sure if others will be using it, but let's meet near the gazebo in Albert Park at 2pm Saturday. I'll be carrying my CFAR bag and water bottle if you want to come over and say hi. I would be wearing a "Just shy, not antisocial" shirt if I had one. If you're interested in coming, just comment, or come anyway.

Cheers, Marcel.


Comment author: Coscott 13 February 2014 01:14:45AM 3 points

Overlapping threads are just an agreement of everyone to keep looking at old posts. This is harder to orchestrate, because it requires lots of people to change.

Comment author: Slackson 13 February 2014 01:46:31AM 0 points

Point taken, but I did suggest several ways this could be encouraged (pinned threads, different stated lifespans, shared use of the Latest Open Thread feed).

Reducing the visibility of the new threads could help too.

Comment author: jkaufman 12 February 2014 11:16:48PM 5 points

If the problem is that no one will post in an open thread near the end of its lifetime, a silly solution would be to automatically create an open thread every day with 10% probability. Now someone considering posting has no reason to wait for a future day to post, because the expected remaining lifetime of a thread is always constant, at ~10 days.
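The constant expected lifetime falls out of the memorylessness of the geometric distribution: if each day a new thread is created with probability p = 0.1, the wait until the next thread averages 1/p = 10 days regardless of how old the current thread already is. A minimal simulation sketch of that arithmetic (the variable names are illustrative, not from any real implementation):

```python
import random

random.seed(0)

# Each day a new open thread is created with probability p.
# The current thread's remaining lifetime is the number of days
# until the next success: geometric with mean 1/p = 10 days.
p = 0.1
trials = 100_000
total_days = 0
for _ in range(trials):
    days = 1
    while random.random() >= p:  # no new thread today; keep waiting
        days += 1
    total_days += days

mean_lifetime = total_days / trials
print(mean_lifetime)  # close to 10.0
```

Because the distribution is memoryless, the same ~10-day expectation holds whether the current thread is one day old or twenty, which is exactly why there is no incentive to wait.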

Better would be to have a single thread "Open Thread" where posts older than N days would be moved to an automatically created "Open Thread Date-Date" post.

An even easier fix would be to remove the end date from open thread titles. When someone feels like posting a new one they just do that. This is a sloppy implementation of the first solution.

Comment author: Slackson 13 February 2014 01:07:30AM 0 points

How about overlapping thread lifespans? This way when a new thread is created, recent comments on the previous thread won't go unread, and discussion can still happen there. A thread on Monday that lasts a week and a thread on Thursday does too, for example, with both threads pinned to the top and included under the Latest Open Thread feed on the side. I suspect this would be easier to implement than your second option. It's more difficult to implement than your first and third options, though.

Comment author: ricketybridge 12 February 2014 07:42:09AM -1 points

Serious, non-rhetorical question: what's the basis of your preference? Anything more than just affinity for your species?

I'm not 100% sure what you mean by parasite removal... I guess you're referring to bad decision-makers, or bad decision-making processes? If so, I think existential risks are interlinked with parasite removal: the latter causes or at least hastens the former. Therefore, to truly address existential risks, you need to address parasite removal.

Comment author: Slackson 12 February 2014 08:49:54AM 4 points

If I live forever, through cryonics or a positive intelligence explosion before my death, I'd like to have a lot of people to hang around with. Additionally, the people you'd be helping through EA aren't the people who are fucking up the world at the moment. Plus there isn't really anything directly important to me outside of humanity.

Parasite removal refers to removing literal parasites from people in the third world, as an example of one of the effective charitable causes you could donate to.

Comment author: ricketybridge 12 February 2014 02:40:07AM 4 points

Sometimes I feel like looking into how I can help humanity (e.g. 80000 hours stuff), but other times I feel like humanity is just irredeemable and may as well wipe itself off the planet (via climate change, nuclear war, whatever).

For instance, humans are so facepalmingly bad at making decisions for the long term (viz. climate change, running out of fossil fuels) that it seems clear that genetic or neurological enhancements would be highly beneficial in changing this (and other deficiencies, of course). Yet discourse about such things is overwhelmingly negative, mired in what I think are irrational kneejerk reactions to defend "what it means to be human." So I'm just like, you know what? Fuck it. You can't even help yourselves help yourselves. Forget it.

Thoughts?

Comment author: Slackson 12 February 2014 03:25:44AM 2 points

I can't speak for you, but I would hugely prefer for humanity to not wipe itself out, and even if it seems relatively likely at times, I still think it's worth the effort to prevent it.

If you think existential risks are a higher priority than parasite removal, maybe you should focus your efforts on those instead.

Comment author: Slackson 11 February 2014 11:24:52AM 4 points

Implicit-association tests are handy for identifying things you might not be willing to admit to yourself.

Comment author: Slackson 10 February 2014 08:57:52AM 6 points

Once EA is a popular enough movement that this begins to become an issue, I expect communication and coordination will be a better answer than treating this like a one-shot problem. Maybe we'll end up with meta-charities as the equivalent of index funds, that diversify altruism to worthy causes without saturating any given one. Maybe the equivalent of GiveWell.org at the time will include estimated funding gaps for their recommended charities, and track the progress, automatically sorting based on which has the largest funding gap and the greatest benefit.

I doubt that at any point it will make sense for individuals to personally choose, rank, and donate their own money to charities as if they're choosing the ratios for everyone TDT-style, not least because of the unnecessary redundancy.

EDIT: Upvoted because it is a valid concern. The AMF reached saturation relatively quickly, and may have exceeded the funding it needed. I just disagree with the efficiency of this particular solution to the problem.
