
Slackson comments on Open Thread for February 11 - 17

Post author: Coscott, 11 February 2014 06:08PM, 3 points


Comment author: ricketybridge 12 February 2014 02:40:07AM 4 points

Sometimes I feel like looking into how I can help humanity (e.g. 80,000 Hours stuff), but other times I feel like humanity is just irredeemable and may as well wipe itself off the planet (via climate change, nuclear war, whatever).

For instance, humans are so facepalmingly bad at making decisions for the long term (e.g. climate change, running out of fossil fuels) that it seems clear that genetic or neurological enhancements would be highly beneficial in changing this (and other deficiencies, of course). Yet discourse about such things is overwhelmingly negative, mired in what I think are irrational knee-jerk reactions to defend "what it means to be human." So I'm just like, you know what? Fuck it. You can't even help yourselves help yourselves. Forget it.

Thoughts?

Comment author: Slackson 12 February 2014 03:25:44AM 2 points

I can't speak for you, but I would hugely prefer for humanity to not wipe itself out, and even if it seems relatively likely at times, I still think it's worth the effort to prevent it.

If you think existential risks are a higher priority than parasite removal, maybe you should focus your efforts on those instead.

Comment author: ricketybridge 12 February 2014 07:42:09AM -1 points

Serious, non-rhetorical question: what's the basis of your preference? Anything more than just affinity for your species?

I'm not 100% sure what you mean by parasite removal... I guess you're referring to bad decision-makers, or bad decision-making processes? If so, I think existential risks are interlinked with parasite removal: the parasites cause, or at least hasten, the risks. Therefore, to truly address existential risks, you need to address parasite removal.

Comment author: Slackson 12 February 2014 08:49:54AM 4 points

If I live forever, through cryonics or a positive intelligence explosion before my death, I'd like to have a lot of people to hang around with. Additionally, the people you'd be helping through EA aren't the people who are fucking up the world at the moment. Plus there isn't really anything directly important to me outside of humanity.

Parasite removal refers to removing literal parasites from people in the third world, as an example of one of the effective charitable causes you could donate to.

Comment author: ricketybridge 12 February 2014 09:23:06PM 0 points

EA? (Sorry to ask, but it's not in the Less Wrong jargon glossary and I haven't been here in a while.)

"Parasite removal refers to removing literal parasites from people in the third world"

Oh. Yes. I think that's important too, and it actually pulls on my heartstrings much more than existential risks that are potentially far in the future, but I would like to avoid hyperbolic discounting and try to focus on the most important issue facing humanity, sans cognitive bias. But since human motivation isn't flawless, I may end up focusing on something more immediate. Not sure yet.

Comment author: Emile 12 February 2014 10:15:42PM 4 points

EA is Effective Altruism.

Comment author: ricketybridge 12 February 2014 10:31:10PM 0 points

Ah, thanks. :)