Comment author: Viliam_Bur 12 February 2014 09:50:30AM 4 points [-]

If you think helping humanity is (in long term) a futile effort, because humans are so stupid they will destroy themselves anyway... I'd say the organization you are looking for is CFAR.

So, how would you feel about making a lot of money and donating to CFAR? (Or other organization with a similar mission.)

Comment author: ricketybridge 12 February 2014 09:09:30PM *  2 points [-]

How cool, I've never heard of CFAR before. It looks awesome. I don't think I'm capable of making a lot of money, but I'll certainly look into CFAR.

Edit: I just realized that CFAR's logo is at the top of the site. Just never looked into it. I am not a smart man.

Comment author: skeptical_lurker 12 February 2014 03:43:26PM 0 points [-]

Well, there has not been a nuclear war yet (excluding WWII, where deaths from nuclear weapons were tiny in proportion), climate change has only been a known risk for a few decades, and progress is being made with electric cars and solar power. Things could be worse. Instead of moaning, propose solutions: what would you do to stop global warming when so much depends on fossil fuels?

On a separate note, I agree with the kneejerk reactions, but it's a temporary cultural thing, caused partially by people basing morality on fiction. Get one group of people to watch GATTACA and another to watch Ghost in the Shell, and they would have very different attitudes towards transhumanism. More interestingly, cybergoths (people who like to dress as cyborgs as a fashion statement) seem to be pretty open to discussions of actual brain-computer interfaces, and there is music with H+ lyrics being released on actual record labels and bought by people who like the music and are not transhumanists... yet.

In conclusion, once enhancement becomes possible I think there will be a sizeable minority of people who back it - in fact this has already happened with modafinil and students.

Comment author: ricketybridge 12 February 2014 08:57:03PM 3 points [-]

people basing morality on fiction.

Yes, and that seems truly damaging. I get the need to create conflict in fiction, but it seems to always come at the expense of technological progress, in a way I've never really understood. When I read Brave New World, I genuinely thought it truly was a "brave new world." So what if some guy was conceived naturally?? Why is that inherently superior? Sounds like status quo bias, if you ask me. Buncha Luddite propaganda.

I've actually been working on a pro-technology, anti-Luddite text-based game. Maybe finishing it would actually help balance out the propaganda and change public opinion...

Comment author: RomeoStevens 12 February 2014 08:22:56AM 5 points [-]

https://en.wikipedia.org/wiki/Identifiable_victim_effect

Also, would you still want to save a drowning dog even if it might bite you out of fear and misunderstanding? (let's say it is a small dog and a bite would not be drastically injurious)

Comment author: ricketybridge 12 February 2014 08:32:13AM 0 points [-]

https://en.wikipedia.org/wiki/Identifiable_victim_effect

True, true. But it's still hard for me (and most people?) to circumvent that effect, even while I'm aware of it. I know Mother Teresa actually had a technique for it (to just think of one child rather than the millions in need). I guess I can try that. Any other suggestions?

Also, would you still want to save a drowning dog even if it might bite you out of fear and misunderstanding? (let's say it is a small dog and a bite would not be drastically injurious)

I'll pretend it's a cat since I don't really like small dogs. ;-) Yes, of course I'd save it. I think this analogy will help me moving forward. Thank you! ^_^

Comment author: D_Malik 12 February 2014 06:08:31AM *  0 points [-]

[deleted]

Comment author: ricketybridge 12 February 2014 07:43:19AM 0 points [-]

Well, true. All things shall pass.

Comment author: Slackson 12 February 2014 03:25:44AM 2 points [-]

I can't speak for you, but I would hugely prefer for humanity to not wipe itself out, and even if it seems relatively likely at times, I still think it's worth the effort to prevent it.

If you think existential risks are a higher priority than parasite removal, maybe you should focus your efforts on those instead.

Comment author: ricketybridge 12 February 2014 07:42:09AM -1 points [-]

Serious, non-rhetorical question: what's the basis of your preference? Anything more than just affinity for your species?

I'm not 100% sure what you mean by parasite removal... I guess you're referring to bad decision-makers, or bad decision-making processes? If so, I think existential risks are interlinked with parasite removal: the latter causes or at least hastens the former. Therefore, to truly address existential risks, you need to address parasite removal.

Comment author: Vladimir_Nesov 12 February 2014 03:30:41AM 7 points [-]

A task with a better expected outcome is still better (in expected outcome), even if it's hopeless, silly, not as funny as some of the failure modes, not your responsibility or in some way emotionally less comfortable.

Comment author: ricketybridge 12 February 2014 07:31:58AM 0 points [-]

You're of course correct. I'm tempted to question the use of "better" (i.e. it's a matter of values and opinion as to whether it's "better" if humanity wipes itself out or not), but I think it's pretty fair to assume (as I believe utilitarians do) that less suffering is better, and theoretically less suffering would result from better decision-making and possibly from less climate change.

Thanks for this.

Comment author: CellBioGuy 12 February 2014 02:49:10AM 1 point [-]

I find it fascinating to observe.

Comment author: ricketybridge 12 February 2014 07:23:11AM 0 points [-]

I assume you're talking about the facepalm-inducing decision-making? If so, that's a pretty morbid fascination. ;-)

Comment author: Vladimir_Nesov 12 February 2014 03:08:19AM *  0 points [-]

I tried to read Spivak's Calc once and didn't really like it much; I'm not sure why everyone loves it. Maybe it gets better as you go along, idk.

"Not liking" is not very specific. All else being equal, it's good to "like" a book, but all else is often not equal, so alternatives should be compared from other points of view as well. It's very good for training in rigorous proofs at the introductory undergraduate level, if you do the exercises. It's not necessarily enjoyable.

I've also been reading his book and just started Hoffman & Kunze's Linear Algebra, which supposedly has a bit more theory

It's a much more advanced book, more suitable for a deeper review somewhere at the intermediate or advanced undergraduate level. I think Axler's "Linear Algebra Done Right" is better as a second linear algebra book (though it's less comprehensive), after a more serious real analysis course (i.e. not just Spivak) and an intro complex analysis course.

Comment author: ricketybridge 12 February 2014 07:22:16AM *  1 point [-]

Oh yeah, I'm not saying Spivak's Calculus doesn't provide good training in proofs. I really didn't even get far enough to tell whether it did or not, in which case, feel free to disregard my comment as uninformed. But to be more specific about my "not liking", I just found the part I did read to be more opaque than engaging or intriguing, as I've found other texts (like Strang's Linear Algebra, for instance).

Edit: Also, I'm specifically responding to statements that I thought were referring to liking the book in the enjoyment sense (expressed on this thread and elsewhere as well). If that's not the kind of liking they meant, then my comment is irrelevant.

It's a much more advanced book, more suitable for a deeper review somewhere at the intermediate or advanced undergraduate level. I think Axler's "Linear Algebra Done Right" is better as a second linear algebra book (though it's less comprehensive), after a more serious real analysis course (i.e. not just Spivak) and an intro complex analysis course.

Damn, really?? But I hate it when math books (and classes) effectively say "assume this is true" rather than delve into the reason behind things, and those reasons aren't explained until 2 classes later. Why is it not more pedagogically sound to fully learn something rather than slice it into shallow, incomprehensible layers?

Comment author: jaibot 12 February 2014 06:39:38AM 22 points [-]

You know how when you see a kid about to fall off a cliff, you shrug and don't do anything because the standards of discourse aren't as high as they could be?

Me neither.

Comment author: ricketybridge 12 February 2014 07:08:33AM 5 points [-]

lol yeah, I know what you're talking about.

Okay okay, fine. ;-)

Comment author: Lumifer 12 February 2014 02:39:28AM 1 point [-]

Yes, I've heard this before

It seems you're playing a "Yes, but" game. I am sure you can win it, do you really want to?

Comment author: ricketybridge 12 February 2014 02:54:21AM 0 points [-]

Yes. :)

But...

;-)

You should check out my response to one of the other comments--I think it's even more "yes, but"! I kind of see what you mean, but it sounds to me like just a way of saying "believe x or else" instead of giving an actual argument.

However, the ultimate conclusion is, I guess, just getting back on the horse and doing whatever I can to treat the dysthymia. I'm just like... ugh. :P But that's not very rational.

Thanks for the feedback.
