Comment author: DanArmak 13 October 2016 11:19:20PM 6 points [-]

Joi Ito said several things that are unpleasant but are probably believed by most people, and so I am glad for the reminder.

JOI ITO: This may upset some of my students at MIT, but one of my concerns is that it’s been a predominately male gang of kids, mostly white, who are building the core computer science around AI, and they’re more comfortable talking to computers than to human beings. A lot of them feel that if they could just make that science-fiction, generalized AI, we wouldn’t have to worry about all the messy stuff like politics and society. They think machines will just figure it all out for us.

Yes, you would expect non-white, older, women who are less comfortable talking to computers to be better suited dealing with AI friendliness! Their life experience of structural oppression helps them formally encode morals!

ITO: [Temple Grandin] says that Mozart and Einstein and Tesla would all be considered autistic if they were alive today. [...] Even though you probably wouldn’t want Einstein as your kid, saying “OK, I just want a normal kid” is not gonna lead to maximum societal benefit.

I should probably get a good daily reminder that most people would not, in fact, want their kid to be as smart, impactful, and successful in life as Einstein, and would prefer "normal", not-too-far-above-average kids.

Comment author: scarcegreengrass 14 October 2016 02:11:15PM 1 point [-]

Both of those Ito remarks referenced supposedly widespread perspectives. But personally, i have almost never encountered these perspectives before.

Comment author: DanArmak 12 October 2016 03:50:56PM 1 point [-]

Do you know what went wrong, or what you did differently when making the working link post?

Comment author: scarcegreengrass 14 October 2016 02:03:01PM 0 points [-]

No, i don't. One possible explanation for the bug is that on the successful attempt i used the dropdown to post the link directly to Discussion, rather than saving it to Drafts first.

Comment author: Lightwave 12 October 2016 04:48:07PM 5 points [-]
Comment author: scarcegreengrass 13 October 2016 11:57:11AM 2 points [-]

Oh, this is much more complete, thanks.

Wow, it's surreal to hear Obama talking about Bostrom, Foom, and biological x-risk.

Comment author: DanArmak 12 October 2016 02:59:06PM 1 point [-]

I don't see a link. Was it lost like in my link post on a different subject? I still don't know how to post links correctly.

Comment author: scarcegreengrass 12 October 2016 03:49:07PM 0 points [-]

[Link] Barack Obama's opinions on near-future AI [Fixed]

3 scarcegreengrass 12 October 2016 03:46PM
Comment author: scarcegreengrass 12 October 2016 03:45:16PM 0 points [-]

What?? Weird!

Maybe it was lost when i edited the draft.

Comment author: scarcegreengrass 12 October 2016 01:45:52PM 0 points [-]

The headline is misleading. I don't think there is an Apollo-style funding plan; i think Obama just thinks it'd be a good idea.

Comment author: username2 10 October 2016 09:23:33AM 6 points [-]

Is there something similar to the Library of Scott Alexandria available for The Last Psychiatrist? I just read "Amy Schumer offers you a look into your soul" and I really liked it, but I don't have enough time to read all the posts on the blog.

Comment author: scarcegreengrass 10 October 2016 05:24:47PM *  2 points [-]

This blog is so wordy and cultural that i (unfamiliar with the context) find it actually challenging to figure out what the premise, thesis, or content of the post is. Reminds me of my experience with discovering arcane 'neoreaction' blogs.

Comment author: scarcegreengrass 06 October 2016 04:15:21PM *  -1 points [-]

I would also contemplate the scenario in which the human species turns out to be less impressive than it currently appears, and is actually a fairly typical example of a successful Earth species. Most achievements that would distinguish humans from eg plankton are in the future (eg space industry), not the past or present.

This might sound strange. Arguments in favor of this perspective:

• Homo sapiens is not the greatest species in terms of population or total biomass.

• Homo sapiens is not the only species to make tools, use agriculture, build buildings, or adapt to a variety of terrestrial habitats.

• Homo sapiens is not the first species to have a catastrophic impact on the atmosphere.

Arguments against this perspective:

• The human economy is currently doubling in scale every couple decades.

• No species (probably) ever reached the edge of the atmosphere before Homo sapiens.

(To clarify, i think this question is far from settled. But i think the idea that Homo sapiens will be smaller-impact than expected is more likely than the scenario that historical gods are representations of unknown prosperous civilizations.)

Comment author: WhySpace 02 October 2016 05:32:18PM *  2 points [-]

As a side note, this might also be interesting, purely from a utilitarian standpoint. If insect suffering matters, that would completely dwarf all human moral weight, since there are 10^18 of them but only 10^9 of us.

However, perhaps we don't care morally about animals which can't pass the mirror test, on the assumption that this means they have no self-image, and therefore no consciousness. They could feel pain and other stimuli, but there would be no internal observer to notice their own suffering.

If that's the case, animal welfare might still dominate over human welfare, but by a smaller margin. Doing what I described in the previous comment would let us estimate the value of future life in general, if we can determine to within an order of magnitude or so how much we value animals with various traits. This is critical for questions like whether terraforming Mars is net positive or net negative.

Comment author: scarcegreengrass 06 October 2016 03:39:44PM 1 point [-]

I actually drew up a spreadsheet to estimate this: https://docs.google.com/spreadsheets/d/1xnfsDuC0ddUxvKekGLJ5QA5nrXxzked7K-k6jqUm538/edit?usp=sharing

I agree with you about the numbers: if there were, say, 10^15 insects, then their moral weight might be in question. However, there are actually more like 10^18, which is huge even for very small per-insect weightings.
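The arithmetic behind this can be sketched in a few lines. The population figures are the orders of magnitude quoted in the thread, and the per-insect weighting is a hypothetical free parameter chosen for illustration, not an established figure:

```python
# Order-of-magnitude comparison of aggregate moral weight,
# using the populations quoted in the comments above.
HUMANS = 10**9    # rough order of magnitude of the human population
INSECTS = 10**18  # commonly cited order of magnitude for insects

def aggregate_weight(population, per_capita_weight):
    """Total moral weight = population * per-individual weighting."""
    return population * per_capita_weight

# Even if one insect counts for only a millionth of a human,
# insects still dominate in aggregate: 10^18 * 10^-6 = 10^12 >> 10^9.
human_total = aggregate_weight(HUMANS, 1.0)
insect_total = aggregate_weight(INSECTS, 1e-6)
print(insect_total / human_total)  # -> 1000.0
```

The point of the sketch is that the per-insect weighting would have to be pushed below roughly 10^-9 before human welfare dominates, which is why the 10^15 vs 10^18 distinction matters.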

View more: Next