
Comment author: Science 19 January 2017 04:46:12AM *  4 points [-]

Well, the statistic he cites is either false or hugely misleading, depending on how one interprets "unfair and disproportionate rates". It's true that a black person is more likely to be killed, arrested, or searched by police than a white person. A black person is also more likely to commit a violent crime than a white person. And the two ratios are rather similar.

Furthermore, a black person is overwhelmingly more likely to be killed by a fellow black person than by a cop. In fact, by leading to less police presence, and hence higher crime, in black neighborhoods, the BLM movement has led to a large increase in the number of black deaths. This phenomenon is commonly called the Ferguson effect.

Comment author: Lightwave 19 January 2017 09:52:53AM *  1 point [-]

A black person is also more likely to commit a violent crime than a white person.

Isn't it more relevant whether a black person is more likely to commit a violent crime against a police officer (during a search, etc.)? After all, the argument is that the police are responding to some perceived threat. The typical, mostly black-on-black violent crime isn't the most relevant statistic here. Where are the statistics about how black people respond to the police?

Comment author: Lightwave 29 November 2016 10:41:25PM *  9 points [-]

Funny you should mention that...

AI risk is one of the two main focus areas for the Open Philanthropy Project this year, which GiveWell is part of. You can read Holden Karnofsky's Potential Risks from Advanced Artificial Intelligence: The Philanthropic Opportunity.

They consider AI risk to rank high enough on importance, neglectedness, and tractability (their three main criteria for choosing what to focus on) to be worth prioritizing.

In response to comment by Dagon on Crony Beliefs
Comment author: entirelyuseless 05 November 2016 05:05:22PM 0 points [-]

I am already sure that all communities include, as core beliefs or very close to core, things that I am very confident are false.

I learned that from experience, but it is easy to come up with theoretical reasons in hindsight why that would be likely to be the case.

Comment author: Lightwave 20 November 2016 11:21:58AM 0 points [-]

things that I am very confident are false

Could you give an example?

Comment author: ThoughtSpeed 28 September 2016 08:08:45AM 2 points [-]

Is that for real or are you kidding? Can you link to it?

Comment author: Lightwave 28 September 2016 10:05:12AM *  2 points [-]

He's mentioned it on his podcast. It won't be out for another 1.5-2 years, I think.

Also, Sam Harris recently did a TED talk on AI; it's now up.

Comment author: smk 26 September 2016 11:50:14PM *  3 points [-]

Has Sam Harris stated his opinion on the orthogonality thesis anywhere?

Comment author: Lightwave 27 September 2016 09:03:08AM 3 points [-]

He's writing an AI book together with Eliezer, so I assume he's on board with it.

Comment author: Lightwave 06 April 2016 07:48:51AM 2 points [-]

Can't we just add a new 'link' post type to the current LW? Links and local posts would both have comment threads (here on LW); the only difference is that the title of a link post would point to an outside website/resource.

Comment author: Lightwave 05 April 2016 12:34:06PM *  3 points [-]

Should we try to promote the most valuable/important (maybe older?) Less Wrong content on the front page? Currently the front page features a bunch of links and featured articles that don't seem to be organized in any systematic way. Maybe Less Wrong would be more attractive/useful to new people if they could access the best the site has to offer directly from the front page (or at least more of it, and in a systematic way)?

Comment author: Lightwave 20 March 2016 09:17:22AM 0 points [-]

Target: a good post every day for a year.

Why specifically one per day? It seems a bit too much. Why not, e.g., ~3 per week?
