
Comment author: Daniel_Burfoot 04 May 2016 02:02:56PM 5 points [-]

Though I enthusiastically endorse the concept of rationality, I often find myself coming to conclusions about Big Picture issues that are quite foreign to the standard LW conclusions. For example, I am not signed up for cryonics even though I accept the theoretical arguments in favor of it, and I am not worried about unfriendly AI even though I accept most of EY's arguments.

I think the main reason is that I am 10x more pessimistic about the health of human civilization than most other rationalists. I'm not a cryonicist because I don't think companies like Alcor can survive the long period of stagnation that humanity is headed towards. I don't worry about UFAI because I don't think our civilization has the capability to achieve AI. It's not that I think AI is spectacularly hard, I just don't think we can do Hard Things anymore.

Now, I don't know whether my pessimism is more rational than others' optimism. LessWrong, and rationalists in general, probably have a blind spot relative to questions of civilizational inadequacy because those questions relate to political issues, and we don't talk about politics. Is there a way we can discuss civilizational issues without becoming mind-killed? Or do we simply have to accept that civilizational issues are going to create a large error bar of uncertainty around our predictions?

Comment author: knb 04 May 2016 09:46:52PM 3 points [-]

It's not that I think AI is spectacularly hard, I just don't think we can do Hard Things anymore.

I'm sympathetic to the idea that we can't do Hard Things, at least in the US and much of the rest of the West. Unfortunately, progress in AI seems like the kind of Hard Thing that is still possible. Stagnation has hit atoms, not bits. There does seem to be a consensus that AI is not a stagnant field at all, but rather one that is consistently progressing.

Comment author: knb 02 May 2016 10:51:30AM *  1 point [-]

BBC News is running a story claiming that the creator of Bitcoin, known as Satoshi Nakamoto, is an Australian named Craig Wright.

In response to Positivity Thread :)
Comment author: knb 28 April 2016 11:04:44PM 1 point [-]

I found this to be a cheerful video about people working on fusion. (It's a promo, so the dark arts warning applies.)

Comment author: username2 28 April 2016 09:58:42AM 1 point [-]

I don't like this idea, but people, please do not downvote Daniel just because you disagree. The downvote button is not for disagreement; it's for comments that don't add anything to the discussion.

Comment author: knb 28 April 2016 10:55:38PM 1 point [-]

The downvote button is not for disagreement; it's for comments that don't add anything to the discussion.

Who says?

Comment author: Gleb_Tsipursky 25 April 2016 04:22:50AM -1 points [-]

And given the grammar and posting history of this user, I'm pretty sure it's another Eugine Nier sock puppet. Also reported to the admin.

Comment author: knb 25 April 2016 07:47:03AM 0 points [-]

Or maybe he was referencing how the Poles' polls are indeed very anti-immigration. ;-)

Comment author: knb 20 April 2016 12:30:28AM 2 points [-]

Looks like Andrea Rossi's E-Cat cold fusion scam is finally reaching its end phase. Some previous LW discussion here, here, and here.

Comment author: Viliam 13 April 2016 01:09:01PM *  3 points [-]

Sorry for the mindkilling content, but I remember reading on LW long ago that the political left is supposedly morally different because it doesn't use the "purity/disgust" moral axis.

Then I found these photos online, and I wonder whether that is the microexpression of disgust (though there seems to be nothing "micro" about it when these people do it). Or am I reading the expression wrong?

My point is that if someone has this expression pretty much stuck on their face, I find it quite difficult to believe that they don't care about the "purity/disgust" axis.

So what exactly is the lesson here?

  • Is the hypothesis about the political left not using the "purity/disgust" axis wrong?
  • Are SJWs psychologically very different from the typical left?
  • Are the people in the photos very different from typical SJWs, or are their expressions very unrepresentative of their usual behavior?
  • Any other explanation?
Comment author: knb 17 April 2016 11:46:54PM 0 points [-]

The photos you selected look more like the "hate" microexpression from your link. Also, why is Anna Kendrick considered an SJW?

Comment author: DanielDeRossi 17 April 2016 06:42:39AM 0 points [-]

So I was wondering what career is best in terms of accumulating wealth while having a decent quality of life. I've heard finance jobs are good.

Comment author: knb 17 April 2016 11:36:19PM 0 points [-]

Actuaries are consistently near the top in terms of job satisfaction, enjoy stable employment, and make a good amount of money (I believe low six figures is common). Another advantage is that you often don't need a degree in the field as long as you can pass the rigorous licensing exams. However, it does require a lot of specific knowledge and good mathematical ability.

Comment author: cousin_it 14 April 2016 12:06:56PM *  2 points [-]

Cebcurpl nobhg Uneel raqvat gur jbeyq vf haerfbyirq, cebcurpl nobhg qrsrngvat qrngu vf haerfbyirq, rirelguvat nobhg Ngynagvf naq gur angher bs zntvp vf haerfbyirq. Vs gurfr jrera'g gur znva dhrfgvbaf va lbhe zvaq juvyr ernqvat UCZBE, vqx jung gb fnl. Nyfb, ab zntvpny erfrnepu unccraf.

Comment author: knb 15 April 2016 01:46:07AM 1 point [-]

V arire tbg gur vzcerffvba gung UCZBE jnf nobhg pbzvat hc jvgu engvbanyvmngvbaf sbe gur fvyyl zntvp ehyrf WX Ebjyvat vairagrq sbe n puvyqera'f fgbel. Vg jnf nobhg cebzbgvat Lhqxbjfxl'f vqrnf nobhg engvbanyvgl naq nagv-qrnguvfz hfvat gur Uneel Cbggre jbeyq nf n pbairavrag ubbx.

Comment author: gjm 05 April 2016 02:11:42PM 0 points [-]

We have a lot more infrastructure than Europe had at the time of the Black Death. If we lost 75% of the population, it might devastate things like the power grid, water supply and purification, etc.

We have (I think) more complicatedly interdependent institutions than Europe at the time of the Black Death. Relatively small upheavals in, e.g., our financial systems can cause a lot of chaos, as shown by our occasional financial crises. If 75% of the population died, how robust would those systems be?

The following feels like at least a semi-plausible story. Some natural or unnatural disaster wipes out 75% of the population. This leads to widespread failure of infrastructure, finance, and companies. In particular, we lose a lot of chip factories and oil wells. And then we no longer have the equipment we need to make new ones that work as well as the old ones did, and we run out of sufficiently accessible oil and cannot make technological progress fast enough to replace it with solar or nuclear energy on a large scale, nor to find other ways of making plastics. And then we can no longer make the energy or the hardware to keep our civilization running, and handling that as best we can takes up all our (human and other) resources, and even if in principle there are scientific or technological breakthroughs that would solve that problem, we no longer have the bandwidth to make them.

The human race would survive, of course. But the modern highly technology-dependent world would be pretty much screwed.

(I am not claiming that the loss of 75% of the population would definitely do that. But it seems like it sure might.)

In response to comment by gjm on Lesswrong 2016 Survey
Comment author: knb 11 April 2016 02:04:54AM 0 points [-]

The following feels like at least a semi-plausible story.

It doesn't feel plausible to me. You don't need computer chips or oil to have industry and science. Industry + science would eventually progress back to modern capabilities, and probably faster than the first time around, since people would be rediscovering old knowledge preserved here and there.
