
Comment author: Jiro 31 October 2014 09:10:22AM 0 points [-]

and if you are a normal person then you shrug your shoulders, say "damn, that's too bad", and get on with your life; but if you are infused with a sense of heroic responsibility then you devote your life to...

If you're a normal person, shrugging your shoulders when faced with such things is beneficial: it serves as immunity against crazy ideas, and because you're running on corrupted hardware, you probably aren't as good at figuring out how to avoid the destruction of civilization as you think.

Just saying "I'm not going to shrug my shoulders; I'm going to be heroic instead" is removing the checks and balances that are irrational themselves but protect you against bad rationality of other types, leaving you worse off overall.

Comment author: gjm 31 October 2014 11:15:02AM 0 points [-]

I am inclined to agree; I am not a fan of the idea of "heroic responsibility". (Though I think most of us could stand to be a notch or two more heroic than we currently are.)

Comment author: private_messaging 30 October 2014 01:22:17PM *  -1 points [-]

To say that you're underconfident is to say that you're correct more often than you believe yourself to be correct. That is not a claim underconfident people tend to make: they usually don't muster enough confidence about their tendency to be right to conclude that they're underconfident.

Comment author: gjm 31 October 2014 02:39:41AM 2 points [-]

It's self-contradictory only in the same way as "I believe a lot of false things" is. (Maybe a closer analogy: "I make a lot of mistakes.") In other words, it makes a general claim that conflicts with various (unspecified) particular beliefs one has from time to time.

I am generally underconfident. That is: if I look at how sure I am about things (measured by how I feel, what I say, and in some cases how willing I am to take risks based on those opinions), with hindsight it turns out that my confidence is generally too low. In some sense, recognizing this should automatically increase my confidence levels until they stop being too low -- but in practice my brain doesn't work that way. (I repeat: in some sense it should, and that's the only sense in which saying "I am generally underconfident" is self-contradictory.)

I make a lot of mistakes. That is: if I look at the various things I have from time to time believed to be true, with hindsight it turns out that quite often those beliefs are incorrect. It seems likely that I have a bunch of incorrect current beliefs, but of course I don't know which ones they are.

(Perhaps I've introduced a new inconsistency by saying both "I am generally underconfident" and "I make a lot of mistakes". As it happens, on the whole I think I haven't; in any case that's a red herring.)

Comment author: Lumifer 31 October 2014 01:29:02AM *  0 points [-]

See, you're ignoring the qualifier 'sane' again.

Well, would you like to define it, then? I am not sure I understand your use of this word. In particular, does it involve any specific set of values?

It is not obvious to me that heroic responsibility implies that a thing should be done without cost/benefit analysis or at any cost.

Things done on the basis of cost-benefit analysis are just rational things to do. The "heroic" part must stand for something, no?

I just happen to be fine with most values systems.

Ahem. Most out of which set? Are there temporal or geographical limits?

Is there a specific mechanism by which reducing government power would do good?

That's a complicated discussion that should start with what is meant by "good" (we're back to value systems again), maybe we should take it up another time...

Comment author: gjm 31 October 2014 02:19:11AM 1 point [-]

[...] just rational things to do. The "heroic" part must stand for something, no?

I had always assumed it was intended to stand for doing things that are rational even if they're really hard or scary and unanticipated.

If you do a careful cost-benefit calculation and conclude (depending on your values and beliefs) that ...

  • ... the biggest risk facing humanity in the nearish future is that of a runaway AI doing things we really don't want but are powerless to stop, and preventing this requires serious hard work in mathematics and philosophy and engineering that no one seems to be doing; or
  • ... most of the world's population is going to spend eternity in unimaginable torment because they don't know how to please the gods; or
  • ... there are billions of people much, much worse off than you, and giving away almost everything you have and almost everything you earn will make the world a substantially better place than keeping it in order to have a nicer house, better food, more confidence of not starving when you get old, etc.

and if you are a normal person then you shrug your shoulders, say "damn, that's too bad", and get on with your life; but if you are infused with a sense of heroic responsibility then you devote your life to researching AI safety (and propagandizing to get other people thinking about it too), or become a missionary, or live in poverty while doing lucrative but miserable work in order to save lives in Africa.

If it turns out that you picked as good a cause as you think you did, and if you do your heroic job well and get lucky, then you can end up transforming the world for the better. If you picked a bad cause (saving Germany from the Jewish menace, let's say) and do your job well and get lucky, you can (deservedly) go down in history as an evil genocidal tyrant and one of the worst people who ever lived. And if you turn out not to have the skill and luck you need, you can waste your life failing to solve the problem you took aim at, and end up neither accomplishing anything of importance nor having a comfortable life.

So there are reasons why most people don't embrace "heroic responsibility". But the premise for the whole thing -- without which there's nothing to be heroically responsible about -- is, it seems to me, that you really think that this thing needs doing and you need to do it and that's what's best for the world.

("Heroic responsibility" isn't only about tasks so big that they consume your entire life. You can take heroic responsibility for smaller-scale things too, if they present themselves and seem important enough. But, again, I think what makes them opportunities for heroic responsibility is that combination of importantly worth doing and really intimidating.)

Comment author: Luke_A_Somers 30 October 2014 02:52:20PM 1 point [-]

Meh. If quantum gravity could do it, then any other quantum force could do it.

Comment author: gjm 30 October 2014 07:10:49PM 2 points [-]

I don't think we know anywhere near enough about quantum gravity to be sure of that.

Not that I'd be super-optimistic about "quantum gravitational computers" actually being any use relative to ordinary quantum computers -- but in the absence of an actual working quantum theory of gravity I don't see how we can know they wouldn't make a difference in calef's hypothetical world.

In response to Academic papers
Comment author: gjm 30 October 2014 07:04:10PM 4 points [-]

Pure unhelpful nitpickery: Your title should say "Academic", not "Accadmic". It doesn't really matter, but since it's sitting there in big letters on the main Discussion page it might be worth fixing up.

Comment author: ChristianKl 30 October 2014 01:37:57PM 3 points [-]

Cialdini's book is mostly the third, with a little touch of the second.

And read by people who want to read the first ;)

Comment author: gjm 30 October 2014 07:02:37PM 0 points [-]

And also by people who want to read the second or the third. But yes, of course, writing for one audience won't stop others taking advantage.

Comment author: Dias 30 October 2014 02:54:47AM 4 points [-]

Suppose I were an unusually moral, unusually insightful used-car saleswoman. I have studied the dishonest sales techniques my colleagues use and, because I am unusually wise, worked out the general principles behind them. I think it is plausible that this analysis is new, though I guess it could already exist in an obscure journal.

Is it moral of me to publish this research, or should I practice the virtue of silence?

  • It might help people resist such techniques.
  • It might help salesmen employ these immoral techniques better.
  • Salesmen are more likely to already understand much of the content; vulnerable outsiders would have more to learn.
  • Salesmen are more incentivized to learn from my analysis.
  • It is quite interesting to read as a purely abstract matter.
  • I like producing and sharing interesting research.

Obviously the dishonest car salesman is just an example, so don't get too hung up on the efficiency of the second-hand car market.

Comment author: gjm 30 October 2014 01:32:37PM 2 points [-]

Robert Cialdini did something a bit like this in researching his book "Influence", and so far as I can tell pretty much everyone agrees it's a good thing he wrote it.

I suspect attitudes to your doing this would depend on what your publication looked like. You could write

  • a book called "Secrets of Successful Second-hand Sales", aimed at used car salespeople, advising them on how to manipulate their customers;
  • a book called "Secrets of the Sinister Second-hand Sellers", aimed at used car buyers, advising them on what sort of things they should expect to be done to them and how to see through the bullshit and resist the manipulation;
  • a book called "A Scientific Study of Second-hand Sales Strategies", aimed at psychologists and other interested parties, presenting the information neutrally for whatever use anyone wants to make.

(As an unusually moral person you probably wouldn't actually want to write the first of those books. But some others in a similar situation might.)

My gut reaction to the first would be "ewww", to the second would be "oh, someone trying to drum up sales by attention-grabbing hype", and to the third would be "hey, that's interesting". Other people's guts may well differ from mine. Cialdini's book is mostly the third, with a little touch of the second.

Comment author: Eliezer_Yudkowsky 30 October 2014 12:05:29AM 1 point [-]

Just noticed... Why does this say 'instrumental' rationality? When was that decided, when did it start? I originally suggested the rationality diaries with the intent of them being both epistemic and instrumental.

Comment author: gjm 30 October 2014 01:31:04AM 1 point [-]

First rationality diary I can find on LW: 2012-05-14. This one already says "instrumental". It doesn't say it was your idea and seems to imply it was someone else's (which might, among several possibilities, indicate that it isn't actually descended from the original suggestion you have in mind here).

Comment author: sixes_and_sevens 29 October 2014 05:57:00PM 2 points [-]

Making non-trivial posts carries psychological costs that I feel quite acutely. I would love to be able to plough through this (cf. Comfort Zone Expansion) by making a lot of non-trivial posts.

Unfortunately, making non-trivial posts also carries time costs that I feel quite acutely. I have quite fastidious editorial standards that make writing anything quite time-consuming (you would be alarmed at how much time I've spent writing this response), and this is compounded by engaging in long, sticky discussions.

The Weird Alliances post was an attempt to write something quickly, to a lower standard, and as a result it was of lower quality than I would have liked. This made the psychological cost greater. I've yet to figure out how to unknot this perverse trade-off between psychological and time costs, but it means I would prefer to space out making posts.

Comment author: gjm 29 October 2014 11:12:02PM 2 points [-]

Ah, OK, understood. Best of luck with the unknotting. (I'd offer advice, but I have much the same problem myself.)

Comment author: sixes_and_sevens 29 October 2014 11:27:26AM 1 point [-]

This seems like a good premise for a post inviting people to contribute their own "magic phrases". Sadly, I've used up my Discussion Post powers by making an idle low-quality post about weird alliances last week. I now need to rest in my crypt for a week or so until people forget about it.

Comment author: gjm 29 October 2014 05:11:09PM 0 points [-]

I've used up my Discussion Post powers [...] I now need to rest in my crypt [...]

OK, I'm confused. (Probably because I'm missing a joke.) Reading the above in isolation I'd take it as indicating that you posted something that got you a big ball o' negative karma, which brought you below some threshold that meant you couldn't post to Discussion any more.

Except that your "weird alliances" post is at +7, and your total karma is over 4k, and your last-30-days karma is over 200, and none of your posts or comments in the last week or so is net negative, and those are all very respectable numbers and surely don't disqualify anyone from doing anything.

So, as I say, I seem to be missing a joke. Oh well.
