
Comment author: Daemon 16 September 2013 09:50:04PM *  0 points [-]

Fine, Eliezer, as someone who would really like to think/believe that there's Ultimate Truth (not based in perception) to be found, I'll bite.

I don't think you are steelmanning post-modernists in your post. Suppose I am a member of a cult X -- we believe that we can leap off of Everest and fly/not die. You and I watch my fellow cult-member jump off a cliff. You see him smash himself dead. I am so deluded ("deluded") that all I see is my friend soaring in the sky. You, within your system, evaluate me as crazy. I might think the same of you.

You might think that the example is overblown and this doesn't actually happen, but I've had discussions (mostly religious) in which other people and I would look at the same set of facts and see radically, radically different things. I'm sure you've been in such situations too. It's just that I don't find it comforting to dismiss such people as 'crazy/flawed/etc.' when they can easily do the same to me in their minds/groups, putting us in equivalent positions -- the other person is wrong within our own system of reference (which each side declares to be 'true' in describing reality) and doesn't understand it.

I think this ties in with http://lesswrong.com/lw/rn/no_universally_compelling_arguments/ .

Now, I'm not trying to be ridiculous or troll. I really, really want to think that there's one truth and that rationality -- and not some other method -- is the way to get to it. But at the very fundamental level (see http://lesswrong.com/lw/s0/where_recursive_justification_hits_bottom/ ), it seems like a choice between picking from various axioms.

I wish the arguments you presented here convinced me, I really do. But they haven't, and I have no way of knowing that I'm not in some matrix-simulated world where everything is, really, based on how my perception was programmed. How does this work for you -- do you just start off with the assumption that there is truth, and go from there? At some fundamental level, don't you believe that your perception just... works and describes reality 'correctly,' after adjusting for all the biases? Please convince me to pick this route, I'd rather take it, instead of waiting for a philosopher of perfect emptiness to present a way to view the world without any assumptions.

(I understand that 'everything is relative to my perception' gets you pretty much nowhere in reality. It's just that I don't have a way to perfectly counter that, and it bothers me. And if I did find all of your arguments persuasive, I would be concerned if that's just an artifact of how my brain is wired [crudely speaking] -- while some other person can read a religious text and, similarly, find it compelling/non-contradictory/'makes-sense-ey' so that the axioms this person would use wouldn't require explanation [because of that other person's nature/nurture]).

If I slipped somewhere myself, please steelman my argument in responding!

Comment author: duckduckMOO 21 July 2014 01:38:11AM *  -1 points [-]

The downvotes and no reply are a pretty good example of what's wrong with less wrong. Someone who is genuinely confused should not be shooed away then insulted when they ask again.

First of all, remember to do and be what's best. If this doubt is engendering good attitudes in you, why not keep it? The rest of this is premised on the doubt being unhelpful or harmful.

External reality is much more likely than being part of a simulation which adjusts itself to your beliefs, because a simulation which adjusts itself to your beliefs is way, way more complicated. It requires more assumptions than a single-level reality. If there's a programmer of your reality, that programmer has a reality too, which needs to be explained in the same way a single-level one does, as does their ability to program such a lifelike entity, and all sorts of other things.

More fundamentally though, this is just the reality you live in, whatever its position in a potential reality chain.

If we are being simulated, trying to metagame potential matrix lords' dispositions, ask for favours, look for loopholes, or care less about the simulation's contents is only a bug of human cognition. If this is a simulation, it is inhabited by at least me, and almost certainly many other people, and there are real consequences for all of us. If you don't earn your simulation rent you'll get kicked out of your simulation place. Qualify everything with "potentially simulated-" and it changes nothing. "Real" just isn't a useful (and so, important) distinction to make in the first person regarding simulations.

And/or you could short-circuit any debilitating doubt using fighting games or sports (or other similar activities), which illustrate the potential importance of leaning all in towards the evidence without worrying about the nature of things, and are a good way to train that habit.

Also, in this potentially simulated world, social pressure is a real thing. The more infallible and sensitive you make your thinking (or allow it to be), the more prone it is to interference from people who want to disrupt you, unless you're willing to cut yourself off from people to some extent. When someone gives you an idiotic objection (and there are a lot of those here), the more nuanced your own view actually is, the harder it will be to explain and the less likely people will listen fairly. You could just say whatever you think will influence them best, but that adds a layer of complexity and is another tradeoff. If you're not going to try to be a philosopher of perfect emptiness, taking external reality as an assumption is the most reliable way to work with your human mind without confusing it: how are you supposed to act if there are matrix lords? There's nothing to go on, so any leaning prompted by such beliefs (beliefs which shouldn't change your approaches or attitudes) is bound to be a bias.

Comment author: duckduckMOO 11 June 2014 05:20:04PM *  -1 points [-]

I think the "...and that's terrible" is pretty clearly implied. What exactly is wrong with the quote? It looks like you're dissecting a straightforward appeal to people's (stated or real) anti-unfairness values, as if it's a given that it's dishonest. I don't get it.

Comment author: Viliam_Bur 06 June 2014 05:18:23PM *  19 points [-]

I think people go to Slate Star Codex because that's where Scott writes his articles, not because of the voting mechanism.

From the paper:

authors of negatively evaluated content are encouraged to post more, and their future posts are also of lower quality

Seen that at LW a few times. At some moment the user's karma became so low they couldn't post anymore, or perhaps an admin banned them. From my point of view, problem solved.

I think it would be useful to distinguish between systems where the downvoted comments remain visible, and where the downvoted comments are hidden.

I am reading another website where the downvoted comments remain proudly visible, with the number of downvotes, and yes, it seems to enrage the users into writing more and more of the same stuff. My hypothesis is that some people perceive downvotes as rewards (maybe they love to make people angry, or they feel they are on a crusade and the downvotes mean they have successfully hurt the enemy), and these people are encouraged by downvoting. Hiding the comment, and removing the ability to comment, now that is a punishment.

Comment author: duckduckMOO 07 June 2014 10:57:40PM *  1 point [-]

"some people perceive downvotes as rewards"

Is this just a dig at people vehemently defending downvoted posts or are you serious in calling this a hypothesis?

Comment author: Viliam_Bur 25 August 2013 06:15:36PM 1 point [-]

Why is this quote upvoted?

Maybe because of this part:

Players [...] equated lots of dice rolling with the game being "more random" even though that contradicts the actual math.

Comment author: duckduckMOO 27 September 2013 03:49:49PM *  1 point [-]

Rolling 10 dice instead of one makes each roll less random. Rolling dice often instead of rarely makes the game more random. This game rolls dice for every attack, and not that many at a time. The dude said people complained about lots of dice rolling, not about rolling lots of dice. Yeah, obviously if you roll 10 dice it's less random than rolling one, but what are the chances that card game enthusiasts -- people geeky enough to play the Star Wars TCG -- don't understand that basic part of probability? It's far more likely that people were annoyed at the frequency of dice rolling, not the number of dice rolled each time. Which matches the reported complaints of the players. Not that I'd expect an accurate report of the players' positions when making excuses for why rolling dice in a card game is a bad idea.

Comment author: cody-bryce 03 August 2013 04:48:51AM 0 points [-]
Comment author: duckduckMOO 23 August 2013 06:41:24PM *  1 point [-]

Why shouldn't they be? The idea that if you don't rate yourself highly no one should is just an excuse for shitty instincts.

Obviously it's a useful piece of nonsense to tell yourself. People are more likely to come to your side if you are confident. But the explicit reasoning is reprehensible. (Not that any explicit reasoning probably went into it; it's such a common idea that it is repeated without thought. It's almost a universal applause light.)

This is more of an irrationality quote. A bit of paper-thin justification for a shitty but common sentiment which it's useful to adopt rather than notice.

Comment author: Particleman 02 August 2013 06:07:05AM 38 points [-]

In 2002, Wizards of the Coast put out Star Wars: The Trading Card Game designed by Richard Garfield.

As Richard modeled the game after a miniatures game, it made use of many six-sided dice. In combat, cards' damage was designated by how many six-sided dice they rolled. Wizards chose to stop producing the game due to poor sales. One of the contributing factors given through market research was that gamers seemed to dislike six-sided dice in their trading card game.

Here's the kicker. When you dug deeper into the comments, they equated dice with "lack of skill." But the game rolled huge numbers of dice. That greatly increased the consistency. (What I mean by this is that if you rolled a million dice, your chance of averaging close to 3.5 is much higher than if you rolled ten.) Players, though, equated lots of dice rolling with the game being "more random" even though that contradicts the actual math.
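The parenthetical claim -- that averaging more dice concentrates results near 3.5 -- is easy to check with a quick simulation (a Python sketch, not from the original thread; the trial count is an arbitrary choice):

```python
import random
import statistics

def mean_of_dice(n_dice, trials=20000):
    """Average of n_dice d6 rolls, repeated `trials` times."""
    return [sum(random.randint(1, 6) for _ in range(n_dice)) / n_dice
            for _ in range(trials)]

# Spread of the average shrinks roughly as 1/sqrt(n_dice),
# which is the "consistency" being described.
for n in (1, 10, 100):
    samples = mean_of_dice(n)
    print(n, round(statistics.stdev(samples), 3))
```

With one die the average swings over the whole 1-6 range; with a hundred dice it stays pinned near 3.5.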

Comment author: duckduckMOO 23 August 2013 05:58:42PM *  -3 points [-]

Unless you're rolling an impractical number of dice for every attack, having your attacks do random damage (and not 22-24 like in MMORPGs, but 1X-6X) is incredibly random. Even if you are rolling a ridiculous number of dice, the game can still be decided by one roll leaving a creature on the board or killing it by one or two points of damage.

What maths says that rolling dice doesn't make the game more random? Maybe he means the game is overall less random, but I don't see any argument for that, or any reference to evidence for that claim.
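The two senses of "random" being distinguished here can be separated with a small simulation (the damage numbers are hypothetical, assumed for illustration): one big die and ten small dice cover the same damage range, but have very different spreads, while either is more random than a diceless game's fixed damage.

```python
import random
import statistics

def attack_damage(n_dice, per_die, trials=20000):
    """Sample total damage from n_dice d6 rolls, each pip worth per_die points."""
    return [sum(random.randint(1, 6) for _ in range(n_dice)) * per_die
            for _ in range(trials)]

one_big_die = attack_damage(1, per_die=10)  # "1X-6X" style: 10-60 damage per attack
ten_dice = attack_damage(10, per_die=1)     # 10d6: same 10-60 range, same mean of 35

# Many dice per attack -> much narrower spread, but still not the zero
# spread of a fixed-damage (diceless) attack.
print(statistics.stdev(one_big_die))
print(statistics.stdev(ten_dice))
```

So "more dice per roll is less random" and "more rolls per game is more random" are both true; they answer different questions.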

If the reason for the game's failure was that people thought it lacked skill, the added randomness is not a decision to defend, even if people were slightly overestimating that randomness.

Having to roll dice in a card game is kind of a slap in the face too. In other card games you draw your cards, then make the most of them. There's zero randomness to worry about except right when you draw your card or your opponent draws theirs (and you are often happily ignorant of whether they play a card from their hand or one they just drew, except in certain circumstances). You can count cards and play based on what is left in your deck, or what you know is no longer in your deck.

Also, unlike miniature games, card games pretty much never start pre-deployed. You start with nothing on the board. If your turn one card kills his turn one card because of a dice roll then he has nothing on the board and you have a creature. In a miniature game if you kill more of his guys on turn one because of dice rolls you have an army that is that much smaller.

Why is this quote upvoted?

Comment author: duckduckMOO 29 June 2013 08:28:37AM 0 points [-]

The obvious guess is that theists are more comfortable imagining their decisions to be, at least in principle, completely predictable, and so don't fight the hypothetical. Perhaps atheists are more likely to think they can trick Omega because they are not familiar and comfortable with the idea of a magic mind reader, so they don't tend to properly integrate the stipulation that Omega is always right.

Comment author: AspiringRationalist 26 April 2013 03:41:40AM *  3 points [-]

God doesn't value self-modification. God values faith. One of the properties of faith is that self-modification cannot create faith that did not previously exist.

You seem to be privileging the Abrahamic hypothesis. Of the vast space of possible gods, why would you expect that variety to be especially likely?

Comment author: duckduckMOO 27 April 2013 01:13:15PM *  0 points [-]

Hell is an Abrahamic (Islamic/Christian only, I think) thing. To the extent that we should automatically discount inferences about a god's personality based on Christianity/Islam, we should also discount the possibility of hell.

In response to Pascal's wager
Comment author: duckduckMOO 22 April 2013 07:45:33AM *  2 points [-]

Is the spacing less annoying now? It wasn't random: it had 4 gaps between topics, 2 between points, and one in a few minor places where I just wanted to break it up. The selection of that scheme was pretty much arbitrary though. I just spaced it like I would read it out loud. Which was kind of stupid. I can't expect people to read it in my voice. Anyway, is this any better?

Got rid of the "and I think quite good." I just meant I liked it enough to want to share it in a discussion post. I assume that's not the interpretation that was annoying people. How did people read it that made it a crackpot signal?

Pascal's wager

-10 duckduckMOO 22 April 2013 04:41AM

I started this as a comment on "Being half wrong about Pascal's wager is even worse", but it's really long, so I'm posting it in discussion instead.


Also, I illustrate here using negative examples (hell and equivalents) for the sake of followability, and am a little worried about inciting some paranoia, so I'm reminding you here that every negative example has an equal and opposite positive partner. For example, Pascal's wager has the opposite where accepting sends you to hell; it also has the opposite where refusing sends you to heaven. I haven't mentioned any positive equivalents or opposites below. Also, all of these possibilities are effectively 0, so don't worry.


"For so long as I can remember, I have rejected Pascal's Wager in all its forms on sheerly practical grounds: anyone who tries to plan out their life by chasing a 1 in 10,000 chance of a huge pay-off is almost certainly doomed in practice.  This kind of clever reasoning never pays off in real life..."


Pascal's wager shouldn't be in the reference class of real life. It is a unique situation that would never crop up in real life as you're using the term. In the world in which Pascal's wager is correct, you would still see people who plan out their lives on a 1 in 10000 chance of a huge pay-off fail 9999 times out of 10000. Also, this doesn't work for actually excluding Pascal's wager: if Pascal's wager starts off excluded from the category "real life", you've already made up your mind, so this cannot quite be the actual order of events.


In this case, 9999 times out of 10000 you waste your Christianity, and 1 time in 10000 you avoid going to hell for eternity, which is, at a vast understatement, much more than 10000 times as bad as worshipping God, even counting the sanity it costs to force a change in belief, the damage done to your psyche by living as a victim of self-inflicted Stockholm syndrome, and any other non-obvious cost. With these premises, choosing to believe in God produces infinitely better consequences on average.
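The expected-value comparison being granted here (before the premises are rejected below) can be written out with made-up finite utilities. Every number in this sketch is hypothetical, and a merely very large cost stands in for an infinite one:

```python
# All values are illustrative assumptions, not claims about actual utilities.
p_hell_avoided = 1 / 10000   # the post's hypothetical probability the wager is true
cost_of_belief = 1.0         # utility lost by forcing yourself to believe
cost_of_hell = 10**9         # finite stand-in for "much worse than 10000x the cost"

ev_believe = -cost_of_belief                 # you always pay the belief cost
ev_refuse = -p_hell_avoided * cost_of_hell   # small chance of the huge loss

# With these premises, believing wins on average; the argument below
# attacks the premises (the relevant probability), not this arithmetic.
assert ev_believe > ev_refuse
print(ev_believe, ev_refuse)
```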


Luckily the premises are wrong. 1/10000 is about 1/10000 too high for the relevant probability. Which is:

the probability that the wager or an equivalent (anything whose acceptance would prevent you going to hell is equivalent) is true

minus

the probability that its opposite or an equivalent (anything which would send you to hell for accepting is equivalent) is true


Equivalence here refers to what behaviours it punishes or rewards. I used hell because it is in the most popular wager, but this applies to all wagers. To illustrate: if it's true that there is one god, ANTIPASCAL GOD, and he sends you to hell for accepting any Pascal's wager, then that's equivalent to any Pascal's wager you hear having an opposite (no more "or equivalent"s will be typed, but they still apply) which is true, because if you accept any Pascal's wager you go to hell. Conversely, if PASCAL GOD is the only god and he sends you to hell unless you accept any Pascal's wager, that's equivalent to any Pascal's wager you hear being true.


The real trick to Pascal's wagers is the idea that they're generally no more likely than their opposites. For example, there are lots of good, fun reasons to assign the Christian Pascal's wager a lower probability than its opposite, even engaging on a Christian level:


Hell is a medieval invention/translation error: the eternal torture thing isn't even in the modern Bibles.

The belief-or-hell rule is hella evil, and it gains credibility from the same source (Christians, not the Bible) who also claim, as a more fundamental belief, that God is good, which directly contradicts the belief-or-hell rule.

The Bible claims that God hates people eating shellfish, taking his name in vain, and jealousy. Apparently taking his name in vain is the only unforgivable sin. So if they're right about the evil stuff, you're probably going to hell anyway.

It makes no sense that God would care enough about your belief and worship to consign people to eternal torture, but not enough to show up once in a while.

It makes no sense to reward people for dishonesty.

The evilness really can't be overstated. Eternal torture as a response to a mistake which is at worst due to stupidity (but actually not even that: just a stacked-deck scenario) outdoes pretty much everyone in terms of evilness. It's worse than pretty much every fucked-up thing every other god is reputed to have done, put together. The psychopath in the Bible doesn't come close to coming close to this.


The problem with the general case of religious Pascal's wagers is that people make stuff up (usually unintentionally), and which made-up stuff gains traction has nothing to do with what is true. When both Christianity and Hinduism are taken seriously by millions (as were the Roman/Greek gods, the Viking gods, the Aztec gods, and all sorts of other gods at different times, by large percentages of people), mass religious belief is zero evidence. At most one religion set (e.g. Greek/Roman, Christian/Muslim/Jewish, etc.) is even close to right, so at least the rest are popular independently of truth.


The existence of a religion does not elevate the possibility that the god it describes exists above the possibility that the opposite exists, because there is no evidence that religion has any accuracy in determining the features of a god, should one exist.


You might intuitively lean towards religions having better than zero accuracy if a god exists, but remember there's a lot of fictional evidence out there to generalise from. It is a matter of judgement here: there's no logical proof of zero or worse accuracy (other than it being the default, and the lack of evidence), but negative accuracy is a possibility, and you've probably played priest classes in video games, or just seen how respected religions are, and been primed to overestimate religion's accuracy in that hypothetical. Also, if there is a god, it has not shown itself publicly in a very long time, or ever, so it seems to have a preference for not being revealed. Also, humans tend to be somewhat evil and read into others what they see in themselves, and I assume any high-tier god (one that had the power to create and maintain a hell, detect disbelief, preserve immortal souls, and put people in hell) would not be evil. I think without bad peers or parents there's no reason to be evil. I think people are mostly evil in relation to other people: being evil or totally unscrupulous has benefits among humans which a god would not get. So I give religions a slight positive accuracy in the scenario where there is a god, but it does not exceed the priors against Pascal's wager (another one being that wagers are pettily human), or perhaps even the god's desire to stay hidden.


Even if God itself whispered Pascal's wager in your ear, there is no incentive for it to actually carry out the threat:


There is only one iteration.


These threats aren't being made in person by the deity. They are either second-hand or independently discovered, so:

The deity has no use for making the threat true in order to claim it more believably, as it might if it were an imperfect liar (at a level detectable by humans) that made the threats in person.

The deity has total plausible deniability.

Which adds up to all of the benefits of the threat having already been extracted by the time the punishment is due, and no possibility of a rep hit (which wouldn't matter anyway).


So, all else being equal, i.e. unless the god is the god of threats or of Pascal's wagers (whose opposites are equally likely):


If God is good (+EV on human happiness, -EV on human sadness, that sort of thing), actually carrying out the threats has negative value.

If God is scarily-doesn't-give-a-shit-neutral to humans, it still has no incentive to actually carry out the threat, and doing so has a nonzero energy cost.

If God gives the tiniest, most infinitesimal shit about humans, its incentive to actually carry out the threat is negative.


If God is evil you're fucked anyway:

The threat gains no power by being true, so the only incentive a god can have for following through is that it values human suffering. If it does, why would it not send you to hell even if you believed in it? (Remember that the god of commitments is as likely as the god of breaking commitments.)


Despite the increased complexity of a human mind, I think the most likely motivational system for a god which would make it honour the wager (not saying it's at all likely, just that all the others are obviously wrong) is that the god thinks like a human and therefore would keep its commitment out of spite or gratitude or some other human reason. So here's why I think that one is wrong. It's generalizing from fictional evidence: humans aren't that homogeneous (and one without peers would be less so), and if a god gains likelihood of keeping a commitment from humanness, it also gains not-designed-to-be-evil-ness that would make it less likely to make evil wagers in the first place. It also has no source for spite or gratitude, having no peers. Finally, could you ever feel spite towards a bug? Or gratitude? We are not just ants compared to a god; we're ant-ant-ant-etc-ants.


Also, there are reasons that refusing can actually get you in trouble: bullies don't get nicer when their demands are met. It's often not the suffering they're after but the dominance, at which point the suffering becomes an enjoyable illustration of that dominance. As we are ant-ant-etc-ants this probability is lower, but the fact that we aren't all already in hell suggests that if God is evil, it is not raw suffering that it values. Hostages are often executed even when the ransom is paid. Even if it is evil, it could be any kind of evil: its preferences cannot have been homogenised by memes and consensus.


There's also the rather cool possibility that if a human-like god is sending people to hell, maybe it's for lack of understanding. If it wants belief it can take it more effectively than this. If it wants to hurt you it will hurt you anyway. Perhaps, peerless, it was never prompted to think through the consequences of making others suffer. Maybe a god, in the absence of peers, just needs someone to explain that it's not nice to let people burn in hell for eternity. I for one remember suddenly realising that those other fleshbags hosted people. I figured it out for myself, but if I grew up alone as the master of the universe, maybe I would have needed someone to explain it to me.

