Trevor_Blake comments on Why do theists, undergrads, and Less Wrongers favor one-boxing on Newcomb? - Less Wrong

Post author: CarlShulman | 19 June 2013 01:55AM | 15 points


Comment author: [deleted] 19 June 2013 05:23:46AM *  4 points [-]

Is the Predictor omniscient or making a prediction?

A tangent: when I worked at a teen homeless shelter there would sometimes be a choice for clients to get a little something now or more later. Now won every time, later never. Anything close to a bird in hand was valued more than a billion ultra birds not in the hand. A lifetime of being betrayed by adults, or poor future skills, or both and more might be why that happened. Two boxes without any doubt for those guys. As Predictors they would always predict two boxes and be right.

Comment author: Decius 19 June 2013 05:43:33AM 1 point [-]

He makes a statement about the future which, when evaluated, is true. What's the difference between accurate predictions and omniscience?

On that tangent: WTF? Who creates a system in which they can offer either some help now, or significantly more later, unless they are malicious or running an experiment?

Comment author: ArisKatsaris 19 June 2013 10:23:31AM *  8 points [-]

He makes a statement about the future which, when evaluated, is true. What's the difference between accurate predictions and omniscience?

So when I look at the source code of a program and state "this program will throw a NullPointerException when executed" or "this program will go into an endless loop" or "this program will print out 'Hello World'", am I being omniscient?

Look, I'm not discussing Omega or Newcomb here. Did you just call ME omniscient because in real life I can predict the outcome of simple programs?
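[A minimal sketch of the kind of trivially predictable programs the comment describes, written in Python rather than Java for brevity; the function names are illustrative, not from the original:]

```python
# Three toy programs whose outcomes can be read straight off the source,
# no omniscience required.

def always_raises():
    x = None
    return len(x)  # len(None) raises TypeError on every run


def always_loops():
    # Never returns: an endless loop, predictable from the source alone.
    while True:
        pass


def always_greets():
    return "Hello World"  # prints/returns the same string every run
```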

Comment author: Decius 19 June 2013 05:40:04PM -1 points [-]

You are wrong. There could be a power failure at some point when that program runs, and you have not identified when those failures will occur.

Comment author: ArisKatsaris 19 June 2013 07:08:26PM *  6 points [-]

You are nitpicking. Fine, let's say that Omega is likewise incapable of detecting whether you'll have a heart-attack or be eaten by a pterodactyl. He just knows whether your mind is set on one-boxing or two-boxing.

Did this just remove all your objections about "omniscience" and Newcomb's box, since Omega has now been established to not know if you'll be eaten by a pterodactyl before choosing a box? If so, I suggest we make Omega being incapable of determining death-by-pterodactyl a permanent feature of Omega's character.

Comment author: Nornagest 19 June 2013 05:55:23AM *  8 points [-]

WTF? Who creates a system in which they can offer either some help now, or significantly more later, unless they are malicious or running an experiment?

Situations where you can get something now or something better later but not both come up all the time as consequences of growth, investment, logistics, or even just basic availability issues. I expect it would usually make more sense to do this analysis yourself and only offer the option that does more long-term good, but if clients' needs differ and you don't have a good way of estimating, it may make sense to allow them to choose.

Not that it's much of an offer if you can reliably predict the way the vast majority of them will go.

Comment author: Decius 19 June 2013 05:22:45PM -1 points [-]

If you can make the offer right now, you don't have capital tied up in growth, investment, or logistics. Particularly since what you have available now doesn't cover the current need - all of it will be taken by somebody.

Comment author: [deleted] 19 June 2013 09:20:41AM 2 points [-]

If the Predictor is accurate or omniscient, then the game is rigged and it becomes a different problem. If the Predictor is making guesses then box predicting and box selecting are both interesting to figure out.

WTF? Who creates a system in which they can offer either some help now, or significantly more later, unless they are malicious or running an experiment?

Or you live in a system nobody in particular created (capitalism) and work at a social service with limited resources and a clientele who have no background experience with adults who can be trusted. An employer telling them "work now and I'll pay you later" is not convincing, while a peanut butter sandwich right now is.

Comment author: Decius 19 June 2013 05:31:54PM 2 points [-]

How about "Here's a sandwich, if you work for me there I will give you another one at lunchtime and money at the end of the day."

It's the case where the immediate reward is so much smaller than the delayed reward, yet the two are still mutually exclusive, that confuses me, not the discounting due to lack of trust.

Comment author: [deleted] 19 June 2013 05:27:54AM 0 points [-]

What does always choosing some now over more later have to do with Newcomb's problem?

Comment author: CAE_Jones 19 June 2013 07:34:18AM 3 points [-]

What does always choosing some now over more later have to do with Newcomb's problem?

Simply stating that box B either contains a million dollars or nothing will make people see the million dollars as more distant than the guaranteed thousand in box A, I imagine. That the probabilities reduce that distance to negligible matters only if the person updates appropriately on that information.
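[The update being described can be made concrete with a quick expected-value sketch. Assuming, hypothetically, a 99%-accurate Predictor and the standard $1,000 / $1,000,000 payoffs (the original problem leaves the accuracy open):]

```python
# Hypothetical expected-value sketch for Newcomb's problem.
# ACCURACY = 0.99 is an assumption for illustration, not part of
# the problem statement.
ACCURACY = 0.99
SMALL, BIG = 1_000, 1_000_000

# One-boxing: with probability ACCURACY the Predictor foresaw it
# and filled box B; otherwise box B is empty.
ev_one_box = ACCURACY * BIG + (1 - ACCURACY) * 0

# Two-boxing: with probability ACCURACY the Predictor foresaw it
# and left B empty; otherwise you get both boxes' contents.
ev_two_box = ACCURACY * SMALL + (1 - ACCURACY) * (SMALL + BIG)
```

Under these assumed numbers one-boxing dominates by nearly two orders of magnitude, which is exactly the gap that only matters if the person actually updates on the stated probabilities.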