Comment author: FAWS 23 May 2011 04:59:29PM 1 point [-]

Assume no "future peeking" and Omega only correctly predicting people as difficult to predict as you with 99.9% probability. One-boxing still wins.
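The arithmetic behind that claim can be checked directly. A minimal sketch, assuming the standard payoffs ($1,000 in box A, $1,000,000 in box B iff one-boxing was predicted) and taking the 99.9% accuracy as a stipulated given:

```python
# Expected value of each choice when Omega predicts your decision
# correctly with probability p = 0.999 (stipulated, not inferred).
p = 0.999

# One-boxers get $1,000,000 when predicted correctly, $0 otherwise.
ev_one_box = p * 1_000_000 + (1 - p) * 0

# Two-boxers get $1,000 when predicted correctly (B is empty),
# $1,001,000 when Omega guessed wrong and filled B anyway.
ev_two_box = p * 1_000 + (1 - p) * 1_001_000

# ev_one_box is roughly 999,000 versus roughly 2,000 for two-boxing.
```

Under these assumptions one-boxing wins by a factor of about 500, so the conclusion is insensitive to modest changes in the accuracy figure.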

Comment author: rstarkov 29 August 2011 04:38:01PM 0 points [-]

While I disagree that one-boxing still wins, I'm most interested in seeing the "no future peeking" and the actual Omega success rate being defined as givens. It's important that I can rely on the 99.9% value, rather than wondering whether it is perhaps inferred from their past 100 correct predictions (which could, with a non-negligible probability, have been a fluke).

Comment author: Amanojack 22 May 2011 05:24:08PM *  -1 points [-]

Newcomb's Problem is silly. It's only controversial because it's dressed up in wooey vagueness. In the end it's just a simple probability question and I'm surprised it's even taken seriously here. To see why, keep your eyes on the bolded text:

Omega has been correct on each of 100 observed occasions so far - everyone [on each of 100 observed occasions] who took both boxes has found box B empty and received only a thousand dollars; everyone who took only box B has found B containing a million dollars.

What can we anticipate from the bolded part? The only actionable belief we have at this point is that 100 out of 100 times, one-boxing made the one-boxer rich. The details that the boxes were placed by Omega and that Omega is a "superintelligence" add nothing. They merely confuse the matter by slipping in the vague connotation that Omega could be omniscient or something.

In fact, this Omega character is superfluous; the belief that the boxes were placed by Omega doesn't pay rent any differently than the belief that the boxes just appeared at random in 100 locations so far. If we are to anticipate anything different knowing it was Omega's doing, on what grounds? It could only be because we were distracted by vague notions about what Omega might be able to do or predict.

The following seemingly critical detail is just more misdirection and adds nothing either:

And the twist is that Omega has put a million dollars in box B iff Omega has predicted that you will take only box B.

I anticipate nothing differently whether this part is included or not, because nothing concrete is implied about Omega's predictive powers - only "superintelligence from another galaxy," which certainly sounds awe-inspiring but doesn't tell me anything really useful (how hard is predicting my actions, and how super is "super"?).

The only detail that pays any rent is the one above in bold. Eliezer is right that one-boxing wins, but all you need to figure that out is Bayes.

EDIT: Spelling

Comment author: rstarkov 22 May 2011 06:28:06PM *  -2 points [-]

It's only controversial because it's dressed up in wooey vagueness

I also happen to think that the under-specification of this puzzle adds significantly to the controversy.

What the puzzle doesn't tell us is the properties of the universe in which it is set - namely, whether the universe permits the future to influence the past, which I'll refer to as "future peeking".

(Alternatively, whether the universe somehow allows someone within it to precisely simulate the future faster than it actually unfolds - a proposition I don't believe can hold in any mathematically defined universe.)

This is important because if the future can't influence the past, then it is known with absolute certainty that taking two boxes can't possibly change what's in them (this is, after all, a basic given of the universe). Whether Omega has predicted something before is completely irrelevant once the boxes are placed.

Alas, we aren't told what the universe is like. If that is intentionally part of the puzzle then the only way to solve it would be to enumerate all possible universes, assigning each one a probability of being ours based on all the available evidence, and essentially come up with a probability that "future peeking" is impossible in our universe. One would then apply simple arithmetic to calculate the expected winnings.

Unfortunately, P("future peeking allowed") is one of those probabilities that is completely incalculable for any practical purpose. Thus if "no future peeking" isn't a given, the best answer is "I don't know whether taking two boxes is best, because there's one probability I can't actually calculate in practice".
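That "simple arithmetic" can be written out explicitly. The following sketch is my own framing, not part of the puzzle: let q be the probability that "future peeking" holds (so box B's contents perfectly track your actual choice), and r the probability that B holds $1,000,000 anyway when the contents are fixed in advance. Both parameters are hypothetical.

```python
def expected_winnings(q, r):
    """Expected payoff of (one-box, two-box).

    q: probability that "future peeking" holds, so box B's contents
       perfectly track your actual choice (hypothetical parameter).
    r: probability that box B holds $1,000,000 anyway when its contents
       are fixed in advance (hypothetical parameter).
    """
    ev_one = q * 1_000_000 + (1 - q) * r * 1_000_000
    ev_two = q * 1_000 + (1 - q) * (r * 1_000_000 + 1_000)
    return ev_one, ev_two

# The r-terms cancel in the difference, so one-boxing wins whenever
# q * 999_000 > (1 - q) * 1_000, i.e. whenever q > 1/1000.
ev_one, ev_two = expected_winnings(0.5, 0.5)  # (750000.0, 251000.0)
```

Notably, under this framing even a 0.1% credence in "future peeking" is enough to tip the decision toward one-boxing, which is why the incalculability of q matters less than it might seem.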

Comment author: rstarkov 25 March 2011 11:37:06AM 2 points [-]

I have found that a logical approach like this one fails far more often than it succeeds, simply because people manage either to distrust reason itself or to doubt the validity of the (more or less obvious) inferences involved.

Additionally, belief is so emotional that even people who see all the logic, and truly seem to appreciate that believing in God is completely silly, still can't rid themselves of the belief. It's like someone who knows household spiders are not dangerous in any way and yet is more terrified of them than of, say, an elephant.

Perhaps what's needed in addition to this is a separate "How to expel the idea of god from your brain" guide. It would collect practical advice from various self-admitted ex-believers. Importantly, I think people who have never believed should avoid contributing to such a guide unless they have reason to believe they possess extraordinary insight into a believer's mind.

Comment author: rstarkov 25 March 2011 11:42:53AM 1 point [-]

To expand a bit on the first paragraph: I feel that such reasonable arguments are, to many people, about what the proof of the Poincaré conjecture is to me. I fully understand the proposition, but I'm not nearly smart enough to follow the proof well enough to be confident it's right.

Importantly, I can also follow the outline of the proof, to see how it's intended to work, but this is of course insufficient to establish the validity of the proof.

So the only real reason I happen to trust this proof is that I already have a pre-established trust in the community who reviewed the proof. But of course the same is also true of a believer who has a pre-established trust in the theist community.

So the guide would require a section on "how to pick authorities to trust", which would explain why it's necessary (impractical to verify everything yourself) and why the scientific community is the best one to trust (highest rate of successful predictions and useful conclusions).

Comment author: AlephNeil 25 March 2011 01:46:09AM *  4 points [-]

I can understand 'supernatural' in the context of the Game Of Life: the user ('God' if you like) violates the laws of physics (in this case by anomalously switching some cells on or off).

Can't we give pretty much the same definition of 'supernatural' in the real world? (Relative to the 'true laws of physics' whatever they are.)

Perhaps we could say that a supernatural event is one that "increases the algorithmic information content of the universe", though formulating this precisely would require some care.
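That notion of a user-level intervention is easy to make concrete. A minimal sketch (my own illustration, not anyone's formalization): the update rule plays the role of physics, and a toggle from outside the rule plays the role of the supernatural.

```python
from collections import Counter

def step(alive):
    """One lawful Game of Life update on a set of live (x, y) cells."""
    counts = Counter((x + dx, y + dy)
                     for (x, y) in alive
                     for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                     if (dx, dy) != (0, 0))
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in alive)}

# A blinker oscillates forever under the "laws of physics" of the grid.
blinker = {(0, 1), (1, 1), (2, 1)}
lawful = step(blinker)          # {(1, 0), (1, 1), (1, 2)}

# A "supernatural" event: the user reaches in from outside and switches
# on a cell that no lawful evolution of this state would have produced.
miracle = lawful | {(10, 10)}
```

The anomalous cell is exactly the kind of event that "increases the algorithmic information content" of the grid: the lawful trajectory is compressible from the initial state and the rule, while the intervention is not.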

Comment author: rstarkov 25 March 2011 03:54:18AM 4 points [-]

Of course, I'd argue that the Game of Life is not an isolated universe if one can toggle cells in it; and if you consider the whole system, there's nothing supernatural about the process of cells being toggled.

But this is a good example. I asked about what others mean by "supernatural" and this sounds very close indeed!

Comment author: Manfred 25 March 2011 12:08:35AM 7 points [-]

It's also often used to mean "not material," or "not obeying the same laws as most stuff we see." So the stuff we see around us is the "natural," and things that don't follow natural law are "supernatural."

I suspect most supernatural beliefs are non-reducible simply because they're made up by people, and people don't generally think in terms of reductionism. I prefer the term "magic" for complicated, human-interacting, ontologically basic things.

Comment author: rstarkov 25 March 2011 12:39:43AM 1 point [-]

Sounds like a reasonable way of putting it. So a weapon shooting invisible (to the human eye) bullets would be classified as "supernatural" by someone from the stone age, because to them, killing someone requires direct contact with a visible weapon or projectile, that has appreciable travel time. Right?

Although "hard science" would have to be excluded from this, even though it contains lots of stuff that doesn't obey the same laws as most stuff we see.

Comment author: ShardPhoenix 24 March 2011 11:47:59PM 3 points [-]

I didn't downvote and it's not an inappropriate topic, but the post is a bit rambling. More thought and editing could allow you to figure out what you're really trying to say, and hence tighten up the post.

Comment author: rstarkov 24 March 2011 11:53:08PM 1 point [-]

I suppose it's not the most concise post I've ever written. Thanks for the feedback!

Comment author: rstarkov 24 March 2011 11:18:08PM 2 points [-]

So from the negative votes I'm guessing that this is not something you guys find appropriate in "discussion"? It would help me as a newcomer if you also suggested what makes it bad :)

The "supernatural" category

8 rstarkov 24 March 2011 08:52PM

The term "supernatural" is frequently used in discussions related to skepticism. I am trying to establish the category that people refer to with this term.

All uses of this term appear to imply a separation of concepts and events into two disjoint categories: "natural" and "supernatural". Some examples of things typically classified into "supernatural": God, ghosts, telepathy, telekinesis, aura. Things typically classified as "natural": animals, rocks, talking, earthquake, body temperature.

Following the advice given in Similarity Clusters, I will try to establish some verbal hints as to what causes a concept to be classified into either similarity cluster.


One idea I had is the following: anything we expect to be able to experience, if the necessary prerequisites are met, is "natural"; anything we expect to fail to experience even if we try hard is "supernatural". This seems to work quite well on the concepts mentioned above. This works for unlikely events too: a plane crash is not "supernatural" because if I'm at the right place and the right time then I expect to be able to experience it.

It's still a bit weak for exceedingly unlikely events. For example, proton decay has never been witnessed, and we don't know if it can even occur. But "proton decay" is not classified as "supernatural"; rather, it is treated as a hypothesis. Telepathy, however, might for all we know be as rare as proton decay (thus being exceedingly hard to confirm experimentally), and yet it is classified as "supernatural". Something is missing from this verbal hint.

But what?


Approaching this from a different perspective, it appears that one can characterize "supernatural" things as being "outside of the universe". On further thought, however, this isn't helpful at all: the latter is not so much a verbal hint as another label in itself.

If taken literally, one might argue that all supernatural things therefore don't exist. They are said to be outside the universe, but we can only experience things within the universe, because anything we can experience must be part of the universe, and thus "inside" it. This is quite useless, however, in my opinion: as used by actual people, the category "supernatural" isn't intended to preclude existence. So this doesn't work.


Could it be that the category "supernatural" is actually completely useless, by offering so little information about the things that belong to it that knowing that something is classified as "supernatural" doesn't tell us very much at all?

Thinking about this led me to the idea that perhaps "supernatural" simply means "something that science has shown false or doesn't accept as a valid theory". That is certainly a property I infer about P when told that P belongs to "supernatural".

This is still quite unsatisfactory. It can't be the only property. People explain away God's undetectability by calling him "supernatural", intending it as a convincing argument - but even people who argue this way wouldn't claim that "not a valid theory" counts in favour of God. They must mean something else.

But what?

Comment author: Roland2 12 March 2008 06:28:41AM 9 points [-]

GBM:

Q: What is the probability of a pseudo-random number generator generating a specific number as its next output?

A: 1 or 0 because you can actually calculate the next number if you have the available information.

Q: What probability do you assign to a specific number being its next output if you don't have the information to calculate it?

Replace pseudo-random number generator with dice and repeat.
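The point about deterministic generators can be demonstrated directly. A small sketch in Python (the six-sided-die framing is just to carry the dice analogy):

```python
import random

# A pseudo-random generator is deterministic: given its internal state
# (here, the seed), the next output has probability 1 or 0.
gen = random.Random(42)
next_roll = gen.randrange(1, 7)   # one "roll" of a six-sided die

# An observer who knows the seed can replay the computation exactly
# and predict the roll with certainty...
assert random.Random(42).randrange(1, 7) == next_roll

# ...while an observer without the seed can do no better than assigning
# probability 1/6 to each face - exactly the situation with a real die.
subjective_p = 1 / 6
```

The same output thus gets probability 1 (or 0) from one observer and 1/6 from another; the probability lives in the observer's information, not in the generator.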

Comment author: rstarkov 24 March 2011 03:36:12PM *  5 points [-]

Even more important, I think, is the realization that, to decide how much you're willing to bet on a specific outcome, all of the following are essentially the same:

  • you do have the information to calculate it but haven't calculated it yet
  • you don't have the information to calculate it, but know how to obtain it
  • you don't have the information to calculate it, and no way to obtain it

The bottom line is that you don't know what the next value will be, and that's the only thing that matters.
