
Comment author: ike 20 February 2017 03:49:32AM 0 points

Has anyone rolled the die more than once? If not, it's hard to see how the market could converge on that outcome unless everyone who's betting saw a 3 (even a single person who saw a different result should drive the price down). So it depends on how many people saw rolls, and you should update as if you had seen as many 3s as there are other bettors.

You should bet on six if your probability is still higher than 10%.

If the prediction market has already caused others to update, then it's more complicated. Probably you should assume the price reflects all available information, and therefore that exactly one 3 was seen. Ultimately there's no good answer, because there's Knightian uncertainty in markets.
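
To make that update concrete, here is a minimal sketch in Python. The symmetric Dirichlet prior over the six faces and the number of other bettors are my own illustrative assumptions (they're not in the original post); only the 10% threshold comes from the comment above.

    # Sketch: update as if you had seen one 3 per other bettor, using a
    # uniform Dirichlet prior over the six faces, then check whether a bet
    # on six still clears the 10% threshold.

    def posterior_face_prob(face_counts, face, prior_weight=1.0):
        """Posterior predictive P(next roll == face) under a symmetric Dirichlet prior."""
        total = sum(face_counts.values()) + 6 * prior_weight
        return (face_counts.get(face, 0) + prior_weight) / total

    num_other_bettors = 3            # hypothetical: treat each bettor as one observed 3
    counts = {3: num_other_bettors}

    p_three = posterior_face_prob(counts, 3)
    p_six = posterior_face_prob(counts, 6)

    print(f"P(3) = {p_three:.3f}, P(6) = {p_six:.3f}")
    print("Bet on six?", p_six > 0.10)

With three other bettors this gives P(6) ≈ 0.111, so the bet would still (barely) clear the threshold; with four or more it would not.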

[Link] Attacking machine learning with adversarial examples

2 ike 17 February 2017 12:28AM

[Link] Gates 2017 Annual letter

3 ike 15 February 2017 02:39AM

[Link] Raymond Smullyan has died

3 ike 12 February 2017 02:20PM
Comment author: ike 05 February 2017 08:17:04PM 2 points

But is this because of a fault of the Hollywood system, or is it because there are few significant movie story ideas left that have not been done?

Neither: consumers' revealed preferences favor reboots, so that's what gets made. That's only a "fault" if your preferences differ from those of most consumers.

(Although I've heard someone argue that piracy made independent films less viable: to the extent that consumers would be willing to pay if no pirated option were available, but the absence of those payments causes fewer films to be made, that would be a market-failure argument. I don't really have enough knowledge to judge whether that explanation holds.)

Comment author: ike 24 December 2016 11:35:44PM 1 point

The other tells you that it will be between $50 and $70 million, with an average of $10 million.

Typo?

Comment author: Raemon 09 December 2016 07:20:35PM 3 points

This has always been how I thought about it. (I consider myself approximately an Average Utilitarian, with some caveats; this is more descriptive than normative.)

One person who disagreed with me said: "you are not randomly born into 'a person'; you are randomly born into 'a collection of atoms.' A world with fewer atoms arranged into thinking beings increases the chance that you get zero utility, not whatever the average utility is among whichever patterns have developed consciousness."

I disagree with that person, but just wanted to float it as an alternate way of thinking.

Comment author: ike 09 December 2016 08:02:13PM 1 point

It's the difference between SIA and SSA. If you work with SIA, then you're randomly chosen from all possible beings, and so in world B you're less likely to exist.
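
For concreteness, here's a rough sketch of how the two assumptions weight two candidate worlds that differ only in how many observers they contain; the observer counts are made up for illustration.

    # SIA: weight each world's prior by its observer count, then normalize.
    # SSA: merely existing is not evidence, so (absent further data) the
    # priors are unchanged.

    def sia_posterior(priors, observer_counts):
        weighted = [p * n for p, n in zip(priors, observer_counts)]
        total = sum(weighted)
        return [w / total for w in weighted]

    def ssa_posterior(priors, observer_counts):
        return list(priors)

    priors = [0.5, 0.5]        # world A, world B equally likely a priori
    observers = [1000, 10]     # hypothetical: world B has far fewer observers

    print("SIA:", sia_posterior(priors, observers))  # ~[0.99, 0.01]: credence shifts toward A
    print("SSA:", ssa_posterior(priors, observers))  # stays at [0.5, 0.5]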

[Link] A Few Billionaires Are Turning Medical Philanthropy on Its Head

0 ike 04 December 2016 03:08PM
Comment author: ike 29 November 2016 10:47:16PM 1 point

2 points:

1. I've used something similar when evaluating articles: I ask, "What statement of fact would have to be true to make the article's main, vague conclusion correct?" Then I try to figure out whether that fact is true.

2.

For instance: (wildly asserts) "I bet if everyone wore uniforms there would be a fifty percent reduction in bullying." (pauses, listens to inner doubts) "Actually, scratch that—that doesn't seem true, now that I say it out loud, but there is something in the vein of reducing overt bullying, maybe?"

A problem with doing that is that saying something out loud may "anchor" you to the wrong confidence level. You might be underconfident because you're stating it without data, or you might simply expect yourself not to believe it on a second look.

Comment author: ike 02 November 2016 11:58:22PM 3 points

You're implicitly assuming that the way one candidate spends the money is completely valueless and the way the other spends it is maximally efficient. Also, that 75% figure for influence over the budget is way off; come on.
