All of PlacidPlatypus's Comments + Replies

A mistake is not a thing, in and of itself, it's just the entire space of possible games outside the very narrow subset that lead to victory.

Minor nitpick: surely you mean possible moves, rather than possible games? The set of games that lead to defeat is necessarily symmetrical with the set that leads to victory, aside from the differences between black and white.

Is the Prisoner's Dilemma really the right metaphor here? I don't really get what the defector gains. Sure, I like them better for being so accommodating, but meanwhile they're paying the costs of giving me what I want, and if they try to invoke some kind of quid pro quo, then all the positive feelings go out the window when I find out they were misleading me.

-2Kawoomba
Think of it as having an additional tool in your shed, a really important one: it confers an extra degree of freedom. You can manipulate someone else's state of mind by signalling various faux states of mind of your own (social signals are no longer the 1-to-1 mapping that tell culture mandates; you can choose whatever input leads to the desired reaction). Social signals and the benefits they confer are sufficiently vague that often you won't find out they were misleading you. Or you may find out ("All those years person X worked for me I thought she looked up to and admired me; turns out she was just pretending so she could keep the job!"), but by then the defector has already reaped the (transient) rewards. Nothing is forever; the traitor can milk you like a gullible cow (or a gullicalf, living in California) and then leave, the harm already done.

Silver's model already at least attempts to account for fundamentals and reversion to the mean, though. You could argue that the model still puts too much weight on polls over fundamentals, but I don't see a strong reason to prefer that over the first interpretation of just taking it at face value.

-2Eugine_Nier
Has there been any analysis of how accurate Silver's predictions have been in the past?

Nate Silver's model also moved toward Obama, so it's probably reflecting something real to some extent.

2Alejandro1
But the gains have already been cancelled by Romney's better performance in the first debate. You could spin this in two ways. On one hand, you could argue that the "47%" comment did move the polls, and that ceteris paribus it would have significantly reduced Romney's chances of winning. On the other hand, you could say that ceteris should not be expected to be paribus; polls are expected to shift back and forth and regress to the mean (where "the mean" is dictated by the fundamentals: incumbency, the state of the economy, etc.), and that if the 47% comment and the debate hadn't happened, other similar things would have.

Decius is right that there aren't really spoilers, but I would argue that your time would be better spent reading HP:MOR than the discussion.

Something tells me that the note would be more likely to say something like "DO NOT MESS WITH TIME".

0Decius
Really? Is that what happened just before he got the time-turner? As I recall, he was trying to demonstrate a way to solve any problem in C time, where C is the time required to falsify a proposed solution. That's different than realizing that you have a higher chance of destroying Azkaban in an hour if you help yourself eight times.

At least for myself, I first heard of Eliezer via the HPMOR TV Tropes page. There's a good chance I would have read the sequences sooner or later even if I hadn't (my brother found them independently and recommended them), but it definitely helped.

And I wouldn't say I was an idiot before, but twenty minutes of conversation with myself from a couple years ago might change my mind. And of course it's hard to tell how much of the difference is LW's influence and how much is just a matter of being older and wiser.

I would say that he was making at least the argument that "this level of responsibility is something you should adopt if you want to be a hero", and probably the more general argument that "people should adopt this attitude toward responsibility."

"We need to switch to alternative energies such as wind, solar, and tidal. The poor are lazy ... Animal rights"

I don't think these fit. Regardless of whether you agree with them, they are specific assertions, not general claims about reasoning with consistently anti-epistemological effects.

-7Elias

At what point will you check the Karma value? The end of the year?

0FiftyTwo
Yes, same as all other predictions.

We think we'd be better at running the world because we think rationalists should be better at pretty much everything that benefits from knowing the truth. If we didn't believe that, we wouldn't be (aspiring) rationalists. And just because we couldn't do it perfectly doesn't mean we wouldn't be better than the alternatives.

0lessdazed
I wonder how well a group would do whose members didn't study how to think and instead devoted themselves to not letting emotions interfere with their decisions. All its work would be advances, I think; there would be no analog to the "valley of rationality" in which people lose touch with their intuitions and make poor decisions.
6Vaniver
Overconfidence seems like a poor qualification.

Sorry for the confusion.

It was meant as a joint position of the insane people and myself, but on further consideration I'm abandoning it.

However, I don't think it's that unlikely that e.g. racial differences are fairly minimal if they exist at all, at least in terms of genetic rather than cultural/environmental/whatever differences. To the best of my knowledge, races aren't all that distinct on a genetic level, so I wouldn't call it "overwhelmingly improbable" that they would turn out to be close to indistinguishable in terms of intelligence.

That...

5Barry_Cotter
TTBOMK a grand total of two (2) men whose ancestry is not predominantly West African have ever run 100m in less than ten seconds. If you can come up with some good reasons why selection for g wouldn't have ancestral group differences that strong, I'd be interested to hear them.

I mostly agree with you; I was just stating my impression of the attitudes of those raising the objections in the first place (note the quotation marks). And to be fair to them, it's really more, "believing this would cause other people to act horribly, so let's keep them from believing it."

It could also be a moral value in your utility function, in which case what looks like bias mostly falls under wishful thinking.

We are left deciding what's good and evil because if we don't, who will tell us? And even if someone did, how could we trust them? The nature of morality is such that everyone has to decide for themselves, at least to the extent of deciding whom to listen to. If a god has some higher purpose, they should explain it to us, and if they can't explain it in a way that makes us agree, it's not right.

Just because feedback loops happen doesn't mean they're a good thing even when they happen to animals. We should be exempt under EY's definition of should, and anyone who disagrees is either using a different definition or is just not worth arguing with.

2lessdazed
Ants in a feedback loop.

I think Eliezer is missing the main cause of the uproar in cases like this. The stance of the uproarers is not that "If this was true, it would be horrible, so let's not believe it." It's more like, "believing this is true would cause people to act horribly, so let's not believe it."

Claims of innate racial and sexual differences in intelligence have historically been baseless rationalizations that attempt to justify oppressing the group in question. So now when anyone raises the question, they are shouted down because they are tarred wi...

1lessdazed
That's not just delusional, it's deluded. "believing this is true would cause people to act horribly, so let's not believe that we believe it," would be merely delusional, and hence less objectionable.
wedrifid210

because while individual differences clearly exist, group differences probably don't

Just not true. And obviously not true at that. Was this presented as "one of the crazy beliefs that some insane people have" or as your own position? Hard to keep track in there.

Group differences not existing would be such an overwhelmingly improbable occurrence that it would prompt me to second guess my atheism. The universe isn't fair. Things just don't go around being equal to each other without good reason.

My parents taught me about God the same way that they did about Santa and the Tooth Fairy, and I don't think it did me any harm. I decided for myself that God didn't exist before I figured out that my parents were atheists too, but I don't have any especially strong memories of figuring out any of the three.

The last, long one is basically saying "shut up and multiply." Going with your gut intuitions might make you feel better about your decisions, but it won't really get you a better outcome. In your own life, if you want to pay a premium for that feeling of (unjustified) confidence, that's one thing; but when other people's lives are at stake, you have to be cold about it if you want to do what's really right.

The second and third are about how rationality for its own sake is futile. Rationality is good because it makes you better at what really matters. Your goa...

ata140

so who cares if you're factually wrong?

Anyone may care about anything they want. Particularly if there are no moral truths.

(If there are no moral truths, then what do you care if I care if people are factually wrong? ;) )

Nornagest250

You do, as long as you have subjective wants or needs that require accurate information to be met.