Comments

I've read countless papers on crypto, and they mostly seem pretty formal to me - what are people comparing them to? Is it really worse in other fields? There is some variation - DJB's style is distinctly less formal than most other authors' - but my perception is that papers on, for example, network engineering seem a lot less formal than crypto papers. I think there's plenty of room to improve the readability of crypto papers by encouraging less formality.

One trivial example of signalling here is the way everyone still uses the Computer Modern font. This is a terrible font, and it's trivial to improve the readability of your paper by using, say, Times New Roman instead, but Computer Modern says that you're a serious academic in a formal field.

I'd have more sympathy with Luke (and thus more forgiveness for Lucas) if, instead of the whole X-Wing moving when he tries it, we saw a much less dramatic effect: perhaps drooping aerials standing up, or the flaps lifting gently, or some such.

However, in such films the plausibility of the characters' behaviour is always sacrificed in the interests of better visuals, or better drama; cf. the zillion ludicrous excuses scriptwriters present for characters not telling each other what's going on.

Regret of rationality in games isn't a mysterious phenomenon. Let's suppose that after the one round of PD we're going to play, I have the power to destroy a billion paperclips at the cost of one human life, and Clippy knows that. If Clippy thinks I'm a rational outcome-maximizer, it knows that whatever threats I make, I won't carry them out, because doing so will have no payoff once the round is over. But if it thinks I'm prone to irrational emotional reactions, it may conclude that I will carry out my billion-paperclip threat if it defects, and so it will cooperate.
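To make the credibility point concrete, here's a toy calculation; the billion-paperclip figure comes from the scenario above, but the defection payoff and the probability are numbers I've made up purely for illustration:

```python
# Toy payoff sketch from Clippy's point of view (illustrative numbers only).
DEFECTION_GAIN = 2            # extra paperclips Clippy gets by defecting
THREAT_LOSS = 1_000_000_000   # paperclips I threaten to destroy in retaliation

def clippy_expected_payoff(defects: bool, p_carry_out: float) -> float:
    """Clippy's expected paperclip payoff, given the chance I carry out my threat."""
    if not defects:
        return 0.0
    return DEFECTION_GAIN - p_carry_out * THREAT_LOSS

# If Clippy models me as a pure outcome-maximizer, the threat is never
# carried out (p = 0), so defecting strictly wins:
print(clippy_expected_payoff(True, 0.0))    #  2.0
print(clippy_expected_payoff(False, 0.0))   #  0.0

# If it assigns even a one-in-a-million chance to "irrational" retaliation,
# cooperating wins:
print(clippy_expected_payoff(True, 1e-6))   # -998.0
print(clippy_expected_payoff(False, 1e-6))  #  0.0
```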

If I could prevent only one of these events, I would prevent the lottery.

I'm assuming that this is in a world where there are no payoffs from the LHC; we could imagine a world in which it's decided that switching the LHC on is too risky, but before it is mothballed a group of rogue physicists try to run the riskiest experiment they can think of on it, out of sheer ennui.

In what context is $10 trillion not a huge amount of money? It's approximately the entire US national debt, but even the difference is nearly enough to pay off the entire debt of the third world; it's what the UK government spends in ten years. If I had that kind of wealth, after I'd cleared all third world debts I'd carpet developing nations with infrastructure like roads, and pay for literacy and clean water everywhere, and I'd still have money left over.

Nick Tarleton: sadly, it's my experience that it's futile to try to throw flour over the dragon.

Tomorrow I will address myself to accusations I have encountered that decoherence is "unfalsifiable" or "untestable", as the words "falsifiable" and "testable" have (even simpler) probability-theoretic meanings which would seem to be violated by this usage.

Doesn't this follow trivially from the above? No experiment can determine whether or not we have souls, but that counts against the idea of souls, not against the idea of their absence. If decoherence is the simpler theory, then lack of falsifiability counts against the other guys, not against it.
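To spell out the probability-theoretic point (my own sketch, not from the post): if two hypotheses $H_1$ and $H_2$ make identical predictions for every observable $E$, the evidence cannot shift the odds between them, so only the prior - which penalizes the more complex hypothesis - separates them:

```latex
\frac{P(H_1 \mid E)}{P(H_2 \mid E)}
  = \frac{P(E \mid H_1)}{P(E \mid H_2)} \cdot \frac{P(H_1)}{P(H_2)}
  = 1 \cdot \frac{P(H_1)}{P(H_2)}
```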

Roland: yes, at least one. Where did you give up and why?

This is what I thought at first, but on reflection, it's not quite right.

Could you say a little more about the distinction between the position preceding this remark and the one following it? They seem like different formulations of the same thing to me.

Heterophenomenology!

Sorry, I thought it needed saying.
