Comments

@ frelkins:

Eliezer's brand of humanism seems to consist in endorsing many of the values of traditional humanism while ditching the metaphysics. Jaron seemed to think the metaphysical stuff - specifically, psychological dualism of some sort - is indispensable. I'm not sure who should have proprietary rights over the word, but that argument is surely more about brand ownership than anything deep. And surely there's little enough to recommend dualism in itself.

Jaron's epistemic caution also struck me as being slightly odd. It's one thing to beware delusion, accept the likelihood of substantial error and press on as best you can, revising beliefs where necessary. But Jaron seemed to be moving into more sceptical territory, saying in effect that if it seems P, that isn't grounds for believing P - because of "epistemology".

Can you unpack the stuff about consciousness, free will and indeterminism a bit? Consciousness is the bit that's usually taken as evidence for dualism. Why talk about the other things? Free will is a busted flush anyway, isn't it? Never mind the intricacies of the physics; we automatically accept responsibility for bodily movements which have been artificially induced. I'm sure Jaron knows a great deal more about all this than I do, but from an interested outsider's perspective it isn't at all clear how these notions are meant to hang together.

Lanier struck me as a sort of latter-day Rorty: broadly a pragmatist; suspicious of the rigidity of linguistic meaning; unwilling to try to refute big visions but rather inclined to imply that he finds them silly and that perhaps any decently civilized person should too.

The trouble with this outlook is that if your sense of what's silly is itself miscalibrated, there's not much anyone can do to help you. Moreover, if meaning really is too slippery and amorphous to make debating big visions worthwhile, presumably the bright thing to do would be to avoid those debates altogether, as opposed to turning up and chuckling through them.

I wonder what Robin made of the discussion, perceived silliness being one of his hot buttons and all.

For what it's worth, I still find the mock-mysticism stuff fairly entertaining. Hard to think of another running joke that would stay good over the same duration.

Lest we forget: http://lesswrong.com/static/imported/2008/03/27/elimonk2darker.jpg

@ Caroline: the effect on overall human fitness is neither here nor there, surely. The revolutionary power cycle would be adaptive because of its effect on the reproductive success of those who play the game versus those who don't. That is, the adaptation would only have to benefit specific lineages, not the whole species. Or have I missed your point?

Eliezer: presumably there's an amount of money sufficient to induce (for example) you to bash out a three-act movie script about AI. So if demand is predicted to cover your fee plus the rest of the movie budget, Hollywood has the ability.

No, I suppose you're right, insofar as there's no fixed initial quantity to be divided. But both involve an equal apportioning of something: money to workers in the one case, and money to man-hours in the other. The parable doesn't undermine the notion that equality is essential to all concepts of fairness, even where different versions license different outcomes.

The workers in the vineyard presumably expected that a different sort of equality was in effect - for instance, equal freedom to work at an equal rate.

When I say "equal division of something", the something isn't necessarily the pie itself.

While it seems intuitively pretty clear that fairness involves an equal division of something - be it pie, meta-pie or whatever - there seems to be an embarrassment of plausible candidates for the quantity to be divided. Which is fairer: an equal distribution of goods, of opportunities or of utility? If I read him right, Eliezer would recommend deciding this question by first doling out an equal distribution of votes. But that just palms the dilemma off onto the voters.

Hardly the most profound addendum, I know, but dummy numbers can be useful for illustrative purposes - for instance, to show how steeply probabilities decline as claims are conjoined.
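To make the illustration concrete - a minimal sketch, using an arbitrary dummy probability of 0.8 per claim and assuming the claims are independent:

```python
def conjunction_prob(p_each: float, n_claims: int) -> float:
    """Probability that n independent claims, each with
    probability p_each, are all true."""
    return p_each ** n_claims

# Even with each claim at 0.8, the conjunction decays fast:
for n in range(1, 11):
    print(f"{n} claims: {conjunction_prob(0.8, n):.3f}")
```

Ten such claims conjoined already come in at around 0.107 - which is the whole point of the exercise: the dummy numbers don't matter, the shape of the decline does.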

Perhaps I'm being dim, but a prior is a probability distribution, isn't it? Whereas Occam's Razor and induction aren't: they're rules for how to estimate prior probability. Or have I lost you somewhere?
