In response to Building Weirdtopia
Comment author: steven 13 January 2009 04:47:36PM 13 points

Few of these weirdtopias seem strangely appealing in the same way that conspiratorial science seems strangely appealing.

In response to Building Weirdtopia
Comment author: steven 13 January 2009 01:12:25AM 3 points

I think the most you can plausibly say is that, for humanlike architectures, memories of suffering (not necessarily true ones) are necessary to appreciate pleasures more complex than heroin. Probably what matters is that there's some degree of empathy with suffering, whether or not that empathy comes from memories. Even in that weakened form, the statement doesn't sound plausible to me.

Anyway, it seems to me that, utopianly speaking, the proper psychological contrast for pleasure is sobriety rather than pain.

In response to Eutopia is Scary
Comment author: steven 12 January 2009 03:07:05PM 5 points

Perhaps a benevolent singleton would cripple all means of transport faster than, say, horses and bicycles, so as to preserve/restore human intuitions and emotions relating to distance (far-away lands and so on)?

In response to Serious Stories
Comment author: steven 09 January 2009 11:24:31AM 0 points

Suppose I'm 50% sure that the asymmetry between suffering and happiness exists just because it's very difficult to make humans happy (so that achieving great happiness is about as important as avoiding great suffering), and 50% sure that the asymmetry comes from something intrinsic to how these things work (so that avoiding great suffering is maybe a hundred times as important). Should I act in the meantime as if avoiding great suffering is slightly over 50 times as important as achieving great happiness, slightly under 2 times as important, or something in between? This, I think, is where you need the sort of moral uncertainty theory that Nick Bostrom has been working on.
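The two candidate answers come from taking expected values in different units. A minimal sketch of the arithmetic (assuming straightforward expected-value aggregation across the two moral hypotheses; the 100x figure is the one from the comment above):

```python
# Two moral hypotheses, each held with credence 0.5:
#   H1: avoiding great suffering and achieving great happiness matter equally (ratio 1)
#   H2: avoiding great suffering matters ~100x as much (ratio 100)
credence = 0.5

# Normalize to happiness = 1 and average suffering's weight across hypotheses:
ev_suffering_weight = credence * 1 + credence * 100        # 50.5

# Normalize to suffering = 1 and average happiness's weight instead:
ev_happiness_weight = credence * 1 + credence * (1 / 100)  # 0.505
implied_ratio = 1 / ev_happiness_weight                    # ~1.98

print(ev_suffering_weight)  # 50.5 -> "slightly over 50 times"
print(implied_ratio)        # ~1.98 -> "slightly under 2 times"
```

The gap between 50.5 and ~1.98 is exactly the intertheoretic-comparison problem: the expected value depends on which hypothesis's units you normalize to before averaging.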

In response to Serious Stories
Comment author: steven 09 January 2009 11:15:17AM 3 points

I suspect climbing Everest is much more about effort and adventure than about actual pain. Also, the vast majority of people don't do that sort of thing, as far as I know.

Comment author: steven 07 January 2009 08:51:51AM 1 point

I think putting it as "eudaimonia vs. simple wireheading" is kind of rhetorical; I agree eudaimonia is better than complex happy mind states that don't correspond to the outside world, but such states are in turn a lot better than simple wireheading.

Comment author: steven 07 January 2009 08:48:28AM 0 points

For alliances to make sense, it seems to me, there have to be conflicts; do you expect future people to get in each other's way a lot? I suppose people could have conflicting preferences about what the whole universe should look like, preferences that couldn't be satisfied in just their own corner, but I'd also guess that this sort of issue would make up only a small fraction of what people cared about.

In response to Growing Up is Hard
Comment author: steven 04 January 2009 07:10:43PM 0 points

Patri, try "Algernon's Law".

In response to Harmful Options
Comment author: steven 25 December 2008 05:53:01AM 2 points

The rickroll example actually applies to all agents, including ideal rationalists. Basically, you're giving the victim an extra option that you know the victim believes is better than it actually is. There's no reason why this would apply only, or especially, to humans.

In response to High Challenge
Comment author: steven 19 December 2008 05:02:24PM 0 points

Oh, massive crosspost.
