Comment author: nazgulnarsil3 20 January 2009 10:59:57PM 1 point

rw: Methods of short-circuiting the sex drive fall into two categories. The first is controlling sensory input (holodecks/virtual reality and/or cyborgs). The second is bypassing the senses and directly altering the brain itself via implants or genetic manipulation.

The second type is more prone to unintended consequences than the first.

Comment author: nazgulnarsil3 20 January 2009 09:44:33AM 5 points

Our drive to do better than our neighbor is a deeply ingrained metric of how we judge ourselves. In essence, we recognize that our own assessment is biased and look for cues from others. Eliminating this seems like eliminating part of the foundation of a social species.

I think you're being remarkably binary about this. I think it's more realistic that non-sentient sexdroids will enable healthier relationships. When people get the urge to procreate with fitter partners, they can just spend an afternoon in the holodeck. I see what you're saying as advocating keeping people a little hungry so that they appreciate food more.

Comment author: nazgulnarsil3 20 January 2009 09:19:34AM 1 point

I thought a big part of the appeal of the supervillain fantasy wasn't your absolute standard of living but your comparative standard of living. It's boring if everyone has a volcano lair. People want a doomsday weapon so that they are feared and respected.

Comment author: nazgulnarsil3 11 January 2009 03:06:13AM 1 point

an investment earning 2% annual interest for 12,000 years adds up to a googol (10^100) times as much wealth.

No, it adds up to a googol of economic units. In all likelihood, the actual *wealth* that the investment represents will stay roughly the same, or grow and shrink within fairly small margins.
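
A quick back-of-the-envelope check of the quoted figure (a minimal sketch in Python; the flat 2% annual rate and 12,000-year horizon are taken directly from the quote):

```python
import math

# Nominal growth factor of 1 unit compounded at a flat 2% per year for 12,000 years.
rate = 0.02
years = 12_000

orders_of_magnitude = years * math.log10(1 + rate)
print(f"nominal growth factor ~ 10^{orders_of_magnitude:.0f}")
# ~ 10^103 -- on the order of a googol, as the quote says.
# But that is a statement about units of account, not about real wealth.
```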

It seems you conclude with an either/or on subjective-experience improvement and brain tinkering. I think it's more likely that we will improve our subjective experience up to a certain point of feasibility and then start with the brain tinkering. Some will clock out by wireheading themselves, but most won't. Some will be more disposed towards brain tinkering; some will plug themselves into experience machines instead. The average person will do a little of both, trying various brain modifications the way we try drugs today. Will this be dangerous? Well, the first people to try a new drug are taking a big risk, but the guinea pigs are a small minority. And people will use experience machines, but most won't surrender to them, just as most don't die playing World of Warcraft today.

Comment author: nazgulnarsil3 07 January 2009 06:41:24AM 0 points

So would you be for or against an AI that inserted us into an experience machine programmed to provide a life of maximum self-expression without our knowledge?

In response to A New Day
Comment author: nazgulnarsil3 31 December 2008 09:18:15PM -1 points

The value of this is most easily demonstrated in daydream scenarios. I'm guessing that other people, like me, find themselves going through some of the same fantasies time and time again, whether they be about wealth, sex, prestige, or whatever else. A few days ago I banished all these familiar fantasies and spent some time thinking up new ones. Not only was it a wonderfully fun exercise, it seemed to increase my creativity in other activities throughout the day.

Comment author: nazgulnarsil3 28 December 2008 10:01:10PM 2 points

The difference between reality and this hypothetical scenario is where control resides. I take no issue with the decentralized future roulette we are playing when we have this or that kid with this or that person; all my study of economics and natural selection indicates that such decentralized methods are self-correcting. In this scenario, we approach the point where the future cone could have this or that bit snuffed out by the decision of a singleton (or a functional equivalent). Advocating that *this* sort of thing be slowed down so that we can weigh the decisions carefully seems prudent. Isn't that the main thrust of the Friendly AI debate?

Comment author: nazgulnarsil3 28 December 2008 09:38:24PM 0 points

what effect would it have on the point

If rewinding is morally unacceptable (erasing could-have-been sentients) and you have unlimited power to direct the future, does this mean that all the could-have-beens from futures you didn't select are on your shoulders? This is directly related to another recent post. If I choose a future with fewer sentients who have a higher standard of living, am I responsible for the sentients that would have existed in a future where I chose to let a larger number of them be created? If you're a utilitarian, this is *the* delicate point: at what point are two sentients with a certain happiness level worth one sentient with a higher happiness level? Does a starving man steal bread to feed his family? This turns into: should we legitimize stealing from the baker to feed as many poor as we can?

Comment author: nazgulnarsil3 28 December 2008 08:18:39PM 1 point

Actually it sounds pretty unlikely to me, considering the laws of thermodynamics as far as I know them.

You can make entropy decrease in one area as long as a compensating amount of entropy is generated somewhere else in the system. What do you think a refrigerator is? What if the extra entropy that has to be generated in order to rewind is shunted off to some distant corner of the universe that doesn't affect the area you're worried about? I'm not talking about literally making time run in reverse. You can achieve what is functionally the same thing by reversing all the atomic reactions within a volume and shunting the entropy generated by the energy you used to do so off to some other region.
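
The bookkeeping behind the refrigerator analogy is just the second law applied to the whole system rather than to the rewound region alone (a sketch in standard notation, with \(\Delta S\) denoting entropy change):

\[
\Delta S_{\text{total}} \;=\; \Delta S_{\text{rewound region}} + \Delta S_{\text{elsewhere}} \;\ge\; 0,
\]

so a local decrease \(\Delta S_{\text{rewound region}} < 0\) is permitted provided \(\Delta S_{\text{elsewhere}} \ge \lvert \Delta S_{\text{rewound region}} \rvert\).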

Comment author: nazgulnarsil3 28 December 2008 06:33:45PM 1 point

I think it's worth noting that truly unlimited power means being able to undo anything. But is it wrong to rewind when things go south? If you rewind far enough, you'll be erasing lives and conjuring up new, different ones. Is rewinding back to before an AI explodes into a zillion copies morally equivalent to destroying them in this direction of time? Unlimited power is unlimited ability to direct the future. Are the lives on every path you don't choose "on your shoulders," so to speak?
