
DanArmak comments on The impact of whole brain emulation - Less Wrong Discussion

3 Post author: jkaufman 14 May 2013 07:59PM




Comment author: DanArmak 20 May 2013 06:04:23PM -1 points [-]

Nothing short of a very powerful singleton could stop competing, intelligent, computation-based agents from using all available computation resources. If the most efficient way to use them is to parallelize many small instances, then that's what they'll do. How do you stop people from running whatever code they please?

Comment author: Yosarian2 21 May 2013 09:45:43PM 0 points [-]

Nothing short of a very powerful singleton could stop competing, intelligent, computation-based agents from using all available computation resources.

I don't see any reason why a society of competing, intelligent, computation-based agents wouldn't be able to prevent any single computation-based agent from doing something they want to make illegal. You don't need a singleton; a society of laws probably works just fine.

And, in fact, you would probably have to have laws and things like that, unless you want other people hacking into your mind.

Comment author: DanArmak 22 May 2013 09:49:35AM 0 points [-]

For society to be sure of what code you're running, it needs to enforce transparency that ultimately extends down to the physical hardware level. Even if there are laws, to enforce them I need to know you haven't secretly built custom hardware that gives you an illegal advantage while falsely reporting that it's running something else, something legal. In the limit of a nanotechnology-based AGI scenario, this means verifying the actual configuration of atoms in all matter everyone controls.

A singleton isn't required, but it seems like the only stable solution.

Comment author: Yosarian2 23 May 2013 03:38:24AM 0 points [-]

Well, you don't have to assume that 100% of all violations of laws will be caught to get a stable society. Just that enough of them are caught to deter most potential criminals.

It depends on a lot of variables, of course, most of which we don't know yet. But, hypothetically speaking, if the society of EMs we're talking about is running on the same network (or the same mega-computer, or whatever), then it should be pretty obvious if someone suddenly makes a dozen illegal copies of themselves and starts using far more network resources than they were a short time ago.
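The detection scheme being proposed here could amount to simple anomaly detection on resource-usage logs. A minimal sketch of that idea, assuming hypothetical per-agent usage histories and an illustrative spike threshold (none of this is from the thread itself):

```python
# Hypothetical sketch of the network-level monitoring described above:
# flag any agent whose latest resource usage jumps far above its baseline,
# e.g. because it spun up a dozen illegal copies of itself.
# The data shapes and the spike_factor threshold are illustrative assumptions.

def flag_usage_spikes(usage_history, spike_factor=3.0):
    """Return agent IDs whose latest usage sample exceeds spike_factor
    times their average over all earlier samples."""
    flagged = []
    for agent_id, samples in usage_history.items():
        if len(samples) < 2:
            continue  # not enough history to establish a baseline
        baseline = sum(samples[:-1]) / len(samples[:-1])
        if baseline > 0 and samples[-1] > spike_factor * baseline:
            flagged.append(agent_id)
    return flagged

usage = {
    "em_alice": [10, 11, 9, 10],    # steady usage
    "em_bob":   [10, 10, 10, 130],  # sudden 13x spike
}
print(flag_usage_spikes(usage))  # → ['em_bob']
```

Note this only sees the *amount* of usage, not what the cycles are spent on, which is exactly the gap DanArmak's reply below points at.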

Comment author: DanArmak 23 May 2013 07:19:06AM 0 points [-]

Well, you don't have to assume that 100% of all violations of laws will be caught to get a stable society. Just that enough of them are caught to deter most potential criminals.

That trades off against the benefit a criminal who isn't caught gains from the crime. The benefit here could be enormous.

it should be pretty obvious if someone suddenly makes a dozen illegal copies of themselves and starts using far more network resources than they were a short time ago.

I was assuming that creating illegal copies lets you use the same resources more intelligently, and profit more from them. Also, if your only measurable is the amount of resource use and not the exact kind of use (because you don't have radical transparency), then people could acquire resources first and convert them to illegal use later.

Network resource use is externally visible, but the exact code you're running internally isn't. You can purchase resources legitimately first and repurpose them illegally later, etc.