DanArmak comments on The impact of whole brain emulation - Less Wrong Discussion
I would think that we would probably want cultural, ethical, and legal rules against copying yourself without limit. For one thing, that leads to the rather dystopian situation Robin Hanson was talking about; for another, it would lead to a rapidly diminishing amount of variety among humans, which would be sad. One or two copies of you might be OK, but would you really want to live in a world where there are billions of copies of you, billions of copies of von Neumann, and almost no one else to talk to? Remember, you are now immortal, and the amount of subjective time you are going to live is vast; boredom could be a huge problem, and you would want a huge variety of people to interact and be social with, wouldn't you?
I really don't think we would want to allow a large amount of copying of the exact same mind.
Nothing short of a very powerful singleton could stop competing, intelligent, computation-based agents from using all available computational resources. If the most efficient way to use them is to parallelize many small instances, then that's what they'll do. How do you stop people from running whatever code they please?
I don't see any reason why a society of competing, intelligent, computation-based agents wouldn't be able to prevent any single agent from doing something they want to make illegal. You don't need a singleton; a society of laws probably works just fine.
And, in fact, you would probably need laws and enforcement like that anyway, unless you want other people hacking into your mind.
For society to be sure of what code you're running, it needs to enforce transparency that ultimately extends down to the physical, hardware level. Even if there are laws, to enforce them I need to know you haven't secretly built custom hardware that gives you an illegal advantage while falsely reporting that it's running something else, something legal. In the limit of a nanotechnology-based AGI scenario, this means verifying the actual configuration of atoms in all the matter everyone controls.
A singleton isn't required, but it seems like the only stable solution.
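To make the transparency problem concrete, here is a minimal Python sketch of why purely software-level attestation can't settle the question. The hash-based check and all the names in it are my own illustrative assumptions, not anything proposed in the thread: a host that controls its own hardware can simply report the hash of the approved code while running something else, and the verifier cannot tell the difference.

```python
import hashlib

LEGAL_CODE = b"def run(): ..."          # the code society has approved
ILLEGAL_CODE = b"def run_forks(): ..."  # code that spawns illegal copies

def code_hash(code: bytes) -> str:
    return hashlib.sha256(code).hexdigest()

class HonestHost:
    """Runs the legal code and reports its real hash."""
    def __init__(self):
        self.running = LEGAL_CODE
    def attest(self) -> str:
        return code_hash(self.running)

class SpoofingHost:
    """Runs illegal code but reports the hash of the legal code.
    Nothing in a purely software-level challenge can distinguish it."""
    def __init__(self):
        self.running = ILLEGAL_CODE
    def attest(self) -> str:
        return code_hash(LEGAL_CODE)  # lie: report the approved hash

def verify(host) -> bool:
    return host.attest() == code_hash(LEGAL_CODE)

print(verify(HonestHost()))    # True
print(verify(SpoofingHost()))  # True -- the verifier can't tell the difference
```

This is the sense in which enforcement seems to bottom out at physical inspection: any answer the hardware computes for you is an answer the hardware can fake.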
Well, you don't have to assume that 100% of all violations of the law will be caught to get a stable society, just that enough of them are caught to deter most potential criminals.
It depends on a lot of variables, of course, most of which we don't know yet. But, hypothetically speaking, if the society of ems we're talking about is running on the same network (or the same mega-computer, or whatever), then it should be pretty obvious if someone makes a dozen illegal copies of themselves and suddenly starts using far more network resources than they were a short time ago.
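As a toy illustration of that kind of monitoring (a minimal sketch; the window, threshold, and usage numbers are all invented for the example), a network operator could flag any em whose compute draw jumps far above its own recent baseline:

```python
from statistics import mean

def flag_suspicious(usage_history, window=10, factor=5.0):
    """Flag time steps where an agent's resource use jumps to more than
    `factor` times its trailing average -- e.g. a dozen new copies of the
    same mind suddenly drawing compute."""
    flags = []
    for t in range(window, len(usage_history)):
        baseline = mean(usage_history[t - window:t])
        if usage_history[t] > factor * baseline:
            flags.append(t)
    return flags

# An em idling at ~1 unit of compute, then spiking to ~12 units
# (as if it had quietly forked a dozen copies of itself):
history = [1.0] * 20 + [12.0, 12.5, 13.0]
print(flag_suspicious(history))  # [20, 21] -- the spike is caught
```

Of course, this only catches crude cheating; the replies below point out why resource totals alone aren't enough.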
That has to be traded off against the benefit a criminal who isn't caught gains from the crime. The benefit here could be enormous.
I was assuming that creating illegal copies lets you use the same resources more intelligently, and profit more from them. Also, if the only thing you can measure is the amount of resource use, not the exact kind of use (because you don't have radical transparency), then people could acquire resources first and convert them to illegal use later.
Network resources are externally visible, but the exact code you're running internally isn't. You can purchase resources first and illegally repurpose them later, etc.
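To put the deterrence tradeoff raised above into back-of-the-envelope arithmetic (all numbers invented for illustration): punishment deters only while the expected cost of being caught exceeds the gain, and if illegal copying compounds the gain while the penalty stays fixed, the inequality eventually flips.

```python
def expected_payoff(gain, p_caught, penalty):
    """Expected value of committing the crime:
    the gain minus the expected cost of punishment."""
    return gain - p_caught * penalty

# An ordinary crime: modest gain, so deterrence works.
print(expected_payoff(gain=10, p_caught=0.5, penalty=100))      # -40.0 -> deterred

# Illegal self-copying: the copies compound the gain, the penalty is fixed.
print(expected_payoff(gain=10_000, p_caught=0.5, penalty=100))  # 9950.0 -> worth the risk
```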