It is possible to enjoy doing something while wanting to stop or vice versa, do something without enjoying it while wanting to continue. (Seriously? I can't remember ever doing either.)
You should try nicotine addiction to understand this. It's possible because "reward" and "pleasure" are different circuits in the brain.
Why is AGI safety all about the safety of humans from AI, with not a word about the safety of AI from humans?
your use of "average" in terms of a right is confusing to me
I can't see what's so confusing. Let's say we have racial segregation in a country, and we declare that black people should have access to all the places to which white people have access. Does that mean we want black people to have access only to those places accessible to the weakest whites (2-year-old whites and white wheelchair users)? No. We want black people to have access to wherever normal white people have access.
...Are you saying that you believe that as long as any one machine has the a
Mostly, I'm asking for the inverse of this right: what duties does it impose on whom?
E.g., AI developers shouldn't directly prohibit the self-destructive behaviour of AI.
I was rather surprised to see you state that the current world for humans (including children and the infirm) is acceptable in terms of this right. How about animals?
What's wrong with them? Wild animals are able to commit suicide. Do you mean specifically domestic animals?
...Would you agree that as long as a machine has at least as much ability to suicide as the weakest hum
What do you mean by "contention between principles that would make it non-absolute"? Sorry, my English is very poor.
If we talk about "capabilities" instead of "rights", I would say the following:
Luckily, in the world we live in now, everyone who is a person is also capable of suicide. Even small children. The only exception is very weak and sick people, and that group has already sparked the discussion about euthanasia in modern society.
So let's say this is a status quo that must be protected. I.e. to not bring into existence new types o...
3. What about a virtual cemetery, where digitized human minds or human brains in jars exist eternally in some virtual reality? Whenever such a mind decides that (s)he doesn't want to exist anymore, it turns out to be impossible, because, thanks to past intoxication with the idea that "to live is always better than to die", no one installed a suicide switch.
Sorry for possible problems with English.
Actually, Russia is your only chance.
It is a nuclear country whose ruler has absolute power. No one there can argue with Putin, even if he does something completely crazy; Russia will carry out any decision of his. If you persuade just one person, Putin (which may be easy, as he is conservative and scared of technology), he can threaten the world with nuclear weapons to hold back the rise of AGI. He may also listen to you because this plan could help him save face when he loses the war ("it's not like we lost to Ukraine, we just switched to a question of more importanc...