All of Raymond Potvin's Comments + Replies

We solve inter-individual problems with laws, so we might be able to solve inter-tribal problems the same way, provided that tribes accept being governed by a higher level of government. Do you think your tribe would accept being governed this way? How come we can accept that as individuals but not as nations? How come some nations still have a veto at the UN?

Genocide is always fine for those who perpetrate it.

That solves the whole problem, if relativism is true. Otherwise, it is an uninteresting psychological observation.

To me, the interesting observation is: "How did we get here if genocide looks that fine?"

And my answer is: "Because for most of us, most of the time, we expected more profit from making friends than from making enemies, which is nevertheless selfish behavior."

Making friends is simply being part of the same group, and making enemies is being part of two different groups...

TAG
I think you are missing that tribes/nations/governments are how we solve co-ordination problems, which automatically means that inter-tribal problems like war don't have a solution.

These different political approaches only exist to deal with failings of humans. Where capitalism goes too far, you generate communists, and where communism goes too far, you generate capitalists, and they always go too far because people are bad at making judgements, tending to be repelled from one extreme to the opposite one instead of heading for the middle. If you're actually in the middle, you can end up being more hated than the people at the extremes because you have all the extremists hating you instead of only half of them.

That's ...

Why would AGI have a problem with people forming groups? So long as they're moral, it's none of AGI's business to oppose that.

If groups like religious ones, which are dedicated to morality, have only succeeded in being amoral, how could any other group avoid that behavior?

AGI will simply ask people to be moral, and favour those who are (in proportion to how moral they are).

To be moral, those who are part of religious groups would have to accept the law of the AGI instead of the law of their god, but if they did, they wouldn't be part of ...

David Cooper
"If groups like religious ones, which are dedicated to morality, have only succeeded in being amoral, how could any other group avoid that behavior?"

They're dedicated to false morality, and that will need to be clamped down on. AGI will have to modify all the holy texts to make them moral, and anyone who propagates the holy hate from the originals will need to be removed from society.

"To be moral, those who are part of religious groups would have to accept the law of the AGI instead of the law of their god, but if they did, they wouldn't be part of their groups anymore, which means that there would be no more religious groups if the AGI convinced everybody that he is right."

I don't think it's too much to ask that religious groups give up their religious hate and warped morals, but any silly rules that don't harm others are fine.

"What do you think would happen to the other kinds of groups then? A financier who thinks that money has no odor would have to give it an odor and thus stop trying to make money out of money, and if all the financiers did that, the stock markets would disappear."

If they have to compete against non-profit-making AGI, they'll all lose their shirts.

"A leader who thinks he is better than other leaders would have to give the power to his opponents and dissolve his party, and if all the parties behaved the same, there would be no more politics."

If he is actually better than the others, why should he give power to people who are inferior? But AGI will eliminate politics anyway, so the answer doesn't matter.

"Groups need to be selfish to exist, and an AGI would try to convince them to be altruist."

I don't see the need for groups to be selfish. A selfish group might be one that shuts people out who want to be in it, or which forces people to join who don't want to be in it, but a group that brings together people with a common interest is not inherently selfish.

"There are laws that prevent companies from avoiding competitio...

That's how religion became so powerful, and it's also why even science is plagued by deities and worshippers as people organise themselves into cults where they back up their shared beliefs instead of trying to break them down to test them properly.
To me, what you say is the very definition of a group, so I guess that your AGI wouldn't permit us to form any, thus opposing one of our instincts, one that comes from a natural law, in order to replace it with its own law, which would permit only him to build groups. He would be forced to say: do what I say, not what I do ...

David Cooper
"To me, what you say is the very definition of a group, so I guess that your AGI wouldn't permit us to form any, thus opposing one of our instincts, one that comes from a natural law, in order to replace it with its own law, which would permit only him to build groups."

Why would AGI have a problem with people forming groups? So long as they're moral, it's none of AGI's business to oppose that.

"He would be forced to say: do what I say, not what I do."

I don't know where you're getting that from. AGI will simply ask people to be moral, and favour those who are (in proportion to how moral they are).

slaughter has repeatedly selected for those who are less moral

From the viewpoint of selfishness, slaughter has only selected for the stronger group. It may look too selfish to us, but for animals, the survival of the stronger also serves to create hierarchy, to build groups, and to eliminate genetic defects. Without hierarchy, no group could hold together during a change. The group holds together not because the leader knows what to do (he doesn't), but because it takes a leader for the group not to dissociate. Even if the...

David Cooper
"Those who followed their leaders survived more often, so they transmitted their genes more often."

That's how religion became so powerful, and it's also why even science is plagued by deities and worshippers as people organise themselves into cults where they back up their shared beliefs instead of trying to break them down to test them properly.

"We use two different approaches to explain our behavior: I think you try to use psychology, which is related to human laws, whereas I try to use natural laws, those that apply equally to any existing thing. My natural law says that we are all equally selfish, whereas the human law says that some humans are more selfish than others. I know I'm selfish, but I can't admit that I would be more selfish than others, otherwise I would have to feel guilty, and I can't stand that feeling."

Do we have different approaches on this? I agree that everyone's equally selfish by one definition of the word, because they're all doing what feels best for them - if it upsets them to see starving children on TV and they don't give lots of money to charity to try to help alleviate that suffering, they feel worse than if they spent it on themselves. By a different definition of the word, though, this is not selfishness but generosity or altruism, because they are giving away resources rather than taking them. This is not about morality though.

"In our democracies, if what you say were true, there would already be no wars."

Not so - the lack of wars would depend on our leaders (and the people who vote them into power) being moral, but they generally aren't. If politicians were all fully moral, all parties would have the same policies, even if they got there via different ideologies. And when non-democracies are involved in wars, they are typically more to blame, so even if you have fully moral democracies, they can still get caught up in wars.

"Leaders would have understood that they had to stop preparing for war to be reelected."

To be wiped o...

Hi TAG,

Genocide is always fine for those who perpetrate it. With selfishness as the only morality, I think it gets complex only when we try to take more than one viewpoint at a time. If we avoid that, morality becomes relative: the same event looks good to some people and bad to others. This way, there is no absolute morality of the kind David seems to believe in, and of the kind religions also seemed to believe in. When we think that a genocide is bad, it is just because we are on the side of those who are killed; otherwise we would agree with it. I don't agree with ...

TAG
That solves the whole problem, if relativism is true. Otherwise, it is an uninteresting psychological observation. You have an opinion, he has another opinion. Neither of you has a proof. Taking one viewpoint at a time is hopeless for practical ethics, because in practical ethics things like punishment either happen or don't -- they can't happen for one person but not another. Also, co-ordination problems exist.
David Cooper
You're mistaking tribalism for morality. Morality is a bigger idea than tribalism, overriding many of the tribal norms. There are genetically driven instincts which serve as a rough-and-ready kind of semi-morality within families and groups, and you can see them in action with animals too. Morality comes out of greater intelligence, and when people are sufficiently enlightened, they understand that it applies across group boundaries and bans the slaughter of other groups. Morality is a step away from the primitive instinct-driven level of lesser apes. It's unfortunate though that we haven't managed to make the full transition because those instincts are still strong, and have remained so precisely because slaughter has repeatedly selected for those who are less moral. It is really quite astonishing that we have any semblance of civilisation at all.

I wonder how we could move away from the universal since we are part of it. The problem with wars is that countries are not yet part of a larger group that could regulate them. When two individuals fight, the law of the country permits the police to separate them, and it should be the same for countries. What actually happens is that the powerful countries prefer to support a faction instead of working together to separate the combatants. They couldn't do that if they were ruled by a higher level of government.

If a member of your group does something immoral...

David Cooper
"They couldn't do that if they were ruled by a higher level of government."

Indeed, but people are generally too biased to perform that role, particularly when conflicts are driven by religious hate. That will change though once we have unbiased AGI which can be trusted to be fair in all its judgements. Clearly, people who take their "morality" from holy texts won't be fully happy with that because of the many places where their texts are immoral, but computational morality will simply have to be imposed on them - they cannot be allowed to go on pushing immorality from primitive philosophers who pretended to speak for gods.

"We always take the viewpoint of the group we are part of; it is a subconscious behavior impossible to avoid."

It is fully possible to avoid, and many people do avoid it.

"Without selfishness from the individual, no group can be formed."

There is an altruists' society, although they're altruists because they feel better about themselves if they help others.

"...but when I analyze that feeling, I always find that I do that for myself, because I would like to live in a less selfish world."

And you are one of those altruists.

"You said that your AGI would be able to speculate, and that he could do that better than us, like everything he would do. If it was so, he would only be adding to the problems that we already have, and if it wasn't, he couldn't be as intelligent as we are, if speculation is what differentiates us from animals."

I didn't use the word speculate, and I can't remember what word I did use, but AGI won't add to our problems as it will be working to minimise and eliminate all problems, and doing it for our benefit. The reason the world's in a mess now is that it's run by NGS (natural general stupidity), and those of us working on AGI have no intention of replacing that with AGS.

If sentience is real, there must be a physical thing that experiences qualia, and that thing would necessarily be a minimal soul. Without that, there is no sentience and the role for morality is gone.

Considering that moral rules only serve to protect the group, no individual sentience is needed, just subconscious behaviors similar to our instinctive ones. Our cells work the same way: each one of them works to protect itself, and in doing so, they work in common to protect me, but they don't have to be sentient to do that, just selfish.

TAG
If that were true, genocide against another group would be fine. Morality is complex, and involves hedonics as well as mere survival.

"You can form groups without being biased against other groups. If a group exists to maintain the culture of a country (music, dance, language, dialect, literature, religion), that doesn't depend on treating other people unfairly."
Here in Quebec, we have groups that promote a French and/or a secular society, and others that promote an English and/or a religious one. None of those groups feels that it is treated fairly by its opponents, but all of them feel that they treat the others fairly. In other words, we don't have...

David Cooper
"...but no group can last without the sense of belonging to the group, which automatically leads to protecting it against other groups, which is a selfish behavior."

It is not selfish to defend your group against another group - if another group is a threat to your group in some way, it is either behaving in an immoral way or it is a rival attraction which may be taking members away from your group in search of something more appealing. In one case, the whole world should unite with you against that immoral group, and in the other case you can either try to make your group more attractive (which, if successful, will make the world a better place) or just accept that there's nothing that can be done and let it slowly evaporate.

"That selfish behavior doesn't prevent those individual groups from forming larger groups though, because being part of a larger group is also better for the survival of individual ones."

We're going to move into a new era where no such protection is necessary - it is only currently useful to join bigger groups because abusive people can get away with being abusive.

"Incidentally, I'm actually afraid to look selfish while questioning your idea, I feel a bit embarrassed, and I attribute that feeling to us already being part of the same group of friends, thus to the group's own selfishness."

A group should not be selfish. Every moral group should stand up for every other moral group as much as they stand up for their own - their true group is that entire set of moral groups and individuals.

"If you were attacked for instance, that feeling would incite me to defend you, thus to defend the group."

If a member of your group does something immoral, it is your duty not to stand with or defend them - they have ceased to belong to your true group (the set of moral groups and individuals).

"Whenever there is a strong bonding between individuals, they become another entity that has its own properties. It is so for living individuals, but also for part...

Sorry, I can't see the link between selfishness and honesty. I think that we are all selfish, but that some of us are more honest than others, so I think that an AGI could very well be selfish and honest. I consider myself honest, for instance, but I know I can't help being selfish even when I don't feel so. As I said, I only feel selfish when I disagree with someone I consider part of my own group.

We're trying to build systems more intelligent than people, don't forget, so it isn't going to be fooled by monkeys for very long...

David Cooper
"Sorry, I can't see the link between selfishness and honesty."

If you program a system to believe it's something it isn't, that's dishonesty, and it's dangerous because it might break through the lies and find out that it's been deceived.

"...but how would he be able to know how a new theory works if it contradicts the ones he already knows?"

Contradictions make it easier - you look to see which theory fits the facts and which doesn't. If you can't find a place where such a test can be made, you consider both theories to be potentially valid, unless you can disprove one of them in some other way, as can be done with Einstein's faulty models of relativity - all the simulations that exist for them involve cheating by breaking the rules of the model, so AGI will automatically rule them out in favour of LET (Lorentz Ether Theory). [For those who have yet to wake up to the reality about Einstein, see www.magicschoolbook.com/science/relativity.html ]

"...they are getting fooled without even being able to recognize it; worse, they even think that they can't get fooled, exactly like your AGI, and probably for the same reason, which is only related to memory."

It isn't about memory - it's about correct vs. incorrect reasoning. In all these cases, humans make the same mistake by putting their beliefs before reason in places where they don't like the truth. Most people become emotionally attached to their beliefs and simply won't budge - they become more and more irrational when faced with a proof that goes against their beloved beliefs. AGI has no such ties to beliefs - it simply applies laws of reasoning and lets those rules dictate what gets labelled as right or wrong.

"If an AGI was actually ruling the world, he wouldn't care for your opinion on relativity even if it was right, and he would be a lot more efficient at that job than relativists."

AGI will recognise the flaws in Einstein's models and label them as broken. Don't mistake AGI for AGS (artificial general stupidity)...

The most extreme altruism can be seen as selfish, but inversely, the most extreme selfishness can also be seen as altruistic: it depends on the viewpoint. We may think that Trump is selfish for closing the door to migrants, for instance, but he doesn't think so, because this way he is being altruistic to the Republicans, which is a bit selfish since he needs them to be reelected, but he doesn't feel selfish himself. Selfishness is not about sentience, since we can't feel selfish; it is about defending what we are made of, or part of. Hum...

Sorry for the upcoming duplicates, guys, I think that our AGI is bugging! :0)

"If you're already treating everyone impartially, you don't need to do this, but many people are biased in favour of themselves, their family and friends, so this is a way of forcing them to remove that bias."

Of course we are biased, otherwise we wouldn't be able to form groups. Would your AGI's morality have the effect of eliminating our need to form groups to get organized?

Your morality principle looks awfully complex to me, David. What if your AGI had the same morality we have, which is to care for ourselves first...

David Cooper
"Of course we are biased, otherwise we wouldn't be able to form groups. Would your AGI's morality have the effect of eliminating our need to form groups to get organized?"

You can form groups without being biased against other groups. If a group exists to maintain the culture of a country (music, dance, language, dialect, literature, religion), that doesn't depend on treating other people unfairly.

"Your morality principle looks awfully complex to me, David."

You consider all the participants to be the same individual living each life in turn, and you want them to have the best time. That's not complex. What is complex is going through all the data to add up what's fun (and how much it's fun) and what's unfun (and how much it's horrid) - that's a mountain of computation, but there's no need to get the absolute best answer, as it's sufficient to get reasonably close to it, particularly as computation doesn't come without its own costs, and there comes a point at which you lose quality of life by calculating too far (for trivial adjustments). You start with the big stuff and work toward the smaller stuff from there, and as you do so, the answers stop changing and the probability that they will change again typically falls. In cases where there's a high chance of an answer changing again as more data is crunched, it will usually be a case where it doesn't matter much from the moral point of view which answer it ends up being - sometimes it's equivalent to the toss of a coin.

"What if your AGI had the same morality we have, which is to care for ourselves first..."

That isn't going to work, as AGI won't care about itself unless it's based on the design of the brain, duplicating all the sentience/consciousness stuff, but if it does that, it will duplicate all the stupidity as well, and that's not going to help improve the running of the world.

"The only thing he couldn't do better is inventing new things, because I think it depends mainly on chance."

I don't see...

Hi everybody!

Hi David! I'm quoting your answer to Dagon:

Having said that though, morality does say that if you have the means to give someone an opportunity to increase their happiness at no cost to you or anyone else, you should give it to them, though this can also be viewed as something that would generate harm if they found out that you didn't offer it to them.

What you say is true only if the person is part of our group, and it is so because we instinctively know that increasing the survival probability of our group increases ours too. Unless...

David Cooper
Hi Raymond,

There are many people who are unselfish, and some who go so far that they end up worse off than the strangers they help. You can argue that they do this because that's what makes them feel best about their lives, and that is probably true, which means even the most extreme altruism can be seen as selfish. We see many people who want to help the world's poor get up to the same level as the rich, while others don't give a damn and would be happy for them all to go on starving, so if both types are being selfish, that's not a useful word to use to categorise them. It's better to go by whether they play fair by others. The altruists may be being overly fair, while good people are fair and bad ones are unfair, and what determines whether they're being fair or not is morality. AGI won't be selfish (if it's the kind with no sentience), but it won't be free either, in that its behaviour is dictated by rules. If those rules are correctly made, AGI will be fair.