Are human ethics/morals just an evolutionary mess of incomplete and inconsistent heuristics? One idea I heard that made sense is that evolution was optimizing our emotions for long-term 'fairness'. I got a sense of it when watching the monkey fairness experiment.
My issue is with 'friendly AI'. If our ethics are inconsistent, then we won't be choosing a good AI but rather the least bad one. A crap sandwich either way.
The worst part is that we will have to hurry to be the first to AI, or some other culture will select the dominant AI.
Evolution is optimizing us for inclusive genetic fitness. Anything else is just a means to an end.
If it's worth saying, but not worth its own post (even in Discussion), then it goes here.
Previous Open Thread
Next Open Thread
Notes for future OT posters:
1. Please add the 'open_thread' tag.
2. Check if there is already an active Open Thread before posting a new one. (Check immediately before: refresh the list-of-threads page just before posting.)
3. Open Threads should be posted in Discussion, not Main.
4. Open Threads should start on Monday and end on Sunday.