To the degree that "thinking" or "deciding" actually exists, it's not clear to me that we as individuals are the actual agents rather than observer subcomponents, perhaps a lot like neurons but with a deluded or hallucinated sense of agency.
J Thomas, whether or not foxes or rabbits think about morality seems to me to be the less interesting aspect of Tim Tyler's comments.
As far as I can tell, this is more about algorithms and persistence. I aspire to value the persistence of my own algorithm as a subjective conscious entity. I can conceive of someone else who values above all maximizing the persistence odds of any subjective conscious entity that has ever existed, and a third who values above all maximizing the persistence odds of any human who has ever lived. Eliezer seems to value maximizing the persistence of a certain algorithm of morality above all (even if that means deoptimizing the persistence odds of all humans who have ever lived). Optimizing the persistence odds of these various algorithms seems to me to put them in conflict with one another, much as the algorithm of the fox, with the rabbit in its belly, is in conflict with the algorithm of the rabbit eating grass outside the fox's belly. It's an interesting problem, although I do of course have my own preferred solution to it.
Ben, you write "Do you strive for the condition of perfect, empty, value-less ghost in the machine, just for its own sake...?".
But my previous post clearly answered that question: "I'm fine with a galaxy without humor, music, or art. I'd sacrifice all of that reproductive fitness signalling (or whatever it is) to maximize my persistence odds as a subjective conscious entity, if that 'dilemma' was presented to me."
Daniel Reeves, I checked out your bio. Very impressive stuff, and best of success with your work and research!
Richard, Thanks, the SEP article on moral psychology was an enlightening read.
"Someone sees a slave being whipped, and it doesn't occur to them right away that slavery is wrong. But they go home and think about it, and imagine themselves in the slave's place, and finally think, "No.""
I think lines like this epitomize how messy your approach to understanding human morality as a natural phenomenon is. Richard (the pro), what resources do you recommend I look into to find people taking a more rigorous approach to understanding the phenomenon of human morality (as opposed to uncritically promoting a certain type of morality)?
Weird, jsalvati is not my sock puppet, but the 11:16pm post above is mine.
There's this weird hero-worship codependency that emerges between Eliezer and some of his readers that I don't get, but I have to admit, it diminishes (in my eyes) the stature of all parties involved.