Well, Blindsight impressed me enough that I've started The Ego Tunnel. In short, the idea of unconscious intelligence bothered me. My intuition says that consciousness could be what happens when something tries to model its own intelligence and actions, but of course that hardly explains anything. While I feel it's unlikely I'll find many good answers, it is interesting enough to be enjoyable to read.
Alright. I've read the first few pages or so of the first link, "Consciousness and its Place in Nature", and it seems to boil down to "We can think of zombies without our current minds seeing any major issue (a priori!), therefore consciousness isn't physical."
That sums up the current state of knowledge
Which was sort of my question: Do I have a whole lot to gain by reading the current information available? Will I obtain valuable insights on things, or even be rather entertained? Or am I just gonna end up in the same place, but with a deeper respect for how difficult it is to figure things out?
OK, given the strong reaction to my comment I will check it out. I'd love to be in for a big update, but it's so perplexing to me how anyone can take the whole zombie thing seriously without being an outright dualist that it would really be a huge update for me.
Thanks for the reply. Yes, I found out the term is "negative utilitarianism". I suppose I can search and find rebuttals of that concept. I didn't mean that the function was "if suffering > 0 then 0", just that suffering should be a massively dominating term, so that no possible world with real suffering outranks a world with less suffering.
As to your question about my personal preference on life, it really depends on the level of suffering. At the moment, no, things are alright. But it has not always been that way, and it's not hard to see it crossing over again.
I would definitely obliterate everyone on Earth, though, and would view not doing so, if capable, to be immoral. Purely because so many sentient creatures are undergoing a terrible existence, and the fact that you and I are having an alright time doesn't make up for it.
Good points. But I'm thinking that the pain of death is purely because of the loss others feel. So if I could eliminate my entire family and everyone they know (which ends up pulling essentially every person alive into the graph), painlessly and quickly, I'd do it.
The bug of scope insensitivity doesn't apply if everyone gets wiped out nicely, because then the total suffering is 0. So, for instance, grey goo taking over the world in an hour - that'd cause a spike of suffering, but then levels drop to 0, so I think it's alright. Whereas an asteroid that kills 90% of people, that'd leave a huge amount of suffering left for the survivors.
In short, the pain of one child dying is the sum of the pain others feel, not something intrinsic to that child's death. So if you shut up and multiply with everyone dying, you get 0. Right?
If the suffering "rounds down" to 0 for everyone, sure, A is fine. That is, a bit of pain in order to keep Fun. But no hellish levels of suffering for anyone. Otherwise, B. Given how the world currently looks, and MWI, it's hard to see how it's possible to end up with everyone having pain that rounds down to 0.
So given the current world and my current understanding, if someone gave me a button to press that'd eliminate earth in a minute or so, I'd press it without hesitation.
It is if we define a utility function with a strict failure mode for TotalSuffering > 0. Non-existent people don't really count, do they?
That sort of confirms my suspicion - that it's a very active topic. And it's not necessarily easy to break into. I was hoping there was a good pop-sci summary book that laid things out real nicely. Like what The Selfish Gene does for evolution. But I read the book Blindsight, and am now reading Metzinger's The Ego Tunnel, just because it seemed incredibly interesting. So who knows how deep this will go for me :)