I'm curious, now, as to what nation or state you live in.
I live in Illinois. I am curious as to why you are curious.
Would you kill others (who were themselves also going to die if you did nothing) and allow yourself to die, if it would save people you cared about?
Probably. For instance, I would try to defend my wife/child from imminent physical harm even if it put me in a lot of danger. If that meant trying to kill someone then I would do that but in that case it would be justifiable and I probably wouldn't go to prison if I survived.
What would your judgment of the rightfulness of carrying out the action yourself, in the absence of democratic systems, be?
I feel like we are doomed to talk about different things. I think you are talking about "morally right," which I don't usually think about unless I am trying to convince someone to do something against their own interest. I observe that large democratic governments deliberately kill people all the time without consequence. I also observe that individuals have more trouble doing so. Consequently, I think that individuals trying to kill people is a bad idea. So it's not right in the same sense that exercising a 60-delta call three months from expiration is not right.
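To unpack the options analogy: exercising a call before expiration pays only its intrinsic value and forfeits its remaining time value, so it is "not right" in the expected-value sense rather than the moral sense. A minimal sketch, using the standard Black-Scholes formula with purely illustrative parameters (spot, strike, rate, and volatility are my assumptions, not from the conversation):

```python
# Illustrative comparison: exercising a call now vs. holding/selling it.
# All numbers below are assumed for the example, not from the discussion.
from math import log, sqrt, exp, erf

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bs_call(spot, strike, rate, vol, t):
    """Black-Scholes price of a European call with t years to expiration."""
    d1 = (log(spot / strike) + (rate + 0.5 * vol**2) * t) / (vol * sqrt(t))
    d2 = d1 - vol * sqrt(t)
    return spot * norm_cdf(d1) - strike * exp(-rate * t) * norm_cdf(d2)

# Roughly a 60-delta call: spot slightly above strike, 3 months (0.25 yr) out.
spot, strike, rate, vol, t = 102.0, 100.0, 0.02, 0.25, 0.25

intrinsic = max(spot - strike, 0.0)           # payoff from exercising now
value = bs_call(spot, strike, rate, vol, t)   # value of holding (or selling)

print(f"exercise now: {intrinsic:.2f}  hold/sell: {value:.2f}")
# The held option is worth several dollars more than the exercise payoff:
# early exercise throws away time value, which is the sense of "not right."
```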
Why allow your opinions to be swayed by the emotional responses of others?
My opinions are unaffected but my actions might be. If I am telling jokes and everyone is staring at me stone faced I'm likely to stop.
"Probably. For instance, I would try to defend my wife/child from imminent physical harm even if it put me in a lot of danger."
How many people you didn't know would you equate to being "of equal concern" to you as one person you do know when deciding whether or not it's worth it to risk your own life to save them? Please express this as a ratio -- unknowns:knowns -- and then, if you like, knowns:loveds.
Here's a poser that occurred to us over the summer, and one that we couldn't really come up with any satisfactory solution to. The people who work at the Singularity Institute have a high estimate of the probability that an Unfriendly AI will destroy the world. People who work for http://nuclearrisk.org/ have a very high estimate of the probability that a nuclear war will destroy the world (by their estimates, if you are American and under 40, then nuclear war is the single most likely way in which you might die next year).
It seems like there are good reasons to take these numbers seriously, because Eliezer is probably the world expert on AI risk, and Hellman is probably the world expert on nuclear risk. However, there's a problem: Eliezer is an expert on AI risk because he believes that AI risk is a bigger risk than nuclear war. Similarly, Hellman chose to study nuclear risks and not AI risk because he had a higher-than-average estimate of the threat of nuclear war.
It seems like it might be a good idea to know what the probability of each of these risks is. Is there a sensible way to correct for the fact that the people studying these risks are those who had high estimates of them in the first place?
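The selection effect being described can be made concrete with a toy Monte Carlo: suppose everyone forms a noisy estimate of a risk, but only those whose estimate exceeds some alarm threshold devote a career to studying it. All of the numbers below (the true risk, the noise level, the threshold) are illustrative assumptions:

```python
# Toy simulation of the self-selection effect: experts' average estimate
# overstates the truth even when every individual is honest and unbiased.
# All parameters are assumed for illustration.
import random

random.seed(0)
TRUE_LOG_ODDS = -4.0   # the (unknown) true risk, on a log-odds scale
NOISE = 1.5            # spread of individual estimates around the truth
THRESHOLD = -2.0       # only people at least this alarmed study the field

population = [random.gauss(TRUE_LOG_ODDS, NOISE) for _ in range(100_000)]
experts = [e for e in population if e > THRESHOLD]

mean_all = sum(population) / len(population)
mean_experts = sum(experts) / len(experts)

print(f"population mean estimate:  {mean_all:.2f}")
print(f"self-selected expert mean: {mean_experts:.2f}")
# The experts' average sits well above the truth purely because of the
# selection filter. Correcting for it means modelling the selection itself
# (e.g. with a truncated-normal likelihood), not just averaging the
# estimates of the people who chose to work on the risk.
```

This does not answer which correction is right for AI versus nuclear risk, but it shows why naively deferring to the field's own experts double-counts their reason for entering the field.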