Grognor,
Thanks for your reply. You are right that you are consistent, since you did admit in your second scenario that you would let the sickos have their fun.
I would like to continue the discussion of why my problem is wrong in a friendly and respectful way, but the negative score is really threatening my ability to post, which is quite unfortunate.
Torture vs. Dust Specks attempts to illustrate scope insensitivity in ethical thought by contrasting a large unitary disutility against a fantastically huge number of small disutilities.
Your Ten Very Committed Rapists example (still not happy about that choice of subject, by the way) throws out scope issues almost entirely. Ten subjects vs. one subject is an almost infinitely more tractable ratio than 3^^^3 vs. one, and that allows us to argue for one option or another by discounting one of the options for any number of reasons.
I do sincerely apologize...
Richard
I sincerely appreciate your reply. Why do we accept Omega in Eliezer's thought experiment and not mine? In the original, some people claim to obviously pick torture, yet they are unwilling to pick rape. Why? Well, like you said, you refuse to believe that rapists suffer. That is fair. But if that is fair, then Bob might refuse to believe that people with specks in their eyes suffer as well...
You cannot assign rules for one and not the other.
All you're saying is "suppose were actually good"? Well, suppose away. So what?
Not true. ...
Then you are not consistent. In one example you are willing to allow suffering because 50 years of torture is less than the 3^^^3 dust holocaust. You claim that suffering is suffering, yet only 10 deprived rapists already have you changing your thoughts.
I do not have an answer. If anything I would consider myself a weak dust specker. The only thing that I claim is that I am not arrogant; I am consistent in my stance. I do not know the answer but am willing to explore the dilemma of torture vs. specks, and rape vs. deprived rapists. Torture is rape is i...
Unfortunately it looks like the lines between them have gotten a little blurry.
I will consider this claim if you can show me how it is really different.
I have taken considerable care to construct a problem in which we are indeed dealing with trading suffering for potentially more suffering. It does not affect me one bit that the topic has now switched from specks to rape. In fact, if "detraction" happens, shouldn't it be the burden of the person who feels detracted to explain it? I merely ask for consistency.
In my mind I choose t...
If you really understood how much torture 3^^^3 dust specks produces...
You make a valid point, and I will not deny that it is a strong one. All I ask in return is that you remain consistent with your reasoning. I have reposted a thought experiment; please tell me what your answer is:
Omega has given you the choice to allow or disallow 10 rapists to rape someone. Why 10 rapists? Omega knows the absolute utility across all humans, and unfortunately, as terrible as it sounds, the suffering/torture of 10 rapists not being able to rape is mor...
That is the crux of the problem. Bob understands what 3^^^3 is just as well as you claim you understand it. Yet he chooses the "Dust Holocaust".
First, let me assume that you, peter_hurford, are a "Torturer", or rather that you are from the camp that obviously chooses 50 years. I have no doubt in my mind that you bring extremely rational and valid points to this discussion. You are poking holes in Bob's reasoning at its weakest points. This is a good thing.
I whole-heartedly concede that you have compelling points by poking holes into...
Are you familiar with prospect theory?
No, but I will surely read up on that now.
You seem to be describing what you (an imperfectly rational agent) would choose, simply using "PVG" to label the stuff that makes you choose what you actually choose, and you end up taking probability into consideration in a way similar to prospect theory.
Absolutely. In fact, I can see how a theist will simply say, "It is my PVG to believe in God, therefore it is rational for me to do so."
I do not have a response to that. I will need to learn more before I can work this out in my head. Thank you for the insightful comments.
Having read Influence, The Prince, and The 48 Laws of Power, I found Cialdini's book the most satisfying to read because it was filled with empirical research. The latter two books were no doubt excellent reads, however anecdotal. Also, Influence is presented in the least "dark arts" way of the three. The book is about learning to stay ahead of influence just as much as it is about influencing.
Thank you for your response. I believe I understand you correctly; I made a response to Manfred's comment in which I reference your response as well. Do you believe I interpreted you correctly?
An agent that has an empathetic utility function will edit its own code if and only if doing so maximizes expected utility under that same empathetic utility function. Do I get your drift?
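To make sure I'm reading that condition right, here is a minimal sketch of it in Python; the names (Agent, expected_utility, candidate_code) are just hypothetical placeholders for illustration, not anyone's actual proposal:

```python
# Rough sketch only: the agent evaluates a proposed self-modification with
# its *current* utility function, and adopts it only if that same function
# expects the new code to do better.

class Agent:
    def __init__(self, code, utility_fn):
        self.code = code              # decision-making code the agent currently runs
        self.utility_fn = utility_fn  # the empathetic utility function (held fixed)

    def consider_self_modification(self, candidate_code, expected_utility):
        """Adopt candidate_code only if the current utility function expects
        it to outperform the code the agent is running now."""
        if expected_utility(self.utility_fn, candidate_code) > \
                expected_utility(self.utility_fn, self.code):
            self.code = candidate_code
```

The point of the sketch is just that the evaluation uses the same empathetic utility function before and after the edit, so the agent has no incentive to rewrite that function away.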
If Bob cares about cute puppies, then Bob will use his monstrous intelligence to bend the energy of the universe towards cute puppies. And love and flowers and sunrises and babies and cake.
I follow you. It does resolve my question of whether or not rationality + power necessarily leads to terrible outcomes. I had asked whether a perfect rationalist, given enough time and resources, would become perfectly selfish. I now understand the answer to be no.
Matt_Simpson gave a similar answer:
...Suppose a rational agent has the ab
Where would one go to read more about modafinil?
I have read Wikipedia and Erowid.
If you were to assign a percentage to how much all-around "better" you feel when you are on it, what would it be? For example, 10% better than off? 20%? 30%?
I'm a 28-year-old male in the SF area, previously from NYC.
This site is intimidating, and I think there are many more just like me who are too intimidated to introduce themselves because they might not feel they are as articulate or smart as some of the people on this forum. There are some posts so well written that I couldn't write them in 100 years. There is so much information that it seems overwhelming. I want to stop lurking and invite others to join too. I'm not a scientist and I didn't study AI in college; I just want to meet good people and so do yo...
I agree with this statement 100%. That was the point of my TvCR thought experiment: people who obviously picked T should again pick T. Only one commenter actually conceded this point.