Negative karma, bring it on... A lot of these items are based upon that society, and we can't assume it's the same as ours. You have to put your personal morals aside.
THE PEOPLE
Why the people are horrified of the pedophile : fear for their children's safety : were molested themselves : moral objections : him merely having thoughts of it. Why it's an abnormality : who's to say it's not a normality : what if it was the majority trait until everyone had it removed?
This makes me fear it's an overly controlled society giving up its personal rights.
PEDOPHILE
Will he act? Will he not act? Is he physically capable of acting? He may not have his male parts. He could be a vegetable.
It comes down to: is it controllable on his own? Is there a breaking point that triggers monitoring?
Can someone else monitor him? What are the boundaries of monitoring? Does the society self-govern, with citizens as law enforcers? Citizen's arrest. Are there other technologies that could be used? Stun gun : GPS tracker
ENVIRONMENT THAT WOULD ALLOW SITUATION
Is it the child's fault for putting himself in a position to be molested? Is it the parents' fault for not watching their kid? Is the child a child perpetrator who molested the pedophile? Can it be the pedophile's fault if he can't control it? Is it society's fault for not understanding better?
I like to add uncommon possibilities.

There's a recent science fiction story that I can't recall the name of, in which the narrator is traveling somewhere via plane, and the security check includes a brain scan for deviance. The narrator is a pedophile. Everyone who sees the results of the scan is horrified--not that he's a pedophile, but that his particular brain abnormality is easily fixed, so that means he's chosen to remain a pedophile. He's closely monitored, so he'll never be able to act on those desires, but he keeps them anyway, because that's part of who he is.
What would you do in his place?
In the language of good old-fashioned AI, his pedophilia is a goal or a terminal value. "Fixing" him means changing or erasing that value. People here sometimes say that a rational agent should never change its terminal values. (If one goal is unobtainable, the agent will simply not pursue that goal.) Why, then, can we imagine the man being tempted to do so? Would it be a failure of rationality?
If the answer is that one terminal value can rationally set a goal to change another terminal value, then either