But suppose that approach was not available to you - what methods would you implement to distinguish between pornography and eroticism, and ban one but not the other? Sufficiently clear that a scriptwriter would know exactly what they need to cut or add to a movie in order to move it from one category to the other? What if the nude "Pirates of Penzance" was at a Pussycat Theatre and "Fuck Slaves of the Caribbean XIV" was at the Met?
Not saying that I would endorse this as a regulatory policy, but it's my understanding that the strategy used by e.g. the Chinese government is to not give any explicit guidelines. Rather, they ban things which they consider to be out of line and penalize the people who produced/distributed them, but only give a rough reason. The result is that nobody tries to pull tricks like obeying the letter of the regulations while avoiding the spirit of them. Quite the opposite, since nobody knows what exactly is safe, people end up playing it as safe as possible and avoiding anything that the censors might consider a provocation.
Of course this errs on the side of being too restrictive, which is a problem if the eroticism is actually somet...
Short answer: Mu.
Longer answer: "Porn" is clearly underspecified, and to make matters worse there's no single person or interest group that we can try to please with our solution: many different groups (religious traditionalists, radical feminists, /r/nofap...) dislike it for different and often conflicting reasons. This wouldn't be such a problem -- it's probably possible to come up with a definition broad enough to satisfy all parties' appetites for social control, distasteful as such a thing is to me -- except that we're also trying to leave "eroticism" alone. Given that additional constraint, we can't possibly satisfy everyone; the conflicting parties' decision boundaries differ too much.
We could then come up with some kind of quantification scheme -- show questionable media to a sample of the various stakeholders, for example -- and try to satisfy as many people as possible. That's probably the least-bad way of solving the problem as stated, and we can make it as finely grained as we have money for. It's also one that's actually implemented in practice -- the MPAA ratings board works more or less like this. Note however that it still pisses a lot of pe...
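To make that concrete, here is a rough sketch of what such a stakeholder-panel scheme might look like in code; the items, the votes, and the 50% cutoff are all invented for illustration, and a real version would still have to decide who counts as a stakeholder and how to weight them:

```python
# Minimal stakeholder-panel sketch: each rater votes "ban" or "allow" on
# each item, and an item is banned only if the share of "ban" votes clears
# a tunable threshold. Items, votes, and threshold are placeholders.

def panel_decision(votes, ban_threshold=0.5):
    """votes: list of booleans, True = this rater would ban the item."""
    ban_share = sum(votes) / len(votes)
    return "ban" if ban_share >= ban_threshold else "allow"

sample_items = {
    "nude Pirates of Penzance at the Met": [False, False, True, False],
    "Fuck Slaves of the Caribbean XIV": [True, True, True, False],
}

for title, votes in sample_items.items():
    print(title, "->", panel_decision(votes))
```

Moving the threshold up or down just shifts who gets annoyed, which is the point: the parties' decision boundaries differ, so some fraction of raters loses under any cutoff.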
distinguish between pornography and eroticism
Aren't you assuming these two are on opposite sides of a "reality joint"?
I tend to treat these words as more or less synonyms, in that they refer to the same thing but express different attitudes on the part of the speaker.
To construct a friendly AI, you need to be able to make vague concepts crystal clear, cutting reality at the joints when those joints are obscure and fractal - and then implement a system that makes that cut.
I don't think that this is true. Reductionist solutions to philosophical problems typically pick some new concepts which can be crisply defined, and then rephrase the problem in terms of those, throwing out the old fuzzy concepts in the process. What they don't do is to take the fuzzy concepts and try to rework them.
For example, nowhere in the ...
To construct a friendly AI, you need to be able to make vague concepts crystal clear, cutting reality at the joints when those joints are obscure and fractal - and then implement a system that makes that cut.
Strongly disagree. The whole point of Bayesian reasoning is that it allows us to deal with uncertainty. And one huge source of uncertainty is that we don't have precise understandings of the concepts we use. When we first learn a new concept, we have a ton of uncertainty about its location in thingspace. As we collect more data (either thro...
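As a toy illustration of how that uncertainty shrinks, here is a sketch in which the "concept" is nothing more than an unknown threshold on a single made-up feature, with a posterior over where that threshold sits; the feature, the grid, the noise rate, and the examples are all invented:

```python
# Toy Bayesian concept learning: the "concept" is an unknown threshold t on
# one feature (items with feature >= t count as porn). We keep a posterior
# over t on a grid and sharpen it with each labelled example.
import numpy as np

grid = np.linspace(0.0, 1.0, 201)           # candidate thresholds
posterior = np.ones_like(grid) / len(grid)  # start maximally uncertain

def update(posterior, x, is_porn, eps=0.05):
    # Likelihood: each label is correct with probability 1 - eps
    # under the candidate threshold.
    predicted = x >= grid                    # feature above threshold => "porn"
    like = np.where(predicted == is_porn, 1.0 - eps, eps)
    posterior = posterior * like
    return posterior / posterior.sum()

for x, label in [(0.9, True), (0.2, False), (0.7, True), (0.4, False)]:
    posterior = update(posterior, x, label)

print("posterior mean threshold:", (grid * posterior).sum())
```

With more examples the posterior narrows, which is the sense in which we "locate" the concept in thingspace without ever writing down a crisp definition.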
This is a sorites problem and you want to sort some pebbles into kinds. You've made clear that externalism about porn may be true (something may begin or stop being porn in virtue of properties outside its own inherent content, such as where it is and contextual features).
It seems to me that you have to prioritize your goals in this case. So the goal "ban porn" is much more important than the goal "leave eroticism alone". My response would be to play it safe and ban all footage that includes genitals, similar to what the Japanese already do...
The distinction between eroticism and pornography is that it's porn if a typical viewer wanks to it. Like the question of whether something is art, the property is not intrinsic to the thing itself.
That this question was so easy very slightly decreases my difficulty estimate for Friendliness.
I'll cite the comment section on this post to friends whenever I need to say: "And this, my friends, is why you don't let rationalists discuss porn" http://imgs.xkcd.com/comics/shopping_teams.png
In terms of AI, this is equivalent to "value loading": refining the AI's values through interactions with human decision makers, who answer questions about edge cases and examples and serve as "learned judges" for the AI's concepts. But suppose that approach was not available to you
But it is, and the contrary approach of teaching humans to recognize things doesn't have an obvious relation to FAI, unless we think that the details of teaching human brains by instruction and example are relevant to how you'd set up a similar training ...
An alternative to trying to distinguish between porn and erotica on the basis of content or user attitudes: teach the AI to detect infrastructures of privacy and subterfuge, and to detect when people are willing to publicly patronize and self-identify with something. Most people don't want others to know that they enjoy porn. You could tell your boss about the nude Pirates you saw last weekend, but probably not the porn. Nude Pirates shows up on the Facebook page, but not so much the porn. An online video with naked people that has half a million views, but is discussed nowhere where one's identity is transparent, is probably porn. It's basic to porn that it's enjoyed privately, erotica publicly.
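If one wanted to operationalize that, one crude version might look like the following; the signals (view counts, identity-linked public mentions) and the 100:1 cutoff are invented placeholders, and actually measuring "discussed where one's identity is transparent" is the hard part:

```python
# Crude "infrastructure of privacy" heuristic: compare how much an item is
# consumed with how often it is discussed under real names. The counts and
# the ratio cutoff are placeholders, not calibrated values.

def looks_like_porn(view_count, public_mentions, ratio_cutoff=100):
    """High consumption with almost no identity-linked discussion -> porn."""
    return view_count / max(public_mentions, 1) > ratio_cutoff

print(looks_like_porn(view_count=500_000, public_mentions=12))     # likely porn
print(looks_like_porn(view_count=500_000, public_mentions=8_000))  # likely not
```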
I would guess that eroticism is supposed to inspire creativity while pornography supposedly replaces it. So if the piece in question were to be presented to people while their brain activity is being monitored, I would expect to see an increase of activity throughout the brain for eroticism, while I'd expect a decrease or concentration of activity for pornography. Although I have no idea if that is actually the case.
Without reference to sexual stimulation this would include a lot of things that are not currently thought of as pornography, but that might actually be intentional depending on the reason why someone would want to ban pornography.
I think there is no fundamental difference between porn and erotica, it's just that one is low status and the other is high status (and what's perceived as high status depends greatly on the general social milieu, so it's hard to give any kind of stand-alone definition to delineate the two). It only seems like there are two "clusters in thingspace" because people tend to optimize their erotic productions to either maximize arousal or maximize status, without much in between (unless there is censorship involved, in which case you might get shows that are optimized to just barely pass censorship). Unfortunately I don't think this answer helps much with building FAI.
A couple of thoughts here:
Set a high minimum price for anything arousing (say $1000 a ticket). If it survives in the market at that price, it is erotica; if it doesn't, it was porn. This also works for $1000 paintings and sculptures (erotica) compared to $1 magazines (porn).
Ban anything that is highly arousing for males but not generally liked by females. Variants on this: require an all-female board of censors; or invite established couples to view items together, and then question them separately (if they both liked it, it's erotica). Train the AI on examples until it can classify independently of the board or couples.
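For the training step at the end, something as simple as an off-the-shelf classifier fit to the board's or couples' verdicts would do as a first pass; the features, values, and labels below are hypothetical placeholders, not a claim about which features matter:

```python
# Sketch of "train on the panel's verdicts": fit a standard classifier to
# whatever per-item features we can extract, using the couples' joint
# verdicts as labels. Features, values, and labels are made up.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical per-item features: [nudity_minutes, plot_score, ticket_price]
X = np.array([
    [40.0, 1.0,    5.0],
    [ 5.0, 8.0, 1000.0],
    [30.0, 2.0,   10.0],
    [10.0, 7.0,  500.0],
])
y = np.array([1, 0, 1, 0])  # 1 = "porn" per the panel, 0 = "erotica"

clf = LogisticRegression().fit(X, y)
print(clf.predict([[20.0, 5.0, 50.0]]))  # verdict on a new, unseen item
```

Whether the learned boundary generalizes past the panel's examples is, of course, exactly the hard part of the original question.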
Just criminalize porn, and leave it to the jury to decide whether or not it's porn. That's how we handle most moral ambiguities, isn't it?
I will assume that the majority of the population shares my definition of porn and is on board with this, creating low risk of an activist jury (otherwise this turns into the harder problem of "how to seize power from the people".)
Edit: On more careful reading, I guess that's not allowed since it would fall in the "I know it when I see it" category. But then, since we obviously are not going to write ...
To construct a friendly AI, you need to be able to make vague concepts crystal clear, cutting reality at the joints when those joints are obscure and fractal - and then implement a system that makes that cut.
I don't think that's what the solution to FAI will look like. I think the solution to FAI will look like "Look, this is a human (or maybe an uploaded human brain), it is an agent, it has a utility function. You should be maximizing that."
After refining my thoughts, I think I see the problem:
1: The Banner AI must ban all transmissions of naughty Material X.
1a: Presumably, the Banner must also ban all transmissions of encrypted naughty Material X.
2: The people the Banner AI is trying to ban from sending naughty transmissions have an entire field of thought (knowledge of human values) that the AI is not allowed to take into account: it is secret.
3: Presumably, the Banner AI has to allow some transmissions. It can't just shut down all communications.
Edit: 4: The Banner AI needs a perfect success ...
what methods would you implement to distinguish between pornography and eroticism, and ban one but not the other
There's a heuristic I use to distinguish between the two that works fairly well: in erotica, the participants are the focus of the scene. In pornography, the camera (and by implication the viewer) is the true focus of the scene.
That being said, I have a suspicion that trying to define the difference explicitly is a wrong question. People seem to use a form of fuzzy logic[1] when thinking about the two. What we're really looking at is gradatio...
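One way to make the fuzzy-logic picture concrete is a membership function that returns a degree of "porn-ness" rather than a hard label; the two inputs, their weights, and the logistic shape below are arbitrary choices for illustration, not a claim about the right features:

```python
# Fuzzy-membership sketch: map an item to a degree of "porn-ness" in (0, 1)
# instead of a binary label. Inputs, weights, and the logistic squash are
# arbitrary illustrative choices.
import math

def porn_membership(explicitness, viewer_focus):
    """Both inputs in [0, 1]; viewer_focus ~ how much the camera/viewer,
    rather than the participants, is the focus of the scene."""
    score = 3.0 * explicitness + 2.0 * viewer_focus - 2.5
    return 1.0 / (1.0 + math.exp(-score))

print(porn_membership(explicitness=0.9, viewer_focus=0.8))  # high membership
print(porn_membership(explicitness=0.3, viewer_focus=0.2))  # low membership
```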
Well, there's Umberto Eco's famous essay on the subject. (The essay is not long so read the whole thing.)
One notable thing about his criterion is that it makes no reference to nudity, so it's a horrendous predictor on the set of all possible movies; it just happens to work well on the subset of possible movies a human would actually want to watch.
Suppose you're put in charge of some government and/or legal system, and you need to ban pornography, and see that the ban is implemented. Pornography is the problem, not eroticism. So a lonely lower-class guy wanking off to "Fuck Slaves of the Caribbean XIV" in a Pussycat Theatre is completely off. But a middle-class couple experiencing a delicious frisson when they see a nude version of "Pirates of Penzance" at the Met is perfectly fine - commendable, even.
I have no idea what distinction you're trying to draw here. And I say this ...
To construct a friendly AI, you need to be able to make vague concepts crystal clear, cutting reality at the joints when those joints are obscure and fractal - and then implement a system that makes that cut.
There are lots of suggestions on how to do this, and a lot of work in the area. But having been over the same turf again and again, it's possible we've got a bit stuck in a rut. So to generate new suggestions, I'm proposing that we look at a vaguely analogous but distinctly different question: how would you ban porn?
Suppose you're put in charge of some government and/or legal system, and you need to ban pornography, and see that the ban is implemented. Pornography is the problem, not eroticism. So a lonely lower-class guy wanking off to "Fuck Slaves of the Caribbean XIV" in a Pussycat Theatre is completely off. But a middle-class couple experiencing a delicious frisson when they see a nude version of "Pirates of Penzance" at the Met is perfectly fine - commendable, even.
The distinction between the two cases is certainly not easy to spell out, and many are reduced to saying the equivalent of "I know it when I see it" when defining pornography. In terms of AI, this is equivalent to "value loading": refining the AI's values through interactions with human decision makers, who answer questions about edge cases and examples and serve as "learned judges" for the AI's concepts. But suppose that approach was not available to you - what methods would you implement to distinguish between pornography and eroticism, and ban one but not the other? Sufficiently clear that a scriptwriter would know exactly what they need to cut or add to a movie in order to move it from one category to the other? What if the nude "Pirates of Penzance" was at a Pussycat Theatre and "Fuck Slaves of the Caribbean XIV" was at the Met?
To get maximal creativity, it's best to ignore the ultimate aim of the exercise (to find inspirations for methods that could be adapted to AI) and just focus on the problem itself. Is it even possible to get a reasonable solution to this question - a question much simpler than designing a FAI?