Utilitarianism isn't a description of human moral processing; it's a proposal for how to improve it.
That's not necessarily false, but it's a dangerous thing to say to yourself. Mostly when I find myself thinking it, I've just wasted a great deal of time, and I'm trying to convince myself that it wasn't really wasted. It's easy to tell myself, hard to verify, and more pleasant than thinking my time-investment was for nothing.
This is transformative. Thank you.
Either both are true, or neither.
Anyone smart enough to be dangerous is smart enough to be safe? I'm skeptical; folksy wisdom tells me that being smart doesn't protect you from being stupid.
But in general, yes: the threat becomes more and more tangible as the barrier to AI gets lower and the number of players increases. At the moment, it seems pretty intangible, but I haven't actually gone out and counted dangerously smart AI researchers; I might be surprised by how many there are.
To be clear, I was NOT trying to imply that we should actually form the Turing Police right now.
Edited, in the interest of caution.
However, this is exactly the issue I'm trying to discuss. It looks as though, if we take the threat of uncaring AI seriously, this is a real problem and it demands a real solution. The only solution that I can see is morally abhorrent, and I'm trying to open a discussion looking for a better one. Any suggestions on how to do this would be appreciated.
If we accept that what someone 'wants' can be distinct from their behaviour, then "what do I want?" and "what will I do?" are two different questions (unless you're perfectly rational). Presumably, an FAI scanning a brain could answer either question.
The question of which to ask is still kind of there, though. Procrastination is lazy, but getting drunk at work is irresponsible.
Agreed. Squicky dilemmas designed to showcase utilitarianism are not generally found in real life (as far as I know). And a human probably couldn't be trusted to make a sound judgement call even if one were found. Running on untrusted hardware and such.
Ah, and this is the point of the quote. Oh, I like that.