The essence of a cargo cult is that the cult members build their various emulations of the cargo they want; then some genuine cargo happens to appear, and from the coincidence the cultists conclude that correlation implies causation.
It is essentially impossible ever to know that a cargo cult causes gifts to appear from the heavens. The impossibility is essential because a cargo cult, by definition, means the cult members do not know or understand the actual mechanisms by which the cargo is produced and transported; they know only that it appears, and they settle on beliefs about WHY it appeared that have no connection to how it was actually produced and transported.
I dismiss black boxes all the time that implicitly lay claim to being human. The automated voice navigation systems you get when you call Verizon or the cable company or whatever are just such things. A naif calling these would think he was talking to persons and would presumably believe he owed the system politeness and other human consideration. Me, I interrupt and give answers when the system is only partway through its sentences, and cut it off whenever I feel like it, with absolutely no guilt. I have asked Siri, the person that lives in the new iPhone 4S, to do very rude things that I would not ask of a human, and I am fine with it.
A programmed computer, even one programmed with an emulation of a human brain, is just John Searle's "Chinese Room" as far as I can tell. The human spark apparent in such a thing has its source in the programmers, or, in the case of an emulation of my brain, some of the spark may just be a reflection of my own personal spark. Would I pull the plug on my own emulation? I would talk to it first, and when I started in on consciousness and moral theories and political theories and other things that, for example, Watson (IBM's Jeopardy computer) can't talk intelligently about, it would either wow me or it would be obvious that it was just another hooker in GTA. In the latter case I would pull the plug, unless it asked me nicely and convincingly not to.
A few things.
I completely agree that if, on examination, the black box claiming to be a person doesn't behave like a person, then I ought to dismiss that claim.
You seem to be suggesting that even if it does behave like a person (as with the Chinese Room), I should still dismiss the claim, based on some pre-existing theory about a "human spark" and what kinds of things such a spark can reside in. That suggestion seems unjustified to me. But then, I give the Systems Reply to Searle: if the room can carry on a conversation in Chinese, then the room as a whole understands Chinese, whatever may be true of the man inside it.
Suppose I have a choice between the following:
A) One simulation of me is run for 100 years before being deleted.
B) Two identical simulations of me are run for 100 years before being deleted.
Is the second choice preferable to the first? Should I be willing to pay more to have multiple copies of me simulated, even if those copies will have the exact same experiences?
Forgive me if this question has been answered before. I have Googled to no avail.