In this essay I argue the following:
Brain emulation requires enormous computing power; enormous computing power requires further progression of Moore’s law; further Moore’s law relies on large-scale production of cheap processors in ever more-advanced chip fabs; cutting-edge chip fabs are both expensive and vulnerable to state actors (but not non-state actors such as terrorists). Therefore: the advent of brain emulation can be delayed by global regulation of chip fabs.
Full essay: http://www.gwern.net/Slowing%20Moore%27s%20Law
There's a strong feeling in the culture here that it's virtuous to be able to discuss weird and scary ideas without feeling weirded out or scared. See: torture and dust specks, AI risk, uploading, and so on.
Personally, I agree with you now about this article, because I can see that you and the fellow above and probably others feel strongly about it. But when I read it originally, it never occurred to me to feel creeped out, because I've trained myself to just think calmly about ideas, at least until they turn into realities; I think many other readers here are the same. Since I don't feel it automatically, quantifying "how weird" or "how scary" these things are to other people takes a real conscious effort; I forget to do it, and I'm not good at it either.
So that's how it happens.
I like entertaining ideas that others find weird and scary too, and I have nothing against their being "weird". Even though my initial reaction was "Does this guy support terrorism?", I was calm enough to investigate and discover that no, he does not.
Yeah, I relate to this. Not on this pa...