This is Hari's business. She takes innocuous ingredients and makes you afraid of them by pulling them out of context.... Hari's rule? "If a third grader can't pronounce it, don't eat it." My rule? Don't base your diet on the pronunciation skills of an eight-year-old.
From http://gawker.com/the-food-babe-blogger-is-full-of-shit-1694902226
It would be a lot harder to make a machine that actually is conscious (phenomenally conscious, meaning it has qualia) than it would be to make one that just acts as if it is conscious (in that sense). It is my impression that most LW commenters think any future machine that acts conscious probably is conscious.
I only recently realized that evolution works, for the most part, by changing the processes of embryonic development. There are some exceptions-- things like neoteny and metamorphosis-- but most changes are genetic differences leading to differences in, say, how long a process of growth is allowed to occur in the embryo.
There's a reason everyone started calling it "the hard problem." Chalmers explained the problem so clearly that we now basically just point and say "that thing Chalmers was talking about."
This is exactly the point of asking "What Would Jesus Do?" Christians are asking themselves what a perfectly moral, all-knowing person would do in this situation, and using the machinery their brains have for simulating a person to find the answer, instead of using the general-purpose reasoner that is so easily overworked. Of course, accurately simulating a person (especially a god) can be tricky. Similar questions religious people use to get themselves to do things they want in the abstract but find hard in the moment: What would I do if I were the kind of person I want to become? What would a perfectly moral, all-knowing person think about what I'm about to do?
I assumed that was the intention of the writers of Donnie Darko. The shapes coming out of their chests that we actually got were not right, but you could see that this is what they were trying to do.
I think that arguments like this are a good reason to doubt computationalism. Doubting it means accepting that two systems performing the same computations can have different experiences, even though they behave in exactly the same way. But we should already have suspected this: it's just like the inverted spectrum problem, where you and I both call the same flower "red," but the subjective experience I have is what you would call "green" if you had it. We know that most computations, even in our brains, are not accompanied by conscious perceptual experience, so it shouldn't be surprising if we can make a system that does whatever we want but does it unconsciously.
Sorry, I was just trying to paraphrase the paper in one sentence. The point of the paper is that there is something wrong with computationalism. It attempts to prove that two systems with the same sequence of computational states can have different conscious experiences. It does this by taking a robot brain that calculates the same way as a conscious human brain and transforming it, always via computationally equivalent steps, into a system that is computationally equivalent to a digital clock. This means we must either accept that a clock is at every moment experiencing everything that can be experienced, or accept that something is wrong with computationalism. If we take the second option, it means that two systems with the exact same behavior and computational structure can have different perceptual consciousness.
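To make the flavor of that move concrete, here's a minimal sketch in Python; it's my own illustration under a toy assumption (a small deterministic "brain"), not the paper's actual construction. On one fixed input, a deterministic computation passes through a fixed sequence of states, and a bare counter replaying a recorded copy of that sequence steps through the same states -- and a counter is essentially a digital clock.

```python
def brain_like_computation(x):
    """Stand-in for the robot brain: an arbitrary deterministic state machine."""
    states = []
    state = x
    for _ in range(5):
        state = (3 * state + 1) % 17  # arbitrary deterministic update rule
        states.append(state)
    return states

# Record the state trajectory for one concrete run.
trajectory = brain_like_computation(6)

def clock_like_computation():
    """A 'clock': nothing but a counter indexing the recorded sequence."""
    for tick in range(len(trajectory)):
        yield trajectory[tick]

# On this run, the clock steps through the same state sequence.
assert list(clock_like_computation()) == trajectory
```

The hard philosophical work in the paper is arguing that each step of this kind preserves computational equivalence; the sketch only shows why the endpoint looks like a clock.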
Check out "Counterfactuals Can't Count" for a response to this. Basically, if a recording differs in what it experiences from a running computation, then two computations that calculate the same thing in the same way, except that one contains bits of code that never run, must experience things differently.
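For concreteness, here's a tiny hypothetical pair of the kind that argument has in mind (the functions and inputs are mine, not from the paper): two functions that execute exactly the same steps on every input they are actually given; one merely carries a branch that never fires. If counterfactual structure is what separates a computation from a mere recording, these two should differ in what they experience despite having identical execution traces.

```python
def f(x):
    return x + 1

def g(x):
    # Dead branch: never taken for the integer inputs used below, so f and g
    # have identical execution traces on every actual run.
    if x is None:
        return 0
    return x + 1

# Identical behavior (and identical executed steps) on every actual input.
for x in range(10):
    assert f(x) == g(x)
```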
Under theories like loop quantum gravity, doesn't some "fabric of spacetime" exist? I would call that a refinement of the idea of the ether. It has odd properties in order to allow relativity, but it hasn't been ruled out.