
Comment author: summerstay 12 August 2015 03:27:16PM -1 points [-]

Under theories like loop quantum gravity, doesn't some "fabric of spacetime" exist? I would call that a refinement of the idea of the ether. It has odd properties in order to allow relativity, but it hasn't been ruled out.

Comment author: summerstay 07 April 2015 06:18:47PM 0 points [-]

This is Hari's business. She takes innocuous ingredients and makes you afraid of them by pulling them out of context.... Hari's rule? "If a third grader can't pronounce it, don't eat it." My rule? Don't base your diet on the pronunciation skills of an eight-year-old.

From http://gawker.com/the-food-babe-blogger-is-full-of-shit-1694902226

Comment author: Thomas 27 October 2014 09:59:57AM 4 points [-]

Where are you right, while most others are wrong? Including people on LW!

Comment author: summerstay 27 October 2014 01:52:51PM *  3 points [-]

It would be a lot harder to make a machine that actually is conscious (phenomenally conscious, meaning it has qualia) than it would be to make one that just acts as if it is conscious (in that sense). It is my impression that most LW commenters think any future machine that acts conscious probably is conscious.

Comment author: Salemicus 16 October 2014 06:45:57PM 11 points [-]

I think this was a great idea for a post. If LessWrong rationality is worthwhile, then it ought to get lots of replies on concrete facts - not moral preferences, theology, or other unprovables.

I used to believe that embryos pass through periods of development representing earlier evolutionary stages - that there was a period when a human baby was basically a fish, then later an amphibian, and so on. I believed this because my father told me so; he was a doctor (though not an obstetrician), and the information he had given me about other subjects was highly reliable. Most knowledge is second hand - it was highly rational for me to believe him. I now know (also second hand!) that Haeckel's ideas were debunked a long time ago - although they might well have been in a textbook when my father was at medical school.

To me, the lesson is trust, but verify.

Comment author: summerstay 19 October 2014 11:19:31AM 2 points [-]

I only recently realized that evolution works, for the most part, by changing the processes of embryonic development. There are some exceptions-- things like neoteny and metamorphosis-- but most changes are genetic differences leading to differences in, say, how long a process of growth is allowed to occur in the embryo.

Comment author: Pablo_Stafforini 24 September 2014 04:35:22AM *  4 points [-]

David Chalmers' The Conscious Mind is excellent, and no, you don't have to agree with its conclusions to agree with that characterization. If you lack the time to read an entire book, try Consciousness and its place in nature instead.

Few philosophers are worth reading; Chalmers is definitely one of them.

Comment author: summerstay 24 September 2014 04:22:48PM 2 points [-]

There's a reason everyone started calling it "the hard problem." Chalmers explained the problem so clearly that we now basically just point and say "that thing Chalmers was talking about."

Comment author: shminux 08 September 2014 07:35:17PM *  10 points [-]

"WWRMD?" (RM for "rational me".)

Comment author: summerstay 18 September 2014 05:09:22PM 6 points [-]

This is exactly the point of asking "What Would Jesus Do?" Christians are asking themselves what a perfectly moral, all-knowing person would do in this situation, and using the machinery their brains have for simulating a person to find the answer, instead of using the general-purpose reasoner that is so easily overworked. Of course, simulating a person (especially a god) accurately can be kind of tricky. Religious people use similar thoughts to get themselves to do things that they want to do in the abstract but find hard in the moment: What would I do if I were the kind of person I want to become? What would a perfectly moral, all-knowing person think about what I'm about to do?

Comment author: DanielVarga 25 May 2014 02:07:08PM 3 points [-]

Wow, I'd love to see some piece of art depicting that pink worm vine.

Comment author: summerstay 01 June 2014 08:37:14PM 1 point [-]

I assumed that was the intention of the writers of Donnie Darko. The actual shapes we got coming out of their chests were not right, but you could see that this is what they were trying to do.

Comment author: asr 17 December 2013 06:57:08AM 1 point [-]

Yes. I picked the ethical formulation as a way to make clear that this isn't just a terminological problem.

I like the framing in terms of expectation.

And I agree that this line of thought makes me skeptical about the computationalist theory of mind. The conventional formulations of computation seem to abstract away enough stuff about identity that you just can't hang a theory of mind and future expectation on what's left.

Comment author: summerstay 17 December 2013 03:09:01PM 0 points [-]

I think that arguments like this are a good reason to doubt computationalism. That means accepting that two systems performing the same computations can have different experiences, even though they behave in exactly the same way. But we already should have suspected this: it's just like the inverted spectrum problem, where you and I both call the same flower "red," but the subjective experience I have is what you would call "green" if you had it. We know that most computations even in our brains are not accompanied by conscious perceptual experience, so it shouldn't be surprising if we can make a system that does whatever we want, but does it unconsciously.

Comment author: asr 17 December 2013 03:11:28AM 0 points [-]

The reference is a good one -- thanks! But I don't quite understand the rest of your comments. Can you rephrase more clearly?

Comment author: summerstay 17 December 2013 02:58:18PM 1 point [-]

Sorry, I was just trying to paraphrase the paper in one sentence. The point of the paper is that there is something wrong with computationalism. It takes a robot brain that calculates the same way as a conscious human brain and transforms it, always by computationally equivalent steps, into a system that is computationally equivalent to a digital clock. This means that either we accept that a clock is at every moment experiencing everything that can be experienced, or we accept that something is wrong with computationalism. If we take the second option, it means that two systems with the exact same behavior and computational structure can have different perceptual consciousness.

Comment author: ahbwramc 13 December 2013 10:50:09PM 4 points [-]

Could the relevant moral change happen going from B to C, perhaps? i.e. maybe a mind needs to actually be physically/causally computed in order to experience things. Then the torture would have occurred whenever John's mind was first simulated, but not for subsequent "replays," where you're just reloading data.

Comment author: summerstay 16 December 2013 04:12:29PM 2 points [-]

Check out "Counterfactuals Can't Count" for a response to this. Basically, if a recording differs in what it experiences from a running computation, then two computations that calculate the same thing in the same way, but where one has bits of code that never run, must also experience things differently.
