Bugmaster comments on GAZP vs. GLUT - Less Wrong

33 points · Post author: Eliezer_Yudkowsky · 07 April 2008 01:51AM


Comment author: Bugmaster 14 February 2012 04:10:45AM 0 points

(I know this is an old article; let me know if commenting on it is a faux pas of some sort)

> I can't recall ever seeing anyone claim that a GLUT is conscious.

Well, I'd definitely claim it. If we could somehow disregard all practical considerations, and conjure up a GLUT despite the unimaginably huge space requirements -- then we could, presumably, hold conversations with it, read those philosophy papers that it writes, etc. How is that different from consciousness? Sure, the GLUT's hardware is weird and inefficient, but if we agree that robots and zombies and such can be conscious, then why not GLUTs?
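To make that concrete, here's a minimal sketch of what a GLUT amounts to as a program. (The histories and replies below are made up for illustration; a real GLUT would need an entry for literally every possible conversation history, which is where the unimaginable space requirement comes from.)

```python
# A GLUT sketched as a Python dict: the key is the entire conversation
# so far, the value is the canned reply. Hypothetical toy entries only.
GLUT = {
    (): "Hello. Ask me anything.",
    ("Are you conscious?",): "As far as I can tell, yes.",
    ("Are you conscious?", "Prove it."): "Can you prove it of yourself?",
    # ...one entry per possible conversation history...
}

def glut_reply(history):
    """Return the canned reply for the whole conversation so far.
    Nothing is computed at query time; every answer was fixed
    when the table was built."""
    return GLUT[tuple(history)]

history = []
print(glut_reply(history))            # "Hello. Ask me anything."
history.append("Are you conscious?")
print(glut_reply(history))            # "As far as I can tell, yes."
history.append("Prove it.")
print(glut_reply(history))            # "Can you prove it of yourself?"
```

Holding a conversation with it just means repeatedly looking up the transcript so far.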

I can't possibly be the only person in the world who's ever made this observation...

Comment author: TheOtherDave 14 February 2012 04:32:23AM 1 point

My reluctance to treat GLUTs as conscious primarily has to do with the sense that, whatever conscious experience the GLUT might have, there is no reason it had to wait for the triggering event to have it; the data structures associated with that experience already existed inside the GLUT's mind prior to the event, in a way that isn't true for a system synthesizing new internal states that trigger/represent conscious experience.

That said, I'm not sure I understand why that difference should matter to the conscious/not-conscious distinction, so perhaps I'm just being parochial. (That is in general the conclusion I come to about most conscious/not-conscious distinctions, which mostly leads me to conclude that it's a wrong question.)

Comment author: Bugmaster 14 February 2012 06:26:14PM 0 points

> there is no reason it had to wait for the triggering event to have it; the data structures associated with that experience already existed inside the GLUT's mind prior to the event, in a way that isn't true for a system synthesizing new internal states that trigger/represent conscious experience.

IMO that's an implementation detail. The GLUT doesn't need to synthesize new internal states because it already contains all possible states. Synthesizing new internal states is an optimization that our non-GLUT brains (and computers) use in order to get around the space requirements (as well as our lack of time-traveling capabilities).
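As a toy illustration of that point (the tiny state machine and all names here are hypothetical, chosen only to keep the example runnable): the two agents below are input/output-equivalent; one synthesizes its next internal state on demand, the other reads from a table that was built in advance by running the first agent over every possible input history.

```python
from itertools import product

def step(state, bit):
    """Synthesize the next internal state and an output on demand."""
    new_state = (state * 2 + bit) % 7          # arbitrary toy dynamics
    return new_state, "yes" if new_state % 2 else "no"

def run_dynamic(bits):
    """The 'brain-like' agent: computes its states as it goes."""
    state, outputs = 0, []
    for b in bits:
        state, out = step(state, b)
        outputs.append(out)
    return outputs

# The GLUT: built up front by running the dynamic agent over every
# input history up to MAX_LEN, then frozen. O(2^n) entries -- lazy
# state synthesis is the optimization that avoids this blow-up.
MAX_LEN = 10
GLUT = {
    bits: run_dynamic(bits)
    for n in range(MAX_LEN + 1)
    for bits in product((0, 1), repeat=n)
}

def run_glut(bits):
    """The GLUT agent: pure lookup, no state synthesized at runtime."""
    return GLUT[tuple(bits)]

assert run_dynamic((1, 0, 1, 1)) == run_glut((1, 0, 1, 1))
```

From the outside the two are indistinguishable (up to MAX_LEN histories); the only difference is when the internal states get produced.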

> That is in general the conclusion I come to about most conscious/not-conscious distinctions, which mostly leads me to conclude that it's a wrong question.

Yeah, consciousness is probably just a philosophical red herring, as far as I understand...

Comment author: TheOtherDave 15 February 2012 01:30:10AM 0 points

> IMO that's an implementation detail.

Yeah, I don't exactly disagree (though admittedly, I also think intuitions about whether implementation details matter aren't terribly trustworthy when we're talking about a proposed design that cannot conceivably work in practice). Mostly, I think what I'm talking about here are my poorly grounded intuitions, rather than an actual thing in the world. Still, it's sometimes useful to get clear about what my poorly grounded intuitions are, if only so I can get better at recognizing when they distort my perceptions or expectations.

Comment author: Bugmaster 15 February 2012 02:15:24AM 0 points

> though admittedly, I also think intuitions about whether implementation details matter aren't terribly trustworthy when we're talking about a proposed design that cannot conceivably work in practice

Yeah, the whole GLUT scenario is really pretty silly to begin with, so I don't exactly disagree (as you'd say). Perhaps the main lesson here is that it's rather difficult, if not impossible, to draw useful conclusions from silly scenarios.