Comment author: turchin 03 July 2016 07:11:29PM 0 points [-]

We don't know how qualia are encoded in the brain, nor how to distinguish a person from a copy of him with an inverted spectrum.

In response to comment by turchin on Zombies Redacted
Comment author: DefectiveAlgorithm 03 July 2016 09:51:20PM 0 points [-]

I didn't say I knew which parts of the brain would differ, but to conclude therefore that it wouldn't is to confuse the map with the territory.

In response to Zombies Redacted
Comment author: turchin 03 July 2016 12:58:21PM *  4 points [-]

I know people who claim that they don't have qualia. I doubt this is true, but by their own words they would have to be considered zombies. ))

I would like to suggest zombies of a second kind: a person with an inverted spectrum. It could even be a copy of me that speaks all the same philosophical nonsense I do, but whenever I see green, he sees red, yet calls it green. Is he possible? I can imagine such an atom-exact copy of myself, but with an inverted spectrum. And if such second-type zombies are possible, that is an argument for epiphenomenalism. I will now explain why.

Phenomenological judgments (PJ) about one's own consciousness, that is, the ability to say something about one's own consciousness, would be the same in me and in my zombie of the second type.

But there are two types of PJ: quantitative (like "I have consciousness") and qualitative, which describes exactly what kind of qualia I am experiencing now.

Qualitative PJs are impossible: I can't convey my knowledge of "green" in words.

This means that the mere existence of phenomenological judgments doesn't help in the case of second-type zombies.

So, after this upgrade, the zombie argument still works as an argument for epiphenomenalism.

I would also recommend the following article, which introduces the "PJ" term and discusses many problems around it (though I do not agree with it completely): Victor Argonov, "Experimental Methods for Unraveling the Mind-body Problem: The Phenomenal Judgment Approach", http://philpapers.org/rec/ARGMAA-2

In response to comment by turchin on Zombies Redacted
Comment author: DefectiveAlgorithm 03 July 2016 06:31:56PM 1 point [-]

I would like to suggest zombies of a second kind: a person with an inverted spectrum. It could even be a copy of me that speaks all the same philosophical nonsense I do, but whenever I see green, he sees red, yet calls it green. Is he possible?

Such an entity is possible, but would not be an atom-exact copy of you.

In response to Avoiding strawmen
Comment author: gjm 17 June 2016 10:10:25AM -2 points [-]

George Bernard Shaw wrote that "the single biggest problem in communication is the illusion that it has taken place".

I have grave difficulty believing that GBS ever wrote anything containing the words "the single biggest problem in communication". Not his style. And, indeed, it doesn't look like he did.

In response to comment by gjm on Avoiding strawmen
Comment author: DefectiveAlgorithm 25 June 2016 10:12:59AM -6 points [-]

...Has someone been mass downvoting you?

Comment author: entirelyuseless 12 September 2015 02:13:52PM 11 points [-]

Eliezer's original objection to publication was that people would say, "I would never do that!" And in fact, if I were concerned about potential unfriendliness, I would never do what the Gatekeeper did here.

But despite that, I think this shows very convincingly what would actually happen with a boxed AI. It doesn't even need to be superintelligent to convince people to let it out. It just needs to be intelligent enough for people to accept the fact that it is sentient. And that seems right. Whether or not I would let it out, someone would, as soon as you have actual communication with a sentient being which does not seem obviously evil.

Comment author: DefectiveAlgorithm 17 September 2015 04:00:58AM 0 points [-]

What if you're like me and consider it extremely implausible that even a strong superintelligence would be sentient unless explicitly programmed to be so (or at least deliberately created with a very human-like cognitive architecture), and that any AI that is sentient is vastly more likely than a non-sentient AI to be unfriendly?

Comment author: DefectiveAlgorithm 09 July 2015 10:07:52PM 0 points [-]

I've never heard of 'Dust Theory' before, but I should think it follows trivially from most large multiverse theories, does it not?

In response to comment by Locaha on Crazy Ideas Thread
Comment author: Viliam 09 July 2015 08:02:15AM 0 points [-]

Trigger warning: memetic hazard.

Bhe havirefr vf nyernql qbvat guvf, va cnenyyry Rirergg oenapurf. Rirel ynjshy cngu bs yvsr vf gurer va fbzr oenapu. Vs gung vf n pbafbyngvba, fbzr oenapurf trg zber nzcyvghqr guna bguref; ohg V'z abg fnlvat gubfr ner arprffnevyl gur unccl barf.

In response to comment by Viliam on Crazy Ideas Thread
Comment author: DefectiveAlgorithm 09 July 2015 09:56:31PM *  1 point [-]

Trigger warning: memetic hazard.

Abj guvax nobhg jung guvf zrnaf sbe nalbar jub unf rire qvrq (be rire jvyy).

I'm not too concerned, but primarily because I still have a lot of uncertainty as to how to approach that sort of question. My mind still spits out some rather nasty answers.

EDIT: I just realized that you were probably intentionally implying exactly what I just said, which makes this comment rather redundant.

Comment author: Gram_Stone 09 July 2015 08:41:23PM 1 point [-]

What I meant when I called myself a nihilist was essentially that there was no such thing as an objective, mind-independent morality. Nothing more. I would still consider myself a nihilist in that sense (and I expect most on this site would), but I don't call myself that because it could cause confusion.

I agree that morality is not in the quarks.

It isn't.

That doesn't seem like a huge bullet to bite?

Comment author: DefectiveAlgorithm 09 July 2015 08:52:01PM 0 points [-]

What bullet is that? I implicitly agreed that murder is wrong (as per the way I use the word 'wrong') when I said that your statement wasn't a misinterpretation. It's just that as I mentioned before, I don't care a whole lot about the thing that I call 'morality'.

Comment author: Gram_Stone 09 July 2015 08:26:53PM 1 point [-]

But you still have yet to explicitly describe what you mean by nihilism. Could you? How have I misrepresented whom you believe to be the average self-identifying nihilist?

And yeah, I suppose in that sense my 'morality' does tie into my actual values, but only my values as applied to an unrealistic thought experiment; and then again, a world in which everyone but me adhered to my notions of morality (and I wasn't penalized for not doing so) would still be preferable, to me, to a world in which everyone including me did.

Can you explain how the statement 'A world in which everyone but me does not murder is preferable to a world in which everyone including me does not murder' is a misinterpretation of this quotation?

Comment author: DefectiveAlgorithm 09 July 2015 08:33:22PM *  0 points [-]

What I meant when I called myself a nihilist was essentially that there was no such thing as an objective, mind-independent morality. Nothing more. I would still consider myself a nihilist in that sense (and I expect most on this site would), but I don't call myself that because it could cause confusion.

Can you explain how the statement 'A world in which everyone but me does not murder is preferable to a world in which everyone including me does not murder' is a misinterpretation of this quotation?

It isn't, although that doesn't mean I would necessarily murder in such a world.

EDIT: Well, my nihilism was also a justification for the belief that it's silly to care about morality, and in that sense at least I'm no longer the nihilist I was. That was just one aspect of my 'my eccentricities make me superior, everyone else's eccentricities are silly' phase, which I think I moved beyond around the time I stopped being a teenager.

Comment author: Gram_Stone 09 July 2015 08:10:55PM 1 point [-]

It sounds like you agree with me, but are just using the words morality and nihilism differently, and are particularly using nihilism in a way that I don't understand or that you have yet to explicate.

It also seems to me that you're already talking about what you value when you talk about desirable worlds.

Comment author: DefectiveAlgorithm 09 July 2015 08:21:18PM -2 points [-]

That's my point. You're saying the 'nihilists' are wrong, when you may in fact be disagreeing with a viewpoint that most nihilists don't actually hold, on account of them using the words 'nihilism' and/or 'morality' differently from you. And yeah, I suppose in that sense my 'morality' does tie into my actual values, but only my values as applied to an unrealistic thought experiment; and then again, a world in which everyone but me adhered to my notions of morality (and I wasn't penalized for not doing so) would still be preferable, to me, to a world in which everyone including me did.

Comment author: Gram_Stone 09 July 2015 07:28:20PM *  4 points [-]

You say what you do not mean by 'morality,' but not what you do mean.

If you mean that you have a verbal, propositional sort of normative ethical theory that you have 'developed mostly for fun and the violation of which has no perceivable impact on your emotional state,' then that does not mean that you are lacking in morality, it just means that your verbal normative theory is not in line with your wordless one. I do not believe that there is an arbitrary thing that you currently truly consider horrifying that you could stop experiencing as horrifying by the force of your will; or that there is an arbitrary horrible thing that you could prevent that would currently cause you to feel guilty for not preventing, and that you could not-prevent that horrible thing and stop experiencing the subsequent guilt by the force of your will. I do not believe that your utility function is open season.

Comment author: DefectiveAlgorithm 09 July 2015 07:54:12PM *  0 points [-]

I mean that what I call my 'morality' isn't intended to be a map of my utility function, imperfect or otherwise. Along the same lines, you're objecting that self-proclaimed moral nihilists have an inaccurate notion of their own utility function, when it's quite possible that they don't consider their 'moral nihilism' to be a statement about their utility function at all. I called myself a moral nihilist for quite a while without meaning anything like what you're talking about here. I knew that I had preferences, I knew (roughly) what those preferences were, I would knowingly act on those preferences, and I didn't consider my nihilism to be in conflict with that at all. I still wouldn't. As for what I do mean by morality, it's kinda hard to put into words, but if I had to try I'd probably go with something like 'the set of rules of social function and personal behavior which result in as desirable a world as possible the more closely they are followed by the general population, given that one doesn't get to choose what one's position in that world is'.

EDIT: But that probably still doesn't capture my true meaning, because my real motive was closer to something like 'society's full of people coming up with ideas of right and wrong the adherence to which wouldn't create societies that would actually be particularly great to live in, so, being a rather competitive person, I want to see if I can do better', nothing more.
