
Comment author: metastable 21 August 2013 02:56:02PM 4 points [-]

A demonstration of the gray fallacy. The opinions of Ariel Castro are not equidistant from the truth with those of the rest of society, and we don't find the truth by finding a middle ground between his claims and those of everybody else.

Comment author: RomanDavis 21 August 2013 06:45:06PM *  4 points [-]

I don't know how this happened. My comment was supposed to be a reply to:

When the axe came into the woods, many of the trees said, "At least the handle is one of us."

Comment author: snafoo 04 August 2013 05:46:45PM 27 points [-]

Some say imprisoning three women in my home for a decade makes me a monster, I say it doesn’t, and of course the truth is somewhere in the middle.

Ariel Castro (according to The Onion)

Comment author: RomanDavis 21 August 2013 02:14:20PM 1 point [-]

Is this just supposed to be a demonstration of irrationality? Can someone unpack this?

Comment author: eurg 18 August 2013 03:34:33PM 1 point [-]

The asking for forgiveness may indicate that people somehow thought of the act as killing, but that did not change their actions. Humans have had a distinctive influence on the local megafauna wherever they showed up. A cynic might write that "humans did not really care about the well-being of ...". We also have taboos against eating dogs and cats, for instance, but the last time I checked it was not because we value their lives, but because they are cute. It's mostly organized lying to feel OK.

Comment author: RomanDavis 19 August 2013 03:20:46PM 5 points [-]

What? Of course people care about the lives of dogs and cats.

Anecdotal Evidence: All the people I've seen cry over the death of a dog. Not just children, either. I've seen grown men and women grieve for months over the death of a beloved dog.

Even if their sole reason for caring is that they're cute, that wouldn't invalidate the fact that they care. There's some amount of "organized lying" in most social interactions; that doesn't imply that people don't care about anything. That's silliness, or it sets such a high burden of proof / such a high standard of caring (even when most humans can talk about degrees of caring, more or less) as to be both outside the realm of what normal people talk about and totally unfalsifiable.

Comment author: Tuxedage 21 January 2013 04:30:55AM *  7 points [-]

<accolade> yeah

<accolade> I think for a superintelligence it would be a piece of cake to hack a human

<accolade> although I guess I'm Cpt. Obvious for saying that here :)

<Tuxedage> accolade, I actually have no idea what the consensus is, now that the experiment was won by EY

<Tuxedage> We should do a poll or something

<accolade> absolutely. I'm surprised that hasn't been done yet

Poll: Do you think a superintelligent AGI could escape an AI-Box, given that the gatekeepers are highly trained in resisting the AI's persuasive tactics, and that the guards are competent and organized?


Comment author: RomanDavis 21 January 2013 02:37:58PM 1 point [-]

Under the circumstances of the test (hours to work, and they can't just ignore you), then yes, captain obvious. Without that, though? Much less sure.

And the way Eliezer seems to have put it sometimes, where one glance at a line of text will change your mind? Get real. Might as well try to put the whole world in a bottle.

Comment author: MugaSofer 08 January 2013 03:11:27PM 0 points [-]

Why on earth was this downvoted?

Comment author: RomanDavis 17 January 2013 03:38:41AM *  1 point [-]

Well, I still think it made a valid point about being careful about engineering humans and other optimizing processes.

What I said could be easily boiled down to "What's so great about programming?" To which one could easily reply, "What's so great about running from tigers?"

The point is that programming really is an awesome intellectual activity that could help the human race survive, so we might want to maximize the sensuousness of that. But if someone just wants to code, that's just as useless as wanting to run from tigers (if that, say, led you to find and taunt tigers) or having a huge amount of sensuousness involved in running (if running doesn't help survival much). Ideally, you want to engineer human minds so that they can focus with their full minds on their own terminal goals, which is a super hard problem.

But no one here seems to like it when I put things the way I did in the post. It may be a mental hygiene thing, trying to avoid the illusion of transparency. It may be that the tone is slightly antagonistic, although only in good fun. And it might be a dislike some members have for memes.

I stand by the post, but there's also some fuzzy thinking / logical rudeness going on, as sensuousness isn't the same thing as enjoyment.

Comment author: Eliezer_Yudkowsky 10 October 2012 05:56:43AM 5 points [-]

Koan 3:

Does the idea that everything is made of causes and effects meaningfully constrain experience? Can you coherently say how reality might look, if our universe did not have the kind of structure that appears in a causal model?

Comment author: RomanDavis 12 October 2012 10:14:29AM 1 point [-]

I would expect not to exist in a way that suggests causality, i.e. being born and then expecting death, rather than the other way around. This is hard for me to imagine because I didn't really evolve for that world. It's possible that our universe doesn't work that way at the smallest level, but it seems mighty suspicious that random events lead to a larger world that operates very deterministically. Still, it is possible that this is just the manifestation of probabilistic laws at the smallest level. It's definitely paying rent so far (for those who do the experiments), so that's what we're going with, and there hasn't been a good argument or experiment against it yet.

Infinitesimal "violations" of causal laws as manifestations of probabilistic laws don't seem to affect me very much. Large ones that would pay rent haven't happened on an evolutionary or personal level, and, as I understand it (which is not terribly well), these probably won't happen unless the universe ran from the big bang to heat death a couple hundred times.

I can make models in my head where the universe (on my scale) is really chaotic, but looks deterministic because of a conspiracy by matrix gods or whatever, but that seems to violate Occam's Razor, for what that's worth when matrix gods control your life.

Comment author: Eliezer_Yudkowsky 10 October 2012 05:50:38AM 2 points [-]

Koan 2:

"Does your rule there forbid epiphenomenalist theories of consciousness - that consciousness is caused by neurons, but doesn't affect those neurons in turn? The classic argument for epiphenomenal consciousness has always been that we can imagine a universe in which all the atoms are in the same place and people behave exactly the same way, but there's nobody home - no awareness, no consciousness, inside the brain. The usual effect of the brain generating consciousness is missing, but consciousness doesn't cause anything else in turn - it's just a passive awareness - and so from the outside the universe looks the same. Now, I'm not so much interested in whether you think epiphenomenal theories of consciousness are true or false - rather, I want to know if you think they're impossible or meaningless a priori based on your rules."

How would you reply?

Comment author: RomanDavis 12 October 2012 09:56:05AM *  -1 points [-]

I'm not sure. And I am not sure how you would do an experiment to check. My rules aren't data typed into a computer program on which the universe runs; they're descriptions of the universe as experienced through my senses and processed through my mind by things like "inference" and colored by things like the "expectation of beauty" and "Occam's Razor."

The reason I don't believe in the epiphenomenal theory of consciousness is the evidence against it, starting with my awareness, the existence of all this talk about awareness, and ending with fuzzier sorts of thinking like, "Animals seem awake and aware and aware that they're aware."

Oh, that, and saying that consciousness doesn't cause anything you can sense seems a violation of Occam's Razor, while consciousness not affecting anything, ever, even in principle, seems to be a rejection of causality itself.

Comment author: Eliezer_Yudkowsky 10 October 2012 05:49:27AM 1 point [-]

Koan 1:

"You say that a universe is a connected fabric of causes and effects. Well, that's a very Western viewpoint - that it's all about mechanistic, deterministic stuff. I agree that anything else is outside the realm of science, but it can still be real, you know. My cousin is psychic - if you draw a card from his deck of cards, he can tell you the name of your card before he looks at it. There's no mechanism for it - it's not a causal thing that scientists could study - he just does it. Same thing when I commune on a deep level with the entire universe in order to realize that my partner truly loves me. I agree that purely spiritual phenomena are outside the realm of causal processes, which can be scientifically understood, but I don't agree that they can't be real."

How would you reply?

Comment author: RomanDavis 12 October 2012 09:39:50AM *  0 points [-]

Well first of all, we're not perfect philosophers of perfect emptiness. We get our beliefs from somewhere. So it's true that all sorts of things are true that we have no evidence of. For instance, it's very, very likely there's life outside our solar system, but I don't have any evidence of it, so I act as if it's not true because in my model of the universe, it's very unlikely that that life will affect me during my natural lifetime.

I would even go so far as to say that there may be matter beyond the horizon of the matter that expanded after the big bang, or that we're all running on an alien matrix, or that God is real but he's just hiding, and I act as if it's all false. Not because those things are untrue, or unlikely to be true, as I have no way to tell, but because I am very, very unlikely to ever, ever get evidence about any of them, and they probably will never, and probably could never (especially in the near future), affect me. Not so much a "Nuh uh" as a "So what?"

You know your partner loves you based on evidence. If you have no evidence (from past experience or otherwise), then you are very likely wrong. Love operates according to mechanisms, and we understand some of those mechanisms.

Similarly, just because you don't understand the mechanism by which your psychic cousin works, doesn't mean there isn't one. He could be getting unbelievably lucky, or he could be playing a trick, or there could be things we don't know yet that really truly give him psychic powers. You don't know what the mechanism is, but you haven't really investigated either, have you? Even if you never find out what the mechanism is, how much evidence is that that there is no mechanism?

Lastly, I'm not sure, "no mechanism" even makes sense. What does it mean for something to have no mechanism? What does a thing that doesn't have a mechanism look like? How would you tell?

So, from the top: A Priori, Making Beliefs Pay Rent, No One Knows What Science Doesn't Know, What is Evidence?, Fragility of Value (Why something is unlikely to be true without evidence of it), Uh what was that one about you failing the art and not the other way around?, and Not Even Wrong.

Comment author: Burrzz 04 October 2012 04:06:05PM 0 points [-]

Roman, How you doing, staying dry I hope. Looks like you and I are the only ones in The Philippines.

What are you doing down there? I'm retired but I stay very active!

Comment author: RomanDavis 06 October 2012 07:30:37AM 0 points [-]

My Dad's a retired Air Force officer. Living with him right now. Studying nursing. I do some digital painting and programming, and I'm going to see if I can make some money at it (online; wages are terrible here!).

Comment author: knb 19 September 2012 10:11:37AM *  1 point [-]

You're actually citing evidence that supports my position. Yudkowsky makes it explicit in his essay that he didn't "get it" before, but that he does now. That goes against The Last Psychiatrist's claim that everyone (everyone!) makes decisions as though they believe in God.

Comment author: RomanDavis 19 September 2012 11:45:34AM *  -1 points [-]

Not literally God, just faith in the idea that bad things above a certain threshold somehow aren't allowed to happen to you. Sometimes the power is thought to be in some other, real or unreal entity, like the state or the fed or democracy or science or whatever. And sometimes it's not. It's just a bias, floating around in your thoughts in ways you aren't terribly aware of.

He wasn't generalizing from one example. He cites many examples of people talking and thinking like this.

I'm going to go ahead and take his side on this one. It's just a bias. It's a cognitive malfunction of your brain that you might be able to work your way around by reframing, if you remain vigilantly aware of it, or by constructing a formula (like an actuary would) and operating according to that formula with as little input from the relevant buggy software in your brain as possible. But the bias is still there. For the vast, vast, vast majority of people, that bias is here to stay.

As with scope insensitivity, I really don't think there's much fixing it (without upgrading the hardware), and I just basically don't believe people who think they have accomplished this via mental discipline. It's possible, but it seems extremely unlikely. What's more, a claim like that seems motivated by exactly the same kind of optimistic bias.
