Comment author: gwillen 03 April 2016 07:34:08AM 1 point

Hmm, I'd previously thought it was always a pejorative term; now (after checking Wikipedia) I have the impression that it was originally a positive self-identification, and is now primarily pejorative in modern usage. So I don't really know what to think about it anymore.

Comment author: Articulator 11 April 2016 07:26:18AM 1 point

They've done a really good job of making it a pejorative. Anything's a slur if you hate them enough.

Comment author: Articulator 11 April 2016 06:05:06AM * 0 points

I mean, charitably speaking, I imagine that the second-to-last paragraph could easily have been an argument from consequences, rather than rape apology.

The parable doesn't really characterize the boy as right, rather as desperate. I don't think it's unreasonable to argue that some rapists are desperate for sex, or that if fewer men were desperate for sex, there'd be less rape. I'm not saying it's necessarily true, just that it's at least arguable. That doesn't mean women should be forced into sex, of course; it could still simultaneously be true that there would be less rape if men weren't so desperate.

Maybe it's because I identify with the boy to an extent, but I don't think that this is really a moral piece, rather an emotional piece. This is the boy's journey, his perception. I'm sure that it could describe many people reasonably accurately. I will note that the author narrates, but does not pass judgement through narration, only characters.

I think that some people here might be having so much trouble with this because they think that feeling bad for the boy means that women should be forced to have sex, and resent being forced to agree one way or the other. This is a wrong question.

  • You can feel sorry for the boy and not condone the second-to-last paragraph, whether it actually symbolized rape or not

  • You can feel sorry for the boy, even if you don't think it would be wrong for him to never "have the branch lifted"

  • You can feel sorry for the boy and still condemn any other part of this story

Reasonable responses:

  • "I wish you didn't have to feel that way."

  • "I feel sorry for you, but that doesn't mean I will have sex with you."

  • "I feel sorry for you, but that doesn't justify rape."

There are a lot of false dichotomies of blame to fall into here, especially given that this is a parable, and a highly charged one at that. Please try to avoid them.

To the people who suggest that one finds other ways of coping, I look forward to you putting your money where your mouth is and being celibate for 20-40 years to show us the way. While this is a decidedly less black-and-white topic than most minority disputes, a member of the outgroup claiming to know the experience of the ingroup better than the ingroup does is a very common (and incredibly rude) fallacy, so I should certainly hope that no one falls into that trap, especially if you are part of another minority.

Comment author: elharo 20 May 2013 04:34:13PM * 0 points

Hypothesis 8, male variance in IQ, is irrelevant to the extent that this site is about rationality, not IQ. Whatever IQ tests measure, it is neither instrumental nor epistemic rationality. See What Intelligence Tests Miss: The Psychology of Rational Thought by Keith E. Stanovich for extensive discussion of this point. Even if there is male-female variance in IQ, that does not imply a male-female variance in rationality.

Comment author: Articulator 01 June 2015 02:48:16AM 3 points

Pretty sure that the average IQ on LessWrong is well above the population mean, though. Therefore, a group with higher variance is more likely to have members on LessWrong.

The causality of that statement is atrocious, but I think the overall picture should still come through.
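The tail effect behind this argument is easy to check numerically. A minimal sketch (Python; the means, standard deviations, and the cutoff of 130 are illustrative assumptions, not measured values):

```python
from statistics import NormalDist

# Two hypothetical populations: same mean IQ, different spread.
low_var = NormalDist(mu=100, sigma=14)
high_var = NormalDist(mu=100, sigma=16)
cutoff = 130  # illustrative "LessWrong membership" threshold

# Fraction of each population above the cutoff (upper tail mass).
p_low = 1 - low_var.cdf(cutoff)
p_high = 1 - high_var.cdf(cutoff)

print(f"P(IQ > {cutoff}), lower variance:  {p_low:.4f}")
print(f"P(IQ > {cutoff}), higher variance: {p_high:.4f}")
print(f"ratio: {p_high / p_low:.2f}")
```

Even with identical means, the higher-variance group has substantially more mass in the far upper tail, which is all the argument needs.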

Comment author: Astazha 01 March 2015 02:12:02AM 6 points

Ch. 28:

Harry's knuckles had gone white on his wand by the time he stopped trying to Transfigure the air in front of his wand into a paperclip. It wouldn't have been safe to Transfigure the paperclip into gas, of course, but Harry didn't see any reason why it would be unsafe the other way around. It just wasn't supposed to be possible. But why not? Air was as real a substance as anything else...

Well, maybe that limitation did make sense. Air was disorganized, all the molecules constantly changing their relation to each other. Maybe you couldn't impose a new form on substance unless the substance was staying still long enough for you to master it, even though the atoms in solids were also constantly vibrating all the time...

This isn't conclusive, though. That failed attempt is before he sorts out partial transfiguration. However:

Harry may only use capabilities the story has already mentioned; he cannot develop wordless wandless Legilimency in the next 60 seconds. Of course, Harry may find more clever ways to use abilities he has already been established to have.

It would need to be fairly clear, I think, that Harry was re-purposing an old technique and not doing something new.

Comment author: Articulator 01 March 2015 07:37:33AM * 2 points

The first rule of Transfiguration: you do not guess.

Harry proposed a hypothesis, but performed no further testing. Without knowledge of partial Transfiguration, I'd rate the inability to Transfigure all air (as a conceptually-singular entity) as an equally (or more) probable explanation.

Comment author: TobyBartels 01 March 2015 05:47:07AM * 1 point

Read a comment above; he cannot transfigure air.

Comment author: Articulator 01 March 2015 07:23:35AM 2 points

That was prior to partial Transfiguration.

Comment author: Articulator 20 November 2014 12:16:38AM 5 points

This looks really interesting - do you have a timeframe on a playable demo, Kaj?

I sympathize with you on the Java - easier than most other methods, but oh god the lack of style. I think even just making those choice buttons a little less default (non-serif font, lose the blue shading) could move it a fair way toward being presentable.

My primary concern currently is that even if you have a robust engine to abstract much of the coding, this looks like it would have a very poor input-to-output time ratio (a lot of authoring for each minute of play). Do you have any plans for circumventing that, or do you have enough time to brute-force it?

In response to Fundamental Doubts
Comment author: Articulator 29 March 2014 04:22:49AM 0 points

[I'm probably going to be the latest in a long line of people saying something like this, but I hope my wording, at least, makes this worth existing.]

"I think, therefore I am" is, in fact, deductive reasoning. The definition of "I am", as thought in the first person - as far as we can comprehend it - means "I think".

"I think, therefore I think." Or, more simply, "I think." The statement itself, as we are thinking it, cannot possibly be false - no matter the Demons we posit, we cannot simultaneously comprehend the statement and not be thinking.

"I am", by our very definitions, must be true for everyone who is reading this, as you are reading it. Because you are reading it.

I'm sorry, Eliezer, but I think you are mistaken if you think you can disprove "I am", thought in the first person.

In response to Fundamental Doubts
Comment author: Tim_Tyler 12 July 2008 08:00:32AM 0 points

Harvesting human body heat for energy might be a stupid and inefficient way to use stored food - but I don't think it would violate the second law of thermodynamics. Maybe you are thinking that the flesh of the dead was the only source of food? That would indeed be even sillier - but the movie doesn't make that claim.

Comment author: Articulator 29 March 2014 04:10:14AM 0 points

Fundamentally, the problem is that you need to get the energy somewhere. Currently, we get it indirectly from sunlight. In a world with no ability to obtain sunlight (as the justification of the Matrix goes), the second law means that barring geothermal (which doesn't require humans as a go-between), the total usable energy will decrease to zero.

It's like recycling: can you ever expect to get better materials, or more materials than you started with, without putting anything else in (including energy)?
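A rough back-of-envelope makes the point concrete (Python; the 2,000 kcal/day intake figure is a textbook approximation): a body can never radiate more heat than the chemical energy fed into it, so humans can only be a lossy converter of food energy, never a net source.

```python
# Back-of-envelope: a human's average power budget from food.
KCAL_TO_J = 4184          # joules per kilocalorie
SECONDS_PER_DAY = 86_400

daily_intake_kcal = 2000  # approximate typical dietary intake
intake_watts = daily_intake_kcal * KCAL_TO_J / SECONDS_PER_DAY

print(f"Average power from food: {intake_watts:.0f} W")
# Roughly 97 W - on the order of the body's resting heat output, so
# harvesting body heat can at best return (a fraction of) the energy
# that the food, ultimately sunlight, put in.
```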

Comment author: Articulator 29 March 2014 03:55:06AM 0 points

Thinking about this in terms of AGI, would it be reasonable to suggest that a bias must be created in favor of utilizing inductive reasoning through Bayes' Theorem rather than deductive reasoning when and if the two conflict?

In response to comment by [deleted] on The Moral Void
Comment author: DanielLC 20 June 2012 05:44:09AM 5 points

You do realize that valuing equality in itself to any extent at all is always (because of opportunity cost at least) an example of this:

Are you sure?

If you take a concave function, such as the log, of each individual's happiness and maximize the sum, you'd always prefer equality to inequality when total happiness is held constant, and you'd always prefer a higher minimum happiness regardless of inequality.
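A minimal numerical check of this model (Python; the happiness values are arbitrary illustrations):

```python
import math

def welfare(happiness):
    # Sum of log-happiness: a concave social welfare function.
    return sum(math.log(h) for h in happiness)

equal = [5, 5]    # total happiness 10, split equally
unequal = [8, 2]  # same total happiness, split unequally

print(welfare(equal))    # 2*ln(5) ~ 3.219
print(welfare(unequal))  # ln(8)+ln(2) ~ 2.773
```

The equal split scores higher whenever the total is held constant - Jensen's inequality applied to the concave log - so equality is preferred without being a terminal value in itself.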

In response to comment by DanielLC on The Moral Void
Comment author: Articulator 27 March 2014 06:24:53AM 1 point

Excellent! Thanks for the mathematical model! I've been trying to work out how to describe this principle for ages.
