If people use language that is hurtful, objectifying, and sexist,
There is no such thing as hurtful(language). There is only considered_hurtful_by(language, person). See Eliezer's post about movie posters with swamp creatures carrying off "sexy" women for an explanation of this point, i.e., the "mind projection fallacy".
Okay, I'll try it on you. I think I understand what you meant, so it's not okay for me to feel any way at all about how you said it?
I didn't say it was "not okay" - I said it was "not useful". HUGE difference.
You are perfectly free to feel any way you like, but that doesn't make it useful, nor grant you any rights regarding whether others should agree with your feelings.
But when I have tried in the past to ask you what you mean, you have not been helpful.
IOW, "not_helpful_to(pjeby_answers, Alicorn_understanding)"... but note that this does not equate to "not_helpful(pjeby)" or "not_trying_to_help(pjeby)", just as "repulsive_to(X, Alicorn)" does not equate to "repulsive(X)" or "unethical(X)".
Perhaps what happened was that I accidentally misunderstood you and got into an argument.
Perhaps. I actually see it more as that people are trying to tell you things that are outside your current frame of reference, and you're telling them they're unethical or in error, when they are actually trying to be clear and helpful and say what they mean, and are puzzled why you're labeling them and their statements. (Even when someone knows male-female idiom translation inside and out, they don't always notice what they're doing, just like most people aren't aware of their own accent.)
Meanwhile, AFAICT, you are taking other people's words and translating them into what you would mean if you used those words, instead of graciously accepting others' explanations of what they meant by those words.
Once you get to the point where you're arguing about the definitions of the words, there isn't really an argument any more -- something that also should be clear from Eliezer's past posts.
In short, none of the stuff I'm bringing up is "about" gender issues -- or I wouldn't even have bothered with this conversation in the first place.
I brought this up only because it's directly relevant to core Yudkowskian principles like the mind projection fallacy, arguing over definitions, and not treating one class of human being as a broken version of another class of human being.
In other words, it's about rationality.
I should chalk that up to you being male
That would be if -- and only if -- we had successfully reached understanding, and the misunderstanding was rooted in a gender-based language difference. (i.e., the context of my comments)
Do you have any evidence for the male-language/female-language thing? Isn't it at least as likely that men talk about concepts that offend women, and women talk about concepts that elude men? (I speak as a male)
The stuff you're talking about here is mainly communication problems. I'm not convinced you and Alicorn are having a communication problem...
This article is a deliberate meta-troll. To be successful, I need your trolling cooperation. Now hear me out.
In The Strangest Thing An AI Could Tell You, Eliezer talks about anosognosics, who have one arm paralyzed and, most interestingly, are in absolute denial of this: in spite of overwhelming evidence that their arm is paralyzed, they keep coming up with new rationalizations proving it's not.
Doesn't that sound like someone else we know? Yes, religious people! In spite of heaps of empirical evidence against the existence of their particular flavour of the supernatural, the internal inconsistency of their beliefs, and perfectly plausible alternative explanations being well known, something between 90% and 98% of humans believe in a supernatural world, and are in a state of absolute denial not too dissimilar to that of anosognosics. Billions of people throughout history have even been willing to die for their absurd beliefs.
We are mostly atheists here - we happen not to share this particular delusion. But please take the outside view for a moment: how likely is it that, unlike almost everyone else, we don't have any other such delusions, whose truth we deny absolutely in spite of mounting heaps of evidence?
If a delusion is of the kind that all of us share, we won't be able to find it without building an AI. But we might well have some that aren't universal - not too unlikely, as we're a small and self-selected group.
What I want you to do is try to trigger the absolute denial macro in your fellow rationalists! Is there anything you consider proven beyond any possibility of doubt, by both empirical evidence and pure logic, that nonetheless triggers an automatic stream of rationalizations when you say it to other people? Yes, I'm pretty much asking you to troll, but it's a good kind of trolling, and I can't think of any other way to find our delusions.