sullyj3

Right, alignment advocates really underestimate the degree to which talking about sci-fi-sounding tech is a sticking point for people.
Is there any relation to this paper from 1988?
https://www.semanticscholar.org/paper/Self-Organizing-Neural-Networks-for-the-Problem-Tenorio-Lee/fb0e7ef91ccb6242a8f70214d18668b34ef40dfd
I think it's reasonable to take the position that there's no violation of consent, but it's unreasonable to then socially censure someone for participating in the wrong way.
your initial comment entirely failed to convey it
Sure, I don't disagree.
This is just such a bizarre tack to take. You can go down the "toughen up" route if you want to, but then it doesn't look good for the people who have strong emotional reactions to others not playing along with their little game. I'm really not sure what point you're trying to make here. It seems like this is a fully general argument for treating people however the hell you want. After all, it's not worse than the vagaries of life, right? Is this really the argument you're going with: that if something is a good simulation of life, we should just unilaterally inflict it on people?
I want to be clear that it's not having rituals and taking them seriously that I object to. It's sending the keys to people who may or may not care about that ritual, and then castigating them for not playing by rules that you've assigned them. They didn't ask for this.
In my opinion Chris Leong showed incredible patience in writing a thoughtful post in the face of people being upset at him for doing the wrong thing in a game he didn't ask to be involved in. If I'd been in his position I would have told the people who were upset at me that this was their own problem and they could quite frankly fuck off.
Nobody has any right to involve other people in a game like that without consulting them, given the emotional investment in this that people seem to have.
You're right, I haven't been active in a long time. I'm mostly a lurker on this site. That's always been partly the case, but as I mentioned, it was the last of my willingness to identify as a LWer that was burnt, not the entire thing. I was already hanging by a thread.
My last comment was a while ago, but my first comment is from 8 years ago. I've been a Lesswronger for a long time. HPMOR and the sequences were pretty profound influences on my development. I bought the sequence compendiums. I still go to local LW meetups regularly, because I have a lot of friends there.
So, you can dismiss me...
For what it's worth, this game and the past reactions to losing it have burnt the last of my willingness to identify as a LW rationalist. Calling a website going down for a bit "destruction of real value" is technically true, but connotationally just so over the top. A website going down is just not that big a deal. I'm sorry, but it's not. Go outside or something. It will make you feel good, I promise.
Then getting upset at other people when they don't take a strange ritual as seriously as you do? As you've decided to, seemingly arbitrarily? When you've deliberately given them the means to upset you? It's tantamount to emotional blackmail. It's just obnoxious and strange behaviour.
As a trust building exercise, this reduces my confidence in the average lesswronger's ability to have perspective about how important things are, and to be responsible for their own emotional wellbeing.
This feels elitist, übermenschy, and a little masturbatory. I can't really tell what point, if any, you're trying to make. I don't disagree that many of the traits you list are admirable, but noticing that isn't particularly novel or insightful. Your conceptual framework seems like little more than thinly veiled justification for finding reasons to look down on others. Calling people more or less "human" fairly viscerally evokes past justifications for subjugating races and treating them as property.
Fair point, and one worth making in the course of talking about sci-fi-sounding things! I'm not asking anyone to represent their beliefs dishonestly, but rather to introduce them gently. I'm not an expert, but I'm personally not convinced of the viability of nanotech, so if it's merely sufficient rather than necessary to the argument, it seems prudent to stick to more clearly plausible pathways to takeover as demonstrations of sufficiency, while still maintaining that weirder-sounding stuff is something one ought to expect when dealing with something much smarter than you.