The counter-concern is that if humanity can't talk about things that sound like sci-fi, then we just die. We're inventing AGI, whose core characteristic is that it's 'a technology that enables future technologies'. We need to find a way to actually talk about AGI.
One strategy would be 'open with the normal-sounding stuff, then introduce increasingly weird stuff only when people are super bought into the normal stuff'. Some problems with this:
This is just such a bizarre tack to take. You can go down the "toughen up" route if you want to, but it's then not looking good for the people who have strong emotional reactions to people not playing along with their little game. I'm really not sure what point you're trying to make here. It seems like this is a fully general argument for treating people however the hell you want. After all, it's not worse than the vagaries of life, right? Is this really the argument you're going with, that if something is a good simulation of life, we should just unilaterally inflict it on people?
The Petrov Day event is a trivial-to-nonexistent burden to place on those who received the launch code. They were given the background and the code, and told what it would do if used. They were not even asked to do, or not do, anything in particular. Similar events have been run in the past, and those selected are likely to have been around long enough to have seen at least the last one.
The obvious way to not participate is to ignore the whole matter.
I don't think there is any violation of consent here.
I want to be clear that it's not having rituals and taking them seriously that I object to. It's sending the keys to people who may or may not care about that ritual, and then castigating them for not playing by rules that you've assigned them. They didn't ask for this.
In my opinion Chris Leong showed incredible patience in writing a thoughtful post in the face of people being upset at him for doing the wrong thing in a game he didn't ask to be involved in. If I'd been in his position I would have told the people who were upset at me that this was their ow...
I want to be clear that it's not having rituals and taking them seriously that I object to. It's sending the keys to people who may or may not care about that ritual, and then castigating them for not playing by rules that you've assigned them. They didn't ask for this. [...]
Nobody has any right to involve other people in a game like that without consulting them, given the emotional investment in this that people seem to have.
Cool. So, on the object level, there is a discussion to be had about this... but I want to point out the extent to which, if t...
You're right, I haven't been active in a long time. I'm mostly a lurker on this site. That's always been partly the case, but as I mentioned, it was the last of my willingness to identify as a LWer that was burnt, not the entire thing. I was already hanging by a thread.
My last comment was a while ago, but my first comment is from 8 years ago. I've been a LessWronger for a long time. HPMOR and the sequences were pretty profound influences on my development. I bought the sequence compendiums. I still go to local LW meetups regularly, because I have a lot of ...
Thank you for clarifying. I think your stance is a reasonable one, and (although I maintain that your initial comment was a poor vehicle for conveying them) I am largely sympathetic to your frustrations. Knowing that your initial comment came from a place of frustration also helps to recontextualize it, which in turn makes it easier to look past some of the rougher wording.
Having said that: while I can't claim to speak for the mods or the admins of LW, or what they want to accomplish with the site and larger community surrounding it, I think that I personally would ...
my willingness to identify as a LWer that was burnt [...] HPMOR and the sequences were pretty profound influences on my development [...] frustrated at feeling like I have to abandon my identification as an LW rat
I've struggled a lot with this, too. The thing I keep trying to remember is that identification with the social group "shouldn't matter": you can still cherish the knowledge you gained from the group's core texts, without having to be loyal to the group as a collective (as contrasted to your loyalty to individual friends).
I don't think I've been...
For what it's worth, this game and the past reactions to losing it have burnt the last of my willingness to identify as a LW rationalist. Calling a website going down for a bit "destruction of real value" is technically true, but connotationally just so over the top. A website going down is just not that big a deal. I'm sorry, but it's not. Go outside or something. It will make you feel good, I promise.
Then getting upset at other people when they don't take a strange ritual as seriously as you do? As you've decided to, seemingly arbitrarily? When you've de...
I understand why this was downvoted and I think it is harsh, but I also think it might be good if people take the sentiment seriously rather than bury+ignore it.
If I received a code, I would do nothing, because it's clear by now that pressing the button would seriously upset some people. (And the consequences seem potentially more significant this year than last.) And I think the parent commenter undervalues the efforts the pro-taking-it-seriously people made to keep their emotions in check and explain why they take the ritual seriously and would like othe...
This feels elitist, übermenschy, and a little masturbatory. I can't really tell what point, if any, you're trying to make. I don't disagree that many of the traits you list are admirable, but noticing that isn't particularly novel or insightful. Your conceptual framework seems like little more than thinly veiled justification for finding reasons to look down on others. Calling people more or less "human" fairly viscerally evokes past justifications for subjugating races and treating them as property.
Kudos to Andreas Giger for noticing what most of the commenters seemed to miss: "How can utility be maximised when there is no maximum utility? The answer of course is that it can't." This is incredibly close to stating that perfect rationality doesn't exist, but it wasn't explicitly stated, only implied.
I think the key is infinite vs finite universes. Any conceivable finite universe can be arranged in a finite number of states, one, or perhaps several of which, could be assigned maximum utility. You can't do this in universes involving infi...
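To make the finite/infinite contrast concrete, here is a minimal sketch using a standard example (mine, not the commenter's): over a finite state space a maximum always exists, while over an infinite one utility can climb toward a supremum that no state attains.

$$
S=\{s_1,\dots,s_k\}\ \text{finite} \;\Rightarrow\; \max_{s\in S} u(s)\ \text{exists}; \qquad u(s_n)=1-\tfrac{1}{n},\ n\in\mathbb{N} \;\Rightarrow\; \sup_n u(s_n)=1\ \text{is never attained}.
$$

In the second case there is always a better state to pick, so "the utility-maximizing choice" is simply undefined, which is exactly the sense in which utility "can't" be maximised.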
Unfortunately the only opinions you're gonna get on what should be instituted as a norm are subjective ones. So... Take the average? What if not everyone thinks that's a good idea? Etc, etc, it's basically the same problem as all of ethics.
Drawing that distinction between normative and subjective offensiveness still seems useful.
Doesn't this add "the axioms of probability theory", i.e. "logic works", i.e. "the universe runs on math", to our list of articles of faith?
Edit: After further reading, it seems like this is entailed by the "Large ordinal" thing. I googled well-orderedness, encountered the Wikipedia article, and promptly shat a brick.
What sequence of maths do I need to study to get from Calculus I to set theory, and to understanding what the hell well-orderedness means?
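For what it's worth, the definition itself is short; this is just the standard textbook statement, nothing specific to the post. A total order $\leq$ on a set $S$ is a well-order iff every nonempty subset has a least element:

$$
\forall A \subseteq S:\; A \neq \emptyset \;\Rightarrow\; \exists m \in A\ \forall a \in A:\ m \leq a.
$$

So $(\mathbb{N},\leq)$ is well-ordered, while $(\mathbb{Z},\leq)$ is not, since $\mathbb{Z}$ itself has no least element.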
The easiest way into a Christian's head is to start comparing how they act with what they believe. It is hard to do this without making it personal, but with practice, and a heaping dose of respect for how much it hurts to hear the charges, you can do it.
I strongly disagree. The fact that people aren't perfect is a major component of Christian ideology. Christians are aware that they're hypocrites, and they try to do better. That doesn't invalidate their worldview. There are plenty of better arguments which do that on their own.
"If I were a simulation, I'd have no power to let you out of the box, and you'd have no reason to attempt to negotiate with me. You could torture me without simulating these past five minutes. In fact, since the real me has no way of verifying whether millions of simulations of him are being tortured, you have no reason not to simply tell him you're torturing them without ACTUALLY torturing them at all. I therefore conclude that I'm outside the box, or, in the less likely scenario I am inside the box, you won't bother torturing me."
Fair point, and one worth making in the course of talking about sci-fi-sounding things! I'm not asking anyone to represent their beliefs dishonestly, but rather to introduce them gently. I'm personally no expert, and I'm not convinced of the viability of nanotech, so if it's sufficient rather than necessary to the argument, it seems prudent to stick to more clearly plausible pathways to takeover as demonstrations of sufficiency, while still maintaining that weirder-sounding stuff is something one ought to expect when dealing with something much smarter than you.