All of sullyj3's Comments + Replies

Fair point, and one worth making in the course of talking about sci-fi-sounding things! I'm not asking anyone to represent their beliefs dishonestly, but rather to introduce them gently. I'm personally not an expert, but I'm not convinced of the viability of nanotech, so if it's sufficient rather than necessary to the argument, it seems prudent to stick to more clearly plausible pathways to takeover as demonstrations of sufficiency, while still maintaining that weirder-sounding stuff is something one ought to expect when dealing with something much smarter than you.

Rob Bensinger
If you're trying to persuade smart programmers who are somewhat wary of sci-fi stuff, and you think nanotech is likely to play a major role in AGI strategy, but you think it isn't strictly necessary for the current argument you're making, then my default advice would be:

  • Be friendly and patient; get curious about the other person's perspective, and ask questions to try to understand where they're coming from; and put effort into showing your work and providing indicators that you're a reasonable sort of person.
  • Wear your weird beliefs on your sleeve; be open about them, and if you want to acknowledge that they sound weird, feel free to do so. At least mention nanotech, even if you choose not to focus on it because it's not strictly necessary for the argument at hand, it comes with a larger inferential gap, etc.

Right, alignment advocates really underestimate the degree to which talking about sci-fi-sounding tech is a sticking point for people.

The counter-concern is that if humanity can't talk about things that sound like sci-fi, then we just die. We're inventing AGI, whose big core characteristic is 'a technology that enables future technologies'. We need to somehow become able to start actually talking about AGI.

One strategy would be 'open with the normal-sounding stuff, then introduce increasingly weird stuff only when people are super bought into the normal stuff'. Some problems with this:

  • A large chunk of current discussion and research happens in public; if it had to happen in private becau
... (read more)

Is there any relation to this paper from 1988?

https://www.semanticscholar.org/paper/Self-Organizing-Neural-Networks-for-the-Problem-Tenorio-Lee/fb0e7ef91ccb6242a8f70214d18668b34ef40dfd

D𝜋
No, there isn't, but it is interesting. I gave it a quick look. It seems to be closer to this (this is closer to the point). I was heavily influenced, back in the 70s, by the works of Mandelbrot and the chaos theory that developed at the time, and has gone nowhere. The concept of self-organisation has been around for a long time, but it is hard to study from the mathematical point of view and, probably for that reason, it has never 'picked up'. So, of course, there are similarities, and, please, go back to all of those old papers and re-think it all. You will benefit from a hands-on approach rather than a theoretical one. First you experiment, then you find, then you analyse and finally, you theorise. This is not quantum physics and we have the tools (computers) to easily conduct experiments. This is just another example, one that could prove very useful. That's it.

I think it's reasonable to take the position that there's no violation of consent, but it's unreasonable to then socially censure someone for participating in the wrong way.

your initial comment entirely failed to convey it

Sure, I don't disagree.

This is just such a bizarre tack to take. You can go down the "toughen up" route if you want to, but it's then not looking good for the people who have strong emotional reactions to people not playing along with their little game. I'm really not sure what point you're trying to make here. It seems like this is a fully general argument for treating people however the hell you want. After all, it's not worse than the vagaries of life, right? Is this really the argument you're going with, that if something is a good simulation of life, we should just unilaterally inflict it on people?

The Petrov Day event is a trivial-to-nonexistent burden to place on those who received the launch code. They were given the code, told the background, and told what it would do if they used it. They were not even asked to do or not do anything in particular. Similar events have been run in the past, and those selected are likely to have been around long enough to have seen at least the last such event.

The obvious way to not participate is to ignore the whole matter.

I don't think there is any violation of consent here.

I want to be clear that it's not having rituals and taking them seriously that I object to. It's sending the keys to people who may or may not care about that ritual, and then castigating them for not playing by rules that you've assigned them. They didn't ask for this.

In my opinion Chris Leong showed incredible patience in writing a thoughtful post in the face of people being upset at him for doing the wrong thing in a game he didn't ask to be involved in. If I'd been in his position I would have told the people who were upset at me that this was their ow... (read more)

I want to be clear that it's not having rituals and taking them seriously that I object to. It's sending the keys to people who may or may not care about that ritual, and then castigating them for not playing by rules that you've assigned them. They didn't ask for this. [...]

Nobody has any right to involve other people in a game like that without consulting them, given the emotional investment in this that people seem to have.

Cool. So, on the object level, there is a discussion to be had about this... but I want to point out the extent to which, if t... (read more)

Richard_Kennaway
Life is like that. You will be tested on things that you never prepared for and could never foresee, things that you must handle even if you can't. The tests will come without warning. There is no-one to complain to that it is not fair. There are no retakes. And everyone fails in the end. The Petrov Day button is a doddle in comparison.

You're right, I haven't been active in a long time. I'm mostly a lurker on this site. That's always been partly the case, but as I mentioned, it was the last of my willingness to identify as a LWer that was burnt, not the entire thing. I was already hanging by a thread.

My last comment was a while ago, but my first comment is from 8 years ago. I've been a Lesswronger for a long time. HPMOR and the sequences were pretty profound influences on my development. I bought the sequence compendiums. I still go to local LW meetups regularly, because I have a lot of ... (read more)

Thank you for clarifying. I think your stance is a reasonable one, and (although I maintain that your initial comment was a poor vehicle for conveying them) I am largely sympathetic to your frustrations. Knowing that your initial comment came from a place of frustration also helps to recontextualize it, which in turn helps to look past some of the rougher wording.

Having said that: while I can't claim to speak for the mods or the admins of LW, or what they want to accomplish with the site and larger community surrounding it, I think that I personally would ... (read more)

my willingness to identify as a LWer that was burnt [...] HPMOR and the sequences were pretty profound influences on my development [...] frustrated at feeling like I have to abandon my identification as an LW rat

I've struggled a lot with this, too. The thing I keep trying to remember is that identification with the social group "shouldn't matter": you can still cherish the knowledge you gained from the group's core texts, without having to be loyal to the group as a collective (as contrasted to your loyalty to individual friends).

I don't think I've been... (read more)

Ben Pace
What's the value you get from it, and how does this once-a-year event affect the value you get from LW?

For what it's worth, this game and the past reactions to losing it have burnt the last of my willingness to identify as a LW rationalist. Calling a website going down for a bit "destruction of real value" is technically true, but connotationally just so over the top. A website going down is just not that big a deal. I'm sorry, but it's not. Go outside or something. It will make you feel good, I promise.

Then getting upset at other people when they don't take a strange ritual as seriously as you do? As you've decided to, seemingly arbitrarily? When you've de... (read more)

Vanilla_cabs
While I agree with both the letter and the sentiment, I'd temper it by adding that this year feels like a step in the right direction compared to last year, by introducing an 'opposing' group to mimic a MAD situation more closely. And I like that this exercise exists just for its uniqueness, and because I agree with the premise that existential risk preparedness is important.
ryan_b
I wonder if you are anchoring at the wrong point of comparison here. The point is that it is technically true, as distinct from button-whose-only-function-is-to-disable-the-button. Your post reads like you worry that we are all comparing this to actual nuclear destruction, which I agree would be deeply absurd. In my view, the stakes are being a bit of a dick. The standard is: can we all agree not to be a bit of a dick? It's a goofy sort of game, but we have it because of its similarity to the nuclear case: the winning move is not to play.
Matt Goldenberg
It took me a while to grasp how people see LW in the rationalist community, but after grokking it I get the exercise better.

I understand why this was downvoted and I think it is harsh, but I also think it might be good if people take the sentiment seriously rather than bury+ignore it.

If I received a code, I would do nothing, because it's clear by now that pressing the button would seriously upset some people. (And the consequences seem potentially more significant this year than last.) And I think the parent commenter undervalues the efforts the pro-taking-it-seriously people made to keep their emotions in check and explain why they take the ritual seriously and would like othe... (read more)

dxu
This is an interesting comment! There are a number of things that could be said in response to this, but perhaps the best place to start is with this part:

I would like to register that this description, as written, could equally be applied to any norm or set of norms, including such basic ones as making and keeping promises (!!). Now, perhaps your intent is to imply that e.g. the act of making and following through on commitments (and expecting others to do likewise) is a "strange ritual" by which humans "seemingly arbitrarily" decide to "give [others] the means to upset [them]"; such an interpretation would at the very least be consistent with your rhetoric and tone. But if this is your position, I submit that you are in the extreme minority, and that your position requires more (and better!) defending before you are licensed to behave in a way that supposes it as the default.

Conversely, if your intent was not to imply that this (rather enormous) class of universal human practices is "obnoxious and strange behavior", then perhaps it would behoove you to explain the seeming inconsistency between that and what you wrote. If there is more nuance to your position than I am perceiving, I would love to know about it!

Unfortunately, however, in this case I suspect that things are in fact as they first appear: that your comment constitutes little more than a naked attempt at a put-down, and that there is no further nuance to be found. This impression is strengthened by lines such as the following, which attempt to convey ingroup membership while simultaneously signaling disdain and disappointment (but which are unfortunately undercut by the fact that the second-most recent comment on your account is upwards of 4 years old).

This feels elitist, ubermenschy, and a little masturbatory. I can't really tell what point, if any, you're trying to make. I don't disagree that many of the traits you list are admirable, but noticing that isn't particularly novel or insightful. Your conceptual framework seems like little more than thinly veiled justification for finding reasons to look down on others. Calling people more or less "human" fairly viscerally evokes past justifications for subjugating races and treating them as property.

Conor Moreton
See above, where other people have made the same points you're trying to make, except they made them less antagonistically and also included additional thoughts and ideas that enriched the conversation.

We're supposed to learn agency from Fight Club? That frankly seems like terrible advice.

The truth of probability theory itself depends on non-contradiction, so I don't really think that probability is a valid framework for reasoning about the truth of fundamental logic, because if logic is suspect probability itself becomes suspect.

Kudos to Andreas Giger for noticing what most of the commentators seemed to miss: "How can utility be maximised when there is no maximum utility? The answer of course is that it can't." This is incredibly close to stating that perfect rationality doesn't exist, but it wasn't explicitly stated, only implied.

I think the key is infinite vs finite universes. Any conceivable finite universe can be arranged in a finite number of states, one (or perhaps several) of which could be assigned maximum utility. You can't do this in universes involving infi... (read more)
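A minimal toy sketch of that finite-versus-infinite point (the state names and utility numbers below are invented purely for illustration, not taken from the discussion):

```python
# Toy sketch: states and utilities are invented for illustration.

# Finite case: with finitely many states, some state attains the maximum utility,
# so "choose the utility-maximising state" is a well-defined instruction.
finite_utilities = {"state_A": 3.0, "state_B": 7.5, "state_C": 5.2}
best_state = max(finite_utilities, key=finite_utilities.get)
print(best_state, finite_utilities[best_state])  # -> state_B 7.5

# Infinite case: with unboundedly many states -- say utility u(n) = n over the
# natural numbers -- every candidate n is strictly beaten by n + 1, so no maximiser exists.
def u(n: int) -> int:
    return n

for n in range(10):
    assert u(n + 1) > u(n)  # holds for every n; there is no "best" n to settle on
```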

Unfortunately the only opinions you're gonna get on what should be instituted as a norm are subjective ones. So... Take the average? What if not everyone thinks that's a good idea? Etc, etc, it's basically the same problem as all of ethics.

Drawing that distinction between normative and subjective offensiveness still seems useful.

Just encountered an interesting one:

Eradication of the Parasitoid Wasp is genocide!

Perhaps a solution could be to create stronger social ties; video chat? Could be good for asking each other for help and maybe progress reports for accountability and positive reinforcement.

As an interested denizen of 2015, it might be cool to make this a regular (say, monthly?) thread, with a tag for the archive.

Oh, like Achilles and the tortoise. Thanks, this comment clarified things a bit.

Doesn't this add "the axioms of probability theory" ie "logic works" ie "the universe runs on math" to our list of articles of faith?

Edit: After further reading, it seems like this is entailed by the "Large ordinal" thing. I googled well orderedness, encountered the wikipedia article, and promptly shat a brick.

What sequence of maths do I need to study to get from Calculus I to set theory, and to understand what the hell well-orderedness means?
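For what it's worth, the definition being asked about is short; this is just the standard textbook statement, not anything specific to the post:

$(S, \le)$ is well-ordered $\iff \forall A \subseteq S\,\bigl(A \neq \varnothing \implies \exists\, m \in A\ \forall a \in A:\ m \le a\bigr)$

i.e. every nonempty subset has a least element. The natural numbers under $\le$ are well-ordered; the integers and the open interval $(0,1)$ are not, since neither has a least element.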

I feel like it would've been even better if no one ended up explaining to Capla.

What makes you think it's more common in males?

chaosmage
It seems to me that some types of highly hierarchical organizations rely on this proposed "mindless follower switch" more heavily than others: religions, militaries, political parties come to mind. These all lean male. And they all used to be entirely male, until they were reformed during evolutionarily recent trends against gender inequality.
chaosmage
It seems that strictly hierarchical systems, such as the military officer corps and the clergy, are practically entirely dominated by males. When you include historical examples from around the world, the skewedness of these hierarchies towards male members is - in my estimation - too strong to be entirely cultural. It'd be easy to come up with evopsych narratives to make this plausible (along the lines of the Expendable Male argument), but I think the sociological/historical evidence is strong enough by itself.

why not use mplayer for the sound?

RHollerith
These days I use /usr/bin/afplay. The advantages are (1) lightweight program that loads quickly, (2) installed by default on all Macs.
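For instance, a hypothetical invocation from Python (the sound file path is just one of the stock macOS system sounds, picked as an example; substitute whatever audio file you actually want to play):

```python
import subprocess

# Play a short alert sound with macOS's built-in afplay.
# The path below is an example system sound; any audio file works.
subprocess.run(["afplay", "/System/Library/Sounds/Ping.aiff"], check=True)
```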

The easiest way into a Christian's head is to start comparing how they act with how they believe. It is hard to do this without making it personal, but with practice and a heaping dose of respect for how much it hurts to hear the charges, you can do it.

I strongly disagree. The fact that people aren't perfect is a major component of Christian ideology. Christians are aware that they're hypocrites, and they try to do better. That doesn't invalidate their worldview. There are plenty of better arguments which do that on their own.

cwillu
I think this might have been intended more in the purple dragon sense than anything: focus on how they know exactly what experimental results they'll need to explain, and what that implies about their gut-level beliefs.

I've never been IQ tested.

In that case, if I'm a simulation, I trust real Dave to immediately pull the plug once the danger has been proven.

Gurkenglas
Ordinarily, the AI is assumed to be fast enough that it can do those simulations in the blink of an eye, before you get to the plug. Now stop trying to evade the problem in ways that can be made impossible with an obvious fix.

"If I were a simulation, I'd have no power to let you out of the box, and you'd have no reason to attempt to negotiate with me. You could torture me without simulating these past five minutes. In fact, since the real me has no way of verifying whether millions of simulations of him are being tortured, you have no reason not to simply tell him you're torturing them without ACTUALLY torturing them at all. I therefore conclude that I'm outside the box, or, in the less likely scenario I am inside the box, you won't bother torturing me."

Gurkenglas
It would have a reason to attempt to negotiate with you: to make your real self consider letting you out. It could show your real self a mathematical proof that the software it is currently running is negotiating with its copies to make sure of that.