Eugine_Nier comments on Poll - Is endless September a threat to LW and what should be done? - Less Wrong
Here are two things we desperately need:
An authoritative textbook-style index/survey-article on everything in LW. We have been generating lots of really cool intellectual work, but without a prominently placed, complete, hierarchical, and well-updated overview of "here's the state of what we know", we aren't accumulating knowledge. This is a big project and I don't know how I could make it happen, besides pushing the idea, which is famously ineffective.
LW needs a king. This idea is bound to be unpopular, but how awesome would it be to have someone whose paid job it was to make LW into an awesome and effective community. I imagine things like getting proper studies done of how site layout/design should work to make LW easy to use and sticky to the right kind of people (currently it sucks), contacting, coordinating, and encouraging meetup organizers individually (no one does this right now and lw-organizers has little activity), thinking seriously and strategically about problems like the OP's, and leading big projects like idea #1. Obviously this person would have CEO-level authority.
One problem is that our really high-power agent types who are super dedicated to the community (i.e. lukeprog) get siphoned off into SI. We need another lukeprog or someone to be king of LW and deal with this kind of stuff.
Without a person in this king role, the community has to waste time and effort making community-meta threads like these. Communities and democratic methods suck at the kind of strategic, centralized, coherent decision making that we really need. Managing these problems really isn't the community's comparative advantage. If they were dealt with, it would be a lot easier to focus on intellectual productivity.
The standard term is Benevolent Dictator for Life, and we already have one. What you're asking for strikes me as more of a governor-general.
Our benevolent dictator isn't doing much dictatoring. If I understand correctly that it's EY, he has a lot more hats to wear, and doesn't have the time to do LW-managing full time.
As with God: if we observe a lack of leadership, it is irrelevant whether we nominally have a god-emperor or not. The solution is always the same: build a new one that will actually do the job we want done.
Okay, that? That was one of the most awesome predicates of which I've ever been a subject.
You're defending yourself against accusations of being a phyg leader over there, and over here you're enjoying a comment that implies that either the commenter or the people the commenter is addressing perceive you as a god? And not only that, but this might even imply that you endorse the solution that is "always the same": building a new god-emperor.
Have you forgotten Luke's efforts to fight the perceptions of SI's arrogance?
That you appear to be encouraging a comment that uses the word god to refer to you in any way, directly or indirectly, is pretty disheartening.
I tend to see a fairly sharp distinction between negative aspects of phyg-leadership and the parts that seem like harmless fun, like having my own volcano island with a huge medieval castle, and sitting on a throne wearing a cape saying in dark tones, "IT IS NOT FOR YOU TO QUESTION MY FUN, MORTAL." Ceteris paribus, I'd prefer that working environment if offered.
And how are people supposed to make the distinction between your fun and signs of pathological narcissism? You and I both know the world is full of irrationality, and that this place is public. You've endured the ravages of the hatchet job and Rationalwiki's annoying behaviors. This comment could easily be interpreted by them as evidence that you really do fancy yourself a false prophet.
What's more, I (someone who is not a heartless and self-interested reporter, who thinks you're brilliant, who appreciates you, who is not some completely confused person with no serious interest in rationality) am now thinking:
How do I make the distinction between a guy who has an "arrogance problem" and has fun encouraging comments that imply that people think of him as a god vs. a guy with a serious issue?
Try working in system administration for a while. Some people will think you are a god; some people will think you are a naughty child who wants to be seen as a god; and some people will think you are a sweeper. Mostly you will feel like a sweeper ... except occasionally when you save the world from sin, death, and hell.
I feel the same way as a web developer. One day I'm being told I'm a genius for suggesting that a technical problem might be solved by changing a port number. The next day, I'm writing a script to compensate for the incompetent failures of a certain vendor.
When people ask me for help, they assume I can fix anything. When they give me a project, they assume they know better how to do it.
The only way to decide whether someone has a serious issue is to read a bunch from them and then see which patterns you find.
I don't see this as a particular problem in this instance. If anything, the responses indicate that he isn't taking himself too seriously. The more pathologically narcissistic types tend to be more somber about their power and image.
No, if there were a problem here, it would be that the joke was in poor taste. In particular, if there were people who had been given the impression that Eliezer's power or narcissism really was corrupting his thinking. If he had begun to use his power arbitrarily on his own whim, or if his arrogance had left him incapable of receiving feedback or of perceiving the consequences his actions have on others or even himself. Basically, jokes about how arrogant and narcissistic one is only work when people don't perceive you as actually having problems in that regard. If you really do have arrogance problems, then joking that you have them without acknowledging the problem makes you look grossly out of touch and socially awkward.
For my part, however, I don't have any direct problem with Eliezer appreciating this kind of reasoning. It does strike me as a tad naive of him and I do agree that it is the kind of thing that makes Luke's job harder. Just... as far as PR missteps made by Eliezer this seems so utterly trivial as to be barely worth mentioning.
The way I make such distinctions is to basically ignore 'superficial arrogance'. I look at the real symptoms. The ones that matter and have potential direct consequences. I look at their ability to comprehend the words of others---particularly those others without the power to 'force' them to update. I look at how much care they take in exercising whatever power they do have. I look at how confident they are in their beliefs and compare that to how often those beliefs are correct.
I have to agree with Eliezer here: this is a terrible standard for evaluating phygishness. Simply put, enjoying that kind of comment does not correlate at all with the harmful features of phygish organizations, social clubs, etc. There are plenty of Internet projects that refer to their most prominent leaders with such titles as God-King, "benevolent dictator", and the like; it has no implication at all.
You have more faith than I do that it will not be intentionally or unintentionally misinterpreted.
Also, I am interpreting that comment within the context of other things: the "arrogance problem" thread, the b - - - - - - k, Eliezer's dating profile, etc.
What's not clear is whether you or I are more realistic about how people are likely to interpret it: not only in a superficial context (like some hatchet-jobbing reporter who knows only some LW gossip), but with no context at all, or within the context of other things with a similar theme.
srsly, brah. I think you misunderstood me.
I was drawing an analogy to Epicurus on this issue because the structure of the situation is the same, not because anyone perceives (our glorious leader) EY as a god.
I bet he does endorse it. His life's work is all about building a new god to replace the negligent or nonexistent one that let the world go to shit. I got the idea from him.
My response was more about what interpretations are possible than what interpretation I took.
Okay. There's a peculiar habit in this place where people say things that can easily be interpreted as something that will draw persecution. Then I point it out, and nobody cares.
Okay. It probably seems kind of stupid that I failed to realize that. Is there a post that I should read?
This is concerning. My intuitions suggest that it's not a big deal. I infer that you think it's a big deal. Someone is miscalibrated.
Do you have a history with persecution that makes you more attuned to it? I am blissfully ignorant.
I don't know if there's an explicit post about it. I picked it up from everything on Friendly AI, the terrible uncaringness of the universe, etc. It is most likely not explicitly represented as replacing a negligent god anywhere outside my own musings, unless I've forgotten.
I really like this nice, clear, direct observation.
Yes, but more relevantly, humanity has a history with persecution - lots of intelligent people and people who want to change the world from Socrates to Gandhi have been persecuted.
Here Eliezer is in a world full of Christians who believe that dreaded Satan is going to reincarnate soon, claim to be a God, promise to solve all the problems, and take over earth. Religious people have been known to become violent for religious reasons. Surely building an incarnation of Satan would, if that were their interpretation of it, qualify as more or less the ultimate reason to launch a religious war. These Christians outnumber Eliezer by a lot. And Eliezer, according to you, is talking about building WHAT?
My take on the "build a God-like AI" idea is that it is pretty crazy. I might like this idea less than the Christians probably do, seeing as how I don't have any sense that Jesus is going to come back and reconstruct us after it does its optimization...
I went out looking for myself and just watched the bloggingheads video (6:42) where Robert Wright says to Eliezer, "It sounds like what you're saying is we need to build a God," and Eliezer replies, "Why don't we call it a very powerful optimizing agent?" and grins like he's just fooled someone. Robert Wright thinks for a moment and says, "Why don't we call that a euphemism for God?", which destroys Eliezer's grin.
If Eliezer's intentions are to build a God, then he's far less risk-averse than the type of person who would simply try to avoid being burned at the stake. In that case the problem isn't that he makes himself look bad...
Like he's just fooled someone? I see him talking like he's patiently humoring an ignorant child who is struggling to distinguish between "Any person who gives presents at Christmas time" and "The literal freaking Santa Claus, complete with magical flying reindeer". He isn't acting like he has 'fooled' anyone or acting in any way 'sneaky'.
While I wouldn't have been grinning previously, whatever my expression had been, it would change in response to that question in the direction of irritation and impatience. The answer to "Why don't we call that a euphemism for God?" is "Because that would be wrong and totally muddled thinking". When your mission is to create an actual very powerful optimization agent, and that---not gods---is actually what you spend your time researching, then a very powerful optimization agent isn't a 'euphemism' for anything. It's the actual core goal. Maybe, at a stretch, "God" can be used as a euphemism for "very powerful optimizing agent", but never the reverse.
I'm not commenting here on the question of whether there is a legitimate PR concern regarding people pattern-matching to religious themes and having dire, hysterical, and murderous reactions. Let's even assume that kind of PR concern is legitimate for the purpose of this comment. Even then, there is a distinct difference between "failure to successfully fool people" and "failure to educate fools". It is the latter task that Eliezer has failed at here; the former charge would be invalid. (I felt the paragraph I quoted was unfair to Eliezer in blurring that distinction.)
Thank you. I will try to do more of that.
Interesting. Religious people seem a lot less scary to me than this. My impression is that the teeth have been taken out of traditional christianity. There are a few christian terrorists left in north america, but they seem like holdouts raging bitterly against the death of their religion. They are still in the majority in some places, though, and can persecute people there.
I don't think that the remains of theistic christianity could reach an effective military/propaganda arm all the way to Berkeley even if they did somehow misinterpret FAI as an assault on God.
Nontheistic christianity, which is the ruling religion right now, could flex enough military might to shut down SI, but I can't think of any way to make them care.
I live in Vancouver, where as far as I can tell, most people are either non-religious, or very tolerant. This may affect my perceptions.
This is a good reaction. It is good to take seriously the threat that an AI could pose. However, the point of Friendly AI is to prevent all that and make sure that if it happens, it is something we would want.
I'm not sure I've heard any detailed analysis of the Friendly AI project specifically in those terms -- at least not any that I felt was worth my time to read -- but it's a common trope of commentary on Singularitarianism in general.
No less mainstream a work than Deus Ex, for example, quotes Voltaire's famous "if God did not exist, it would be necessary to create him" in one of its endings -- which revolves around granting a friendly (but probably not Friendly) AI control over the world's computer networks.
ROT-13:
Vagrerfgvatyl, va gur raqvat Abeantrfg ersref gb, Uryvbf (na NV) pubbfrf gb hfr W.P. Qragba (gur cebgntbavfg jub fgvyy unf zbfgyl-uhzna cersreraprf) nf vachg sbe n PRI-yvxr cebprff orsber sbbzvat naq znxvat vgfrys (gur zretrq NV naq anab-nhtzragrq uhzna) cuvybfbcure-xvat bs gur jbeyq va beqre gb orggre shysvyy vgf bevtvany checbfr.
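(For anyone unfamiliar with the convention: spoilers here are ROT-13 encoded, a substitution cipher that rotates each ASCII letter 13 places and is its own inverse. A minimal Python sketch for decoding such text, using the standard library's `rot_13` codec:)

```python
import codecs

def rot13(text: str) -> str:
    """Apply ROT-13: rotate each ASCII letter 13 places.
    Since 13 + 13 = 26, the same function both encodes and decodes."""
    return codecs.encode(text, "rot13")

# Decoding the first word of the spoiler above:
print(rot13("Vagrerfgvatyl"))  # -> "Interestingly"
```

Non-letter characters (punctuation, digits, whitespace) pass through unchanged, which is why hyphenated names and periods in the spoiler stay readable.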
Why would you believe that something is always the solution when you already have evidence that it doesn't always work?
Let's go to the object level: in the case of God, the fact that god is doing nothing is not evidence that Friendly AI won't work.
In the case of EY the supposed benevolent dictator, the fact that he is not doing any benevolent dictatoring is explained by the fact that he has many other things that are more important. That prevents us from learning anything about the general effectiveness of benevolent dictators, and we have to rely on the prior belief that it works quite well.
There are alternatives to monarchy, and an example of a disappointing monarch should suggest that alternatives might be worth considering, or at the very least that appointing a monarch isn't invariably the answer. That was my only point.