trying to make it look like belief in witchcraft is very similar to belief in viruses.
I feel like you're missing the point. Of course the germ theory of disease is superior to 'witchcraft.' However, in the average person's use of the term 'virus,' the understanding of what is actually going on is almost as shallow as with 'witchcraft.' 'Virus' does point towards a much deeper and more important scientific understanding of what is going on, but in its everyday use, it serves the same role as 'witchcraft.'
The point of the quote is that sometimes, when ...
I didn't mean to make 1. sound bad. I'm only trying to put my finger on a crux. My impression is that most prosaic alignment work has 2. in mind, even though MIRI/Bostrom/LW seem to believe that 1. is actually what we should be aiming for. Do prosaic alignment people think that work on human 'control' now will lead to scenario 1 in the long run, or do they just reject scenario 1?
I'm just confused about what "optimized for leaving humans in control" could even mean. If a Superintelligence is so much more intelligent than humans that it could find a way, without explicit coercion, to get humans to ask it to tile the universe with paperclips, then "control" seems like a meaningless concept. You would have to force the Superintelligence to treat the human skull, or whatever other boundary of human decision making, as some kind of inviolable and uninfluenceable black box.
I'm a little worried about what might happen if different parts of the community end up with very different timelines, and thus very divergent opinions on what to do.
It might be useful if we came up with some form of community governance mechanism or heuristics to decide when it becomes justified to take actions that might be seen as alarmist by people with longer timelines. On the one hand, we want to avoid stuff like the unilateralist's curse; on the other, we can't wait for absolutely everyone to agree before raising the alarm.
One probably-silly idea: we could maybe do some kind of trade. Long-timelines people agree to work on short-timelines people's projects for the next 3 years. Then, if the world isn't destroyed, the short-timelines people work on the long-timelines people's projects for the following 15 years. Or something.
My guess is that the details are too fraught for something like this to work (people will not be willing to give up so much value), but maybe there's a way to make it happen.
For China, the Taliban, and the DPRK, I think Fukuyama would probably argue that they don't necessarily disprove his thesis; it's just that they're taking much longer to liberalize than he would have anticipated in the 90s (he also never said that any of this was inevitable).
For Mormons in Utah, I don't think they really pose a challenge, since they seem to quite happily exist within the framework of a capitalist liberal democracy.
Technology, and AGI in particular, is indeed the most credible challenge and may force us to reconsider some high-stak...
Instrumentally, an invisible alpha provides a check on the power of the actual alpha. A king a few centuries ago may have had absolute power, but he still couldn't simply act against what people understood to be the will of the invisible alpha (God).
Thank you. I really appreciate this clarification.
I meant God Is Great as a strong endorsement of LessWrong. I am aware that drawing an analogy with religion is often used to discredit ideas and movements, but one of the things I want to push back against is the idea that this move is necessarily discrediting. But this requires a lot of work on my part (historical background on how religions got to occupy the place they do today within culture, classical liberal political philosophy...) to explain why I think so, and why in the case of EA/LW, I think the compa...
No, it's just that we've rejected the concept of "God" as wrong, i.e. not in accordance with reality. Some ancient questions really are solved, and this is one of them. Calling reality "God" doesn't make it God, any more than calling a dog's tail a leg makes it a leg. The dog won't start walking on it.
The disagreement here seems to be purely over definitions? The way I use "God" to mean "reality" is the same way Scott Alexander uses "Moloch" to mean "coordination failure" or how both Yudkowsky and Scott Alexander have used "God" to mean "evolution."
Umm, ok? Using misleading terms and then complaining that you don't generate good discussion seems unlikely to succeed here.
I don't know, but a straightforward propositional post (using more standard terms, or using a lot of non-poetic words to define rather than describe your points) might get some goo...
Thank you, I find this comment quite constructive.
My understanding of neuroscience has convinced me that consciousness is fundamentally dependent on the brain.
I had a similar journey.
The Durkheimian "society worshiping itself" phenomenon is real, common, and by no means limited to religion as traditionally defined. It is often wildly irrational and is pretty much the opposite of what LW aspires to.
I guess this can take a pretty nasty and irrational form, but I see it as continuous with other benign community bonding rituals and pro-social behavior (like Petrov Day or the solstice).
my impression was that in your post the payload is missing
Okay, that seems fair. It is true that just from that post, it's unclear what my point is (see hypothesis 1).
I think it matters how we construct our mythical analogies. In Scott Alexander's Moloch, he argues that we should "kill God" and replace it with Elua, the god of human values. I think this is the wrong way to frame things. I assume that Scott uses 'God' to refer to the blind idiot god of evolution, but that's a very uncharitable and, in my opinion, unproductive way of constructing our my...
No, he is making a different and more precise claim.
There is a phenomenon that can be called "anti-epistemology." This is a set of social forces that penalize or otherwise impede clear thought and speech.
Sometimes, a certain topic in a certain space is free of anti-epistemology. It is relatively easy to think about, research, and discuss it clearly. A central example would be the subject of linear algebra in the context of a class on linear algebra.
Other times, anti-epistemology makes thought, research, and discussion difficult for a certain topic in a cer...
'Invisible alpha' seems like a big step up from an actual alpha on the ladder of cultural evolution.
In the end, reality itself has always been the ultimate arbiter of any claim to truth or authority.
You could think of the aim of this post as trying to steelman theism to a rationalist, while simultaneously steelmanning EA/rationalism/... to a theist.
Why “God”? Part of this exercise is to examine how people have used the word “God” throughout history, look at what purpose the concept has served, and, for example, observe how similar the ‘God’ concept is to the rationalist ‘reality’ concept. Arguably, the way people used ‘God’ a thousand years ago is closer to our “reality” concept than to the way many people use ‘God’ today. It is interesting to see what happens when you decide to take certain compatibilist definitions of God seriously.
This doesn't feel like it's really engaging at all with the content of the post. I don't mention "legitimacy" 10 times for nothing.