All of TheCosmist's Comments + Replies

Wow, fantastic. Thank you for this excellent reply. Just out of curiosity, is there any question this "cult of rationality" doesn't have a "sequence" or a ready answer for? ;)

Nisan
You're welcome. The FAQ says: ...
benelliott
The sequences are designed to dissolve common confusions. By dint of those confusions being common, almost everybody falls into them at one time or another, so it should not be surprising that the sequences come up often in response to new questions.

So in other words you agree with Lovecraft that only egotism exists?

David_Gerard
As I commented on What Would You Do Without Morality?: Without an intrinsic point to the universe, it seems likely to me that people would go on behaving with the same sort of observable morality they had before. I consider this supported by the observed phenomenon that Christians who turn atheist seem to behave just as ethically as they did before, without a perception of God to direct them. This may or may not directly answer your question of what the correct moral engine to have in one's mind is (supposing there is a single correct one, and supposing that what's in one's mind even has a tremendous effect on observed ethical behaviour, rather than that behaviour largely being evolved behaviour going back millions of years before the mind), but I don't actually care about that except insofar as it affects the observed behaviour.

Wha? There's no law of nature forcing all my goals to be egotistical. If I saw a kitten about to get run over by a train, I'd try to save it. The fact that insectoid aliens may not adore kittens doesn't change my values one bit.

(I'm new here and don't have enough karma to create a thread, so I am posting this question here. Apologies in advance if this is inappropriate.)

Here is a topic I haven’t seen discussed on this forum: the philosophy of “Cosmicism”. If you’re not familiar with it, check Wikipedia, but the quick summary is that it’s the philosophy invented by H. P. Lovecraft which posits that humanity’s values have no cosmic significance or absolute validity in our vast cosmos; to some alien species we might encounter or AI we might build, our values would be as meaningless...

sark
It's perhaps worth pointing out that just as there is nothing to compel you to accept notions such as "cosmic significance" or "only egotism exists", by symmetry there is also nothing to compel you to reject them (except for your actual values, of course). So it really comes down to your values. For most humans, the concerns you have expressed are probably confusions: we pretty much share the same values, and we also share the same cognitive flaws, which lead us to elevate what should be mundane facts about the universe into something with moral force.

It's also worth pointing out that there is no need for your values to be "logically consistent". You use logic to figure out how to go about the world satisfying your values, and unless your values themselves specify a need for a logically consistent value system, there is no need to logically systematize them.

cousin_it and Vladimir_Nesov's replies are good answers; at the risk of being redundant, I'll take this point by point.

to some alien species we might encounter or AI we might build, our values would be as meaningless as the values of insects are to us.

The above is factually correct.

humanity’s values have no cosmic significance or absolute validity in our vast cosmos

The phrases "cosmic significance" and "absolute validity" are confused notions. They don't actually refer to anything in the world. For more on this kind of thing yo... (read more)

Vladimir_Nesov
Read the sequences and you'll probably learn not to make the epistemic errors that generate this position, in which case I expect you'll change your mind. I believe it's a bad idea to argue about ideologies on the object level; they tend to have too many anti-epistemic defenses for that to be efficient or even productive. Rather, one should learn a load of good thinking skills that add up to eventually fixing the problem. (On the other hand, the metaethics sequence, which is more directly relevant to your problem, is relatively hard to understand, so success is not guaranteed, and you can benefit from a targeted argument at that point.)

The standard reply here is that duh, values are a property of agents. I'm allowed to have values of my own and strive for things, even if the huge burning blobs of hydrogen in the sky don't share the same goals as me. The prospect of increasing entropy and astrophysical annihilation isn't enough to make me melt and die right now. Obligatory quote from HP:MOR:

"There is no justice in the laws of Nature, Headmaster, no term for fairness in the equations of motion. The universe is neither evil, nor good, it simply does not care. The stars don't care, or

... (read more)

Online poker is the most brutal rationality test I know of. As a one-time “semi-professional” player, I experienced things that really strained my capacity to believe in a random universe. It would be amusing to watch someone like Eliezer Yudkowsky play poker; I can easily imagine him becoming an emotional, superstitious nut, throwing keyboards against the wall like almost everyone else who plays poker long enough!
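To make concrete why poker strains one's belief in a random universe, here is a minimal simulation sketch (my illustration, not part of the original thread). The win rate and per-session variance figures are made-up assumptions, not real poker statistics, and the function names are hypothetical; the point is only that a player with a genuinely positive expected value still experiences long losing stretches by pure chance.

```python
import random

def simulate_sessions(n_sessions, mean_bb=5.0, stdev_bb=80.0, seed=0):
    """Simulate per-session results for a winning player.

    mean_bb:  assumed true win rate in big blinds per session (hypothetical).
    stdev_bb: assumed per-session standard deviation (hypothetical figure,
              chosen only to dwarf the win rate, as it does in real poker).
    """
    rng = random.Random(seed)
    return [rng.gauss(mean_bb, stdev_bb) for _ in range(n_sessions)]

def worst_downswing(results):
    """Largest peak-to-trough drop in the cumulative bankroll curve."""
    peak = bankroll = 0.0
    worst = 0.0
    for r in results:
        bankroll += r
        peak = max(peak, bankroll)
        worst = max(worst, peak - bankroll)
    return worst

results = simulate_sessions(1000)
print(f"total profit:    {sum(results):.0f} bb")
print(f"worst downswing: {worst_downswing(results):.0f} bb")
```

Under these assumed numbers, the cumulative curve can easily contain downswings hundreds of big blinds deep even though the long-run profit is solidly positive, which is exactly the kind of streak that tempts a player toward superstition.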

Vladimir_Nesov
Well, if you can explain every outcome, you have zero knowledge. And your ignorance about Eliezer Yudkowsky is a fact about you, not a fact about Eliezer Yudkowsky. :-)