I'm a very confused person trying to become less confused. My history as a New Age mystic still colors everything I think even though I'm striving for rationality nowadays. Here's my backstory if you're interested.
Well, as I attempted to express in the original comment, I am not a Shaiva; rather, I had mystical experiences as a teen that led me to invent my own religion from scratch, one with similarities to various other belief systems, Kashmir Shaivism among them. For the most part, though, it's just a background element of my existence, part of my ontology, not something I actively put much attention toward anymore. In practice I'm effectively an atheist physicalist like everyone else here. It's just... there's also something that lurks beneath. Or there used to be. I've gotten more disillusioned, more empty-souled and this-worldly as I've gotten older, and I don't really know how to get back to the way I used to feel. Psychedelics are probably the only way; meditation doesn't do anything for me.
I think you're wrong that Level 4 is rare. It describes everyday reality in the upper strata of any cult: the psychopathic leadership class who make up shit for everyone else to believe (or to claim to believe, for status), and who compete for status among those followers. And there are a LOT of cults in modern society, including organizations not traditionally perceived as cults, such as political ideologies.
I'm sure you've already thought of this, and I know nothing about this area of biology, but isn't it possible that the genes coding for intelligence more accurately code for the developmental trajectory of the brain, and that changing them would not in fact affect an adult brain?
I think people have "criticized" Minecraft for having no clear point, for being more of a toy or sandbox than a "game."
Myself included. I can't play Minecraft; it's far too open-ended, and it makes me feel anxious and overwhelmed. "Wtf am I supposed to do??" I want a game to give me two or three choices max at every decision point, not an infinite vector space of possibilities that I cannot sort through or prioritize.
This post, though, is about one of my big obsessions: trying to figure out how to design a game (computer or tabletop or both) that makes cooperation fun. And I mean fun in the way Diablo is fun: addictive, power-fantasy-feeling, an endless sequence of dopamine hits, sexy. The problem is that the only way to produce that Diablo-flow is to let people act automatically, reacting to signs and triggers with preprogrammed responses so that they can sink down into the animalistic part of the brain that hunts and stalks and pounces on things to tear them apart without thought or simulation.
But learning to negotiate with others is the exact opposite of that, and is the main reason we have the effort-intensive simulation system in the first place. So the problem of making a game that is simultaneously compelling on a primal level and centered on conflict resolution rather than just conflict... is hard.
The only things that seem to me to have the same kind of flow are dance and other group rituals (such as those of religions that haven't yet ossified into mere passionless false belief), which don't really help with the whole "training negotiation" thing (though they do induce people to align with one another emotionally), and which also can't easily be turned into video games or TTRPGs.
I never actually said that all these notions are constructed and fake, only that some are. Clearly some aren't. There are false positives and false negatives. I feel as if you're arguing against a straw man here.
If I were Bob I'd have told her to fuck off long ago and stopped letting some random person berate me for being lazy just like my parents always have. This is basically guilt-tripping, not a beneficial way of approaching any kind of motivation, and it is absolutely guaranteed to produce pushback. But then, I'm probably not your target audience, am I?
Btw just to be clear, I think Said Achmiz explained my reaction better than I, who habitually post short reddit-tier responses, can. My specific issue is that Alice seems to be acting as if it's any of her business what Bob does. It is not. Absolutely nobody likes being told they're not being ethical enough. It's why everyone hates vegans. As someone who doesn't like experiencing such judgmental demands, I would have the kneejerk emotional reaction to want to become less of an EA just to spite her. (I would not of course act on this reaction, but I would start finding EA things to be in an ugh field because they remind me of the distress caused by this interaction.)
Holy heck I have been enlightened. And by contemplating nothingness too! Thanks for the clarification, it all makes sense now.
I really enjoy this sequence, but there's a sticking point that's making me unable to continue until I figure it out. It seems to me rather obvious that... utility functions are not shift-invariant! If I denominate option A at 1 utilon and option B at 2 utilons, that means I am indifferent between a certain outcome of A and a 50% chance of B (with nothing otherwise) - and this is no longer the case if I shift my utility function even slightly. Ratios of utilities mean something concrete and are destroyed by translation. Since your entire argument seems to rest on that inexplicably not being the case, I can't see how any of this is useful.
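To spell out the arithmetic I have in mind (with the assumption I'm leaning on stated explicitly: the "do nothing" outcome sits at 0 utilons and stays there while A and B are shifted):

$$EU(\text{certain } A) = 1, \qquad EU(\text{50\% shot at } B) = 0.5 \cdot 2 + 0.5 \cdot 0 = 1,$$

so I'm indifferent. Now shift $A$ and $B$ up by $c$:

$$EU(\text{certain } A) = 1 + c, \qquad EU(\text{50\% shot at } B) = 0.5 \cdot (2 + c) + 0.5 \cdot 0 = 1 + \tfrac{c}{2},$$

and the indifference is gone for any $c \neq 0$. (If the "nothing" outcome is taken to shift by $c$ as well, the indifference survives; my objection assumes it doesn't.)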
I understand all this logically, but my emotional brain asks, "Yeah, but why should I care about any of that? I want what I want. I don't want to grow, or improve myself, or learn new perspectives, or bring others joy. I want to feel good all the time with minimal effort."
When wireheading - real wireheading, not the creepy electrode-in-the-brain sort that few people would actually accept - is presented to you, it is very hard to reject, particularly if you have a background of trauma or neurodivergence that makes coping with "real life" difficult to begin with. That's why so many people with brains like mine end up as addicts. Actually, by some standards, I am an addict, just not to any physical substance.
And to be honest, as a risk-averse person, it's hard for me to rationally argue for why I ought to interact with other people when AIs are better, beyond the people I already know, trust, and care about. Like, where exactly is my duty to "grow" (from other people's perspective, by other people's definitions, because they tell me I ought to do it) supposed to come from? The only thing that motivates me, sometimes, to try to do growth-and-self-improvement things is guilt. And I'm actually a pretty hard person to guilt into doing things.
I am not the best at writing thorough comments because I am more of a Redditor than a LessWronger, but I just want you to know that I read the entire post over the course of ~2.5 hours and I support you wholeheartedly and think you're doing something very important. I've never been part of the rationalist "community" and don't want to be (I am not a rationalist, I am a person who strives weakly for rationality, among many other strivings), particularly after reading all this, but I definitely expected better out of it than I've seen lately. But perhaps I shouldn't; the few self-identified rationalists I've interacted with one on one have mostly seemed like... at best, very strange people to me. And Eliezer has always, honestly, struck me as a dangerous narcissist whose interest in truth is secondary to his interest in being the Glorious Hero. I don't want to go to the effort of replying to specific things you said - and you don't know who I am and probably won't read this anyway - but yeah, just, I'm glad you said them.