That's common for beginners. If you want to give this a go, you should start by writing down fleeting, vague associations. "Something a bit sad or disappointing. A car. School and also not school. The texture of cinnamon rolls."
It doesn't matter that you can't remember anything concrete at first. Eventually, you'll remember more and more.
I don't agree with any of this. When I was really into lucid dreaming, I discovered that the best approach is two-fold: keep a detailed dream journal, and make a habit of performing reality checks. That's it. If you don't keep a dream journal, you'll likely have lucid dreams and just ... forget about them. And as for reality checks, my preferred one is trying to push my thumb through my palm. You can do it casually anywhere and it's an instant confirmation.
When I was actively trying to induce them, I often had periods with several lucid dreams per night. Gradually, I mastered dream flight (it's so weird that it's a skill), and I got better at maintaining my lucid state, which is often the trickiest part.
I was never interested in erotic, oneironautic adventures. I spent most of my time flying, which doesn't really get old.
This comment is going to sound mean. Just a fair warning.
This strikes me as a classic case of a guy thinking he's a prophet after doing a bunch of psychedelics. I've seen it over and over again. They are so convinced that they've "got it" that they often manage to convince others they do as well. You could call it the Messiah complex because, well, duh.
And you know what? Being around a bunch of people who are really nice to you feels good. And that feeling of it "clicking" is the feeling of your cognitive dissonance being wiped out by highly motivated reasoning: "They're a bunch of loons, but I feel like I belong with them. I'm a genius, so if I belong with them, they must be a bunch of geniuses as well. Oh! We're a bunch of geniuses! The Weird Spiritual Teachings are true!"
Many incredibly smart scientists in Japan joined a doomsday cult (Aum Shinrikyo) because its members made them feel like they finally belonged somewhere. Loneliness is a hell of a drug. It's what gets you sucked into cults.
From what I've read, integral theory seems to be closer to a mystical cult than a scientific framework. And I say this as someone who is quite open to process philosophy and systems science, both of which seem vaguely related to whatever integral theory is trying to be.
Not all who wander are lost.
I believe that the inner sense you are talking about is what we call love. We see the beauty around us, and we want to protect it. There are potential paths in front of us. There is a path whereby life is destroyed. There is a path whereby it is saved. Our mission is to keep it on the safe path, so that future generations can continue our mission when we are gone. We do this out of love. As we come to see that every living thing on earth depends on each other, our love grows so that it can embrace it all.
This is why we are willing to make sacrifices: what we are protecting is greater than all of us. Our life gains meaning and purpose when we find that it aligns with this mission.
We plant seeds today so that coming generations may enjoy the shade. That is our love.
All analogies rely on isomorphisms. They simply refer to shared patterns. A good analogy captures many structural regularities that are shared between two different things while a bad one captures only a few.
The field of complex adaptive systems (CAS) is dedicated to the study of structural regularities between systems operating under similar constraints. Ant colony optimization and simulated annealing can be used to solve an extremely wide range of problems precisely because complex adaptive systems share so many structural regularities.
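To make the shared-structure point concrete, here is a minimal sketch of simulated annealing in Python (the toy objective, parameter values, and function names are just illustrative assumptions, not anyone's canonical implementation). The solver itself knows nothing about the domain: it only asks for an energy function and a neighbor function, and any problem that can supply those two callbacks plugs into it unchanged.

```python
import math
import random

def simulated_annealing(initial, energy, neighbor, t0=1.0, cooling=0.995, steps=10_000):
    """Generic simulated annealing: domain-agnostic apart from the two callbacks."""
    state, e = initial, energy(initial)
    best, best_e = state, e
    t = t0
    for _ in range(steps):
        candidate = neighbor(state)
        ce = energy(candidate)
        # Accept improvements outright; accept regressions with a probability
        # that shrinks as the "temperature" cools (the Metropolis criterion).
        if ce < e or random.random() < math.exp((e - ce) / t):
            state, e = candidate, ce
            if e < best_e:
                best, best_e = state, e
        t *= cooling
    return best, best_e

# Toy usage: minimize a bumpy 1-D function full of local minima.
def bumpy(x):
    return x * x + 10 * math.sin(x)

solution, value = simulated_annealing(
    initial=random.uniform(-10, 10),
    energy=bumpy,
    neighbor=lambda x: x + random.gauss(0, 0.5),
)
print(solution, value)
```

Swap in a tour length and a two-city swap and the same loop does travelling salesman; swap in a loss and a parameter jitter and it does hyperparameter search. That interchangeability is exactly the kind of structural regularity I mean.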
I worry that a myopic focus will result in a lot of time wasted on lines of inquiry that already have parallels in a number of different fields. If we accept that the problem of inner alignment can be formalized, it would be very surprising to find that it is unique in the sense of having no parallels in nature, especially considering the obvious general analogy to the problem of cancer, which may or may not provide insight into the alignment problem.
What I have to offer is yet another informal perspective, but one that may further the search for formal approaches. The structure of the inner alignment problem is isomorphic to that of cancer. Cancer can be considered a state in which a cell employs a strategy that is not aligned with that of the organism or organ to which it belongs. One might expect, then, that advances in cancer research will offer solutions that can be translated into the terms of AI alignment. For this to work, one would have to construct a dictionary between the two domains to facilitate the translation.
A major benefit of this approach would be the ability to leverage the efforts of some of the greatest scientists of our time, who are already working on a problem considered to be of high priority. Cancer research gets massive funding; alignment research does not. If the problem structures are at least partly isomorphic, translation should be both possible and beneficial.
That only works if you reject determinism. If the initial conditions of the universe resulted in your decision by necessity, then it's not your decision, is it?
Moral realism:
I think determinism qualifies. Morality implies right versus wrong, which implies the existence of errors. If everything is predetermined by initial conditions, the concept of error becomes meaningless. You can't correct your behavior any more than an atom on Mars can; que sera, sera. Everything becomes a consequence of the initial conditions of the universe, and so morality becomes inconsequential. You can't even change your mind on this topic, because the only change possible is the one dictated by those initial conditions. If you imagine that you can, you do so because of the causal chain of events that necessitated it.
There's no rationality or irrationality either because these concepts imply, once again, the possibility of errors in a universe that can't err.
You're an atheist? Not your choice. You're a theist? Not your choice. You disagree with this sentiment? Again: que sera, sera.
How can moral realism be defended in a universe where no one is responsible for anything?
I do indeed make myself laugh at times. I think it has something to do with depth: the conclusion of a line of thinking can be surprising, and that's probably what's relevant.
That's an interesting way of looking at it. Feynman had a hunch on the topic, which he shared in his Nobel Prize speech: nature is simple in some sense. We can describe things in many different ways without knowing that we're describing the same thing, which, he said, is a sort of simplicity.
Even just writing down loose associations and your emotional state is enough; that's how you get the ball rolling. Try it for two weeks, even if it feels useless, unless you're taking antidepressants, in which case it might actually be ineffective. I know this doesn't sound worthwhile, but I know from experience (mine and others') that it usually works.