alexgieg

At the moment, I haven't left the Church's community, so I don't feel that loss just yet.
There's a potential middle way there.
I don't know much about Mormonism, mind you, but I watch and read a Biblical scholar, Dan McClellan, who's skeptical of everything and then some. His YouTube channel, the other videos in which he appears, and his papers and books are all in line with the academic consensus in Biblical scholarship, meaning he deconstructs every single Christian belief (and most Jewish ones too) to the point that it's easy to assume he's a militant atheist. But he's actually a practicing Mormon, and his intense criticism extends to the books of the Mormon... (read more)
Knowing the truth doesn't, by itself, provide human connection. In the Mormon church you had a community: people with whom you interacted and had common ground, shared interests, and collective goals. When one breaks with such a community without having first established a new one, the result may be extreme loneliness.
The way to fix that is to find a new community. Many atheists and rationalists schedule periodic meetups to interact with each other and talk in person, so depending on your need for connection that might suffice. If not, there are church-like organizations that require no profession of faith and welcome atheists, which is particularly effective if one's been raised with church... (read more)
I'd say this is the point at which one starts looking into current state-of-the-art psychology (and some non-scientific takes too) to begin understanding all the variability in human behavior and cognition, and which kinds of advantages and disadvantages each variant provides from different perspectives: the individual, the sociological, and the evolutionary.
Much of that disappointment is resolved by doing so. Some of it deepens. The overall effect is a net positive, though.
Unfortunately, they aren't rational. I developed this theme a little more in another reply, but to put it simply, in the US, AGI is being pursued by insane individuals. No rational argument can stop someone who believes in that. And the other sides will try to protect themselves from these people.
Admittedly, nuclear weapons are not a perfect analog for AI, for many reasons, but I think they're a reasonable one.
We've had extreme luck when it comes to nuclear weapons. Not only did we have several close calls that were deescalated by particularly noble individuals doing the right thing, but also, back when the USSR had barely developed theirs and the US alone had a whole stockpile of warheads, we had the good luck of the US leadership also being somewhat moral and refusing to turn nukes into a regular weapon, which was followed by MAD forcing everyone to more or less stay that way even when the other side asked nicely whether they could... (read more)
I’m assuming that - and please correct me if I’m misinterpreting here - “extinguish” here means something along the lines of, “remove the ability to compete effectively for resources (e.g. customers or other planets)” not “literally annihilate”.
I wish that were the case, but my reference point is a paranoid MAD mentality coupled with a total-war scenario unbounded by moral constraints, that is, all sides thinking all the other sides are X-risks to them.
In practice things tend not to get that bad most of the time, but sometimes they do, and much of military preparation concerns mitigating these perceived X-risks, the idea being that if "our side" becomes so powerful it... (read more)
Unfortunately, those in positions of power won't listen. From their perspective it's simply absurd to suggest that a system that currently directly causes, at most, a few dozen induced suicides per year might explode into the death of all life. They have no instinctive, gut-level feeling for exponential growth, so for them it doesn't exist. And even if they acknowledge there's a risk, their practical reasoning runs more along arms-race lines:
“If we stop and don't develop AGI before our geopolitical enemies do, because we're afraid of a tiny risk of extinction, they will develop it regardless, and then one of two things happens: either global extinction, or our extinction at our enemies' hands.... (read more)
Indeed. I imagine it'd have to happen in four steps:
1. As you say, investigate each cognitive function independently. They won't show the kind of independence psychometrics prefers, since there are overlaps between the different functions, but it'd be a good start.
2. If that one proves robust, then investigate the axes between the introverted and extraverted modes of the four basic functions. My hunch is these four axes would take the form of four bimodal distributions.
3. Then, if that one also proves robust, investigate the existence and distribution of stable stacks. There are 40,320 possible stacks considering all permutations of all eight functions (see the sketch after this list). My hunch is we'd find a very long-tailed distribution, with a small number of common stacks covering perhaps 98% of cases. Maybe those are the MBTI 16, maybe not.
4. And then, finally, if the "stacks exist" hypothesis proves valid, study them over long periods of time to observe whether they change, and how.
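To make the combinatorics in step 3 concrete, here's a minimal Python sketch. The two-letter labels (Te, Ti, and so on) are the conventional shorthand for the eight functions, an assumption on my part rather than anything stated above:

```python
import math
from itertools import permutations

# The eight Jungian cognitive functions in conventional shorthand: four
# basic functions (Thinking, Feeling, Sensation, iNtuition), each in an
# extraverted (e) or introverted (i) mode. (Standard labels, assumed here.)
FUNCTIONS = ["Te", "Ti", "Fe", "Fi", "Se", "Si", "Ne", "Ni"]

# A "stack" is an ordering of all eight functions, so the number of
# possible stacks is the number of permutations: 8! = 40,320.
print(math.factorial(len(FUNCTIONS)))           # 40320
print(sum(1 for _ in permutations(FUNCTIONS)))  # 40320, by direct enumeration
```

The hypothesis in step 3, then, is that out of these 40,320 theoretically possible orderings, only a handful would show up with any real frequency in a population.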
It then scores the answers across 4 axes
I've read about the MBTI for a while. Not in extreme depth, but also not via the simplifications provided by corporate heads. In enough depth, though, to understand the basics of the Jungian psychology on which the MBTI is based. So what I will say is likely to differ significantly from what you learned in this course.
So, the most important thing: the four letters of the (real) MBTI do not represent extremes on four different axes. That they do is one such simplification.
The core of the Jungian hypothesis on personality is that there are eight distinct cognitive functions, that is, eight basic ways the mind processes... (read 826 more words →)
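Since the comment is truncated, here's a minimal sketch of the model it starts describing, assuming the standard Jungian setup of four basic functions times two attitudes (these names are the textbook ones, not quoted from the comment):

```python
from itertools import product

# Four basic Jungian functions (two judging: Thinking/Feeling; two
# perceiving: Sensation/Intuition), each operating in one of two
# attitudes, yield eight cognitive functions rather than four axes
# with two extremes each. (Assumed standard names, not from the comment.)
BASIC_FUNCTIONS = ["Thinking", "Feeling", "Sensation", "Intuition"]
ATTITUDES = ["extraverted", "introverted"]

cognitive_functions = [
    f"{attitude} {function}"
    for function, attitude in product(BASIC_FUNCTIONS, ATTITUDES)
]

print(len(cognitive_functions))  # 8
for cf in cognitive_functions:
    print(cf)                    # e.g. "extraverted Thinking"
```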
This paper is far from a complete answer, but it may help: