My name is Brent, and I'm probably insane.
I can perform various experimental tests to verify that I do not perform primate pack-bonding rituals correctly, which is about half of what we mean by "insane". This concerns me simply from a utilitarian perspective (separation from pack makes ego-depletion problems harder; it makes resources harder to come by; and it simply sucks to experience "from the inside"), but these are not the things that concern me most.
The thing that concerns me most is this:
What if the very tools that I use to make decisions are flawed?
I stumbled upon Bayesian techniques as a young child. I was lucky enough to do a lot of self-guided artificial intelligence "research" in junior high and high school, because I grew up in a time and place where computers were utterly mysterious and no one could really tell me what I was "supposed" to be doing with them. So I started making simple video games, had no opponents to play them against (due to the aforementioned failures to correctly perform pack-bonding rituals), decided to create my own, became dissatisfied with the quality of those opponents, and suddenly found myself chewing on Hofstadter and Wiener and Minsky.
I'm filling in that bit of detail to explain that I have been attempting to operate as a rational intelligence for quite some time, so I believe I've become very familiar with the kinds of "bugs" I tend to exhibit.
I've spent a very long time attempting to correct for my cognitive biases, edit out tendencies to seek comfortable-but-misleading inputs, and otherwise "force" myself to be rational, and often the result is that my "will" cracks under the strain. My entire utility table suddenly flips on its head and attempts to maximize my own self-destruction rather than allow me to continue torturing it with endlessly recursive, unsolvable problems that all tend to boil down to "you do not have sufficient social power, and humans are savage and cruel no matter how much you care about them."
Most of my energy is spent attempting to maintain positive, rational, long-term goals in the face of some kind of regedit-hack of my utility table itself, coming from somewhere in my subconscious that I can't seem to gain write-access to.
Clearly, the transhumanist solution would be to identify the underlying physical storage where the bug is occurring, and replace it with a less-malfunctioning piece of hardware.
Hopefully someday someone with more self-control, financial resources, and social resources than I will invent a method to do that, and I can get enough of a partial personectomy to create something viable with the remaining subroutines.
In the meantime, what is someone who wishes to be rational supposed to do, when the underlying hardware simply won't cooperate?
I'm pretty sure you're doing something wrong now. You're being very vague and not giving any examples, so I can't troubleshoot anywhere near precisely, but clearly you're trying to fit a square peg into a round hole and looking for the right sledgehammer for the job. You're not Augustine of Hippo; you may endorse a set of rules as the Sacred Laws Of Rationality That You Are A Really Bad Person If You Don't Follow, but if trying to follow them causes breakdowns, you're just wrong about the rules. Taboo "rational", and ask of each rule whether it's a maintainable habit, whether it's possible to apply explicitly in extraordinary circumstances, and whether you actually want to do that. ("My life is worth exactly as much as some random stranger's" sounds nice, but nobody can actually follow that long-term.)
You don't say what kind of insane you are. You mention lack of social skills and that it's a big problem for you, but that's nearly orthogonal. For the first time in history, people are publishing useful guides to social life. Of course any oddity is going to make things harder for you, but go to groups that share your interests, and you'll find more people whose personalities mesh with yours and fewer who go "It's not making eye contact in the exact pattern I want! Burn the witch!". It helps to sincerely like the people you're trying to befriend, but a little dishonest manipulation can go a long way too.
If you have other insanity-related problems, I suggest you ask a psychiatrist for help with the root cause, tackle each problem directly, or start a Less Wrong mental health support group (so far the current procedure is "whine loudly enough to attract Alicorn's compassion", which might be a bit hard on Alicorn). Those bouts of self-destruction might be due to pushing too hard in bad directions, but might have other origins too.
"Ask Alicorn to put you in touch with Adelene" may be a viable alternative for chronic rather than acute cases. I'm pretty horrible at providing direct support, but I'm quite good at getting a feel for the shape of peoples' thought processes, both where they are and where they want to be, and using that information to connect them with resources that will help them move to... (read more)