Thank you for the effort to understand. However, I don't believe this is in the right direction. I'm afraid I either misunderstood myself or misrepresented my feelings about moral responsibility.
For thoroughness, I'll try to explain it better here, though I don't think it's such a useful clue after all. I hear physical materialists explain that they still feel value naturally and spontaneously, even outside an objective value framework. I was reporting that I didn't -- for some set of values, the values simply seemed to fade away in the absence of an objective value framework. I admit, though, that some values remained. The first value that obviously remained was a sense of moral responsibility, and it was that value that kept me faithful to the others. So perhaps it is a so-called 'terminal value'. In any case, it marked the limit where some part of myself said, "if this is Truth, then I don't value Truth."
Something has been bothering me ever since I began trying to implement many of the lessons in rationality here. I feel there needs to be an emotional reinforcement structure, or a cognitive foundation, that is both pliable and supportive of truth-seeking before I can even get into the why, how, and what of rationality. My successes in this area have been only partial, but it seems that the better structured the cognitive foundation is, the easier it is to adopt, discard, and manipulate new ideas.
I understand this is likely a fairly meta topic and would probably require at least some basic rationality to bootstrap into existence, but I am going to try to define the problem: what is this necessary cognitive foundation? And then break it down into pieces. I suspect that much of it lies in subverbal emotional and procedural cues, but if so, how can those be trained more effectively?