JamesAndrix comments on Conflicts Between Mental Subagents: Expanding Wei Dai's Master-Slave Model - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
This theory seems to make a testable prediction: you will have less akrasia if your signaling requires you to actually reach your goal, not just show that you're working toward it. Looking at my own life, I'm not sure that's true.
All these systems evolved in the ancestral environment. A goal like "go to the moon" might not even be parseable by other mental processes as something you can do or not do, or as something that signals anything.