John_Maxwell_IV comments on We Don't Have a Utility Function - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
"Authority" isn't necessarily just one thing. For example, an all-powerful Friendly AI could choose to present itself in an extremely deferential way, and even conform exactly to its human users' wishes. Being a central decisionmaker, projecting high status, having impressive accomplishments, having others feel instinctively deferential to you, and having others actually act deferential to you are all distinct but frequently related. I think at least some of these are worrisome (link).
If you increase the authority of a group's leader along all the dimensions of authority (which probably happens by default), I'd guess you get increased group coherence at the expense of decreased group rationality. You also run the risk of having the leader's preferences be satisfied at the expense of the group's preferences. In situations where it doesn't much matter what you do, and it mostly just matters that everyone does it together in an orderly way, maybe this can be a good trade-off.