Comment author: lavalamp 16 April 2009 06:05:12PM 4 points [-]

Hi, I've been lurking for a few weeks and am likely to stay in lurker mode indefinitely. But I thought I should comment on the welcome thread.

I would prefer to stay anonymous at the moment, but I'm male, 20's, BS in computer programming & work as a software engineer.

As an outsider, some feedback for you all:

Interesting topics -- they keep me reading. Jargon -- a little is fine, but the more there is, the harder it is to follow. The fact that people make go (my favorite game) references is a nice plus.

I would classify myself as a theist at the moment. As such (and having been raised in a very Christian environment), I have some opinions on how you guys could more effectively proselytize--but I'm not sure it's worth my time to speak up.

Comment author: pnkflyd831 16 April 2009 08:51:03PM 2 points [-]

lava, you aren't the only one on LW who feels this way. I have a similar background and similar concerns. We are not outsiders. LW's dedication to attacking the reasoning of a post or comment, but not the person, has been proven over and over.

Comment author: Yvain 22 March 2009 12:42:03PM *  67 points [-]

I recently read an article on charitable giving which mentioned how people split up their money among many different charities to, as they put it, "maximize the effect", even though someone with this goal should donate everything to the single highest-utility charity. And this seems a bit like the example you cited where, if blue cards came up randomly 75% of the time and red cards came up 25% of the time, people would bet on blue 75% of the time even though the optimal strategy is to bet blue 100% of the time. All this seems to come from concepts like "Don't put all your eggs in one basket", which is a good general rule for things like investing but can easily break down.
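The card-betting arithmetic is easy to check with a quick sketch (assuming independent draws and one point per correct guess; the function name is my own):

```python
def expected_accuracy(p_blue: float, bet_blue_rate: float) -> float:
    """Chance of a correct guess when blue appears with probability
    p_blue and we bet blue with probability bet_blue_rate."""
    return p_blue * bet_blue_rate + (1 - p_blue) * (1 - bet_blue_rate)

# Probability matching: bet blue 75% of the time.
matching = expected_accuracy(0.75, 0.75)    # 0.625
# Maximizing: always bet blue.
maximizing = expected_accuracy(0.75, 1.0)   # 0.75
```

Matching the frequencies wins only 62.5% of the time, while always betting on the majority color wins 75% of the time.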

I find myself having to fight this rule in a lot of places, and one of them is beliefs. If all of my opinions are Eliezer-ish, I feel like I'm "putting all my eggs in one basket" and need to "diversify". You use book recommendations as a reductio, but I remember reading about half the books on your recommended reading list, thinking "Does reading everything off of one guy's reading list make me a follower?", and then thinking "Eh, as soon as he stops recommending such good books, I'll stop reading them."

The other thing is the Outside View summed up by the proverb "If two people think alike, one of them isn't thinking." In the majority of cases I observe where a person conforms to all of the beliefs held by a charismatic leader of a cohesive in-group, and keeps praising that leader's incredible insight, that person is a sheeple and that leader has a cult (see: religion, Objectivism, various political movements). I respect the Outside View enough that I have trouble replacing it with the Inside View that although I agree with Eliezer about nearly everything and am willing to say arbitrarily good things about him, I'm certainly not a cultist because I'm coming to my opinions based on Independent Logic and Reason. I don't know any way of solving this problem except the hard way.

"note: Hofstadter does not have a cult"

I tried to start a Hofstadter cult once. The first commandment was "Thou shalt follow the first commandment." The second commandment was "Thou shalt follow only those even-numbered commandments that do not exhort thee to follow themselves." I forget the other eight. Needless to say it didn't catch on.

Comment author: pnkflyd831 24 March 2009 12:35:42PM 0 points [-]

It would be great to add a link to the article on charitable giving you refer to, to see if it already concludes or dismisses my idea on the issue. From observing those around me, I tend to see the reason behind charitable giving as something other than maximizing the utility of the gift. I postulate that people give to many different charities as a social signal. The contributor signals to those receiving the gift that they sympathize with the cause, and signals to those around them that they are a caring and compassionate person. The quantity of the gift has an almost negligible effect on this signaling. So giving more often, to more charities, lets someone signal positive social mores more frequently and to a larger audience, raising their social status higher than if they gave all their expendable money to one charity a limited number of times.

Comment author: pnkflyd831 05 March 2009 09:02:33PM 0 points [-]

The true cost of acting rational is the difference between acting truly rational and acting purely rational in a given situation.

First we have to distinguish between what is factually true or strategically optimal and what an agent believes is true or strategically optimal. For non-rational agents these differ: there is at least one instance where what they believe is not what is true, or where how they act is not optimal. For rational agents, what they believe must be proven true, and they must act optimally.

In a situation with only rational agents, they would find the optimal cumulative payoff and distribute it optimally, in line with their potentially different goals. This assumes rational agents can agree on an optimal distribution strategy, possibly based on incurred costs (time, resources spent) and goal priority, and that their goals do not conflict.

Non-rational agents may have conflicting goals, be greedy and not achieve optimal distribution, and may not find or implement the strategy to achieve the best cumulative payoff. Each of these areas identifies costs of non-rationality.

In a situation in which rational agents must work with non-rational agents, there will be a cost of non-rationality no matter how the rationalist acts. In these situations a pure rationalist will act differently than a true rationalist. Truly rational agents take into account both factual truth and non-rational agents' beliefs and strategies when deciding on a strategy; they will pursue strategies that achieve a sub-optimal but minimal cost. Purely rational agents take only factual truth into account and do not model non-rational agents' beliefs or strategies; they will incur costs above this sub-optimal minimum. Thus, the true cost of acting rational is the difference between the cost of acting purely rational and the cost of acting truly rational.
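The pure-versus-true distinction can be made concrete with a toy coordination game (the payoffs and the 90% figure below are invented for illustration, not taken from the comment): matching on option B is jointly optimal, but the non-rational partner usually picks A. The pure rationalist plays B, as if the partner were rational; the true rationalist best-responds to what the partner actually does.

```python
# Toy coordination game (hypothetical numbers): each player picks A or B.
# Matching on A pays 2, matching on B pays 3, mismatching pays 0.
payoff = {("A", "A"): 2, ("B", "B"): 3, ("A", "B"): 0, ("B", "A"): 0}

# A non-rational partner plays A 90% of the time, despite B being jointly optimal.
p_partner_A = 0.9

def expected_payoff(our_move: str) -> float:
    """Our expected payoff against the partner's actual (non-rational) behavior."""
    return (p_partner_A * payoff[(our_move, "A")]
            + (1 - p_partner_A) * payoff[(our_move, "B")])

pure_rationalist = expected_payoff("B")   # assumes a rational partner: 0.1 * 3 = 0.3
true_rationalist = max(expected_payoff("A"), expected_payoff("B"))  # models the partner: 1.8
true_cost_of_acting_rational = true_rationalist - pure_rationalist  # 1.5
```

Both agents fall short of the fully rational benchmark of 3.0, but the true rationalist's shortfall (1.2) is smaller than the pure rationalist's (2.7); the gap between the two shortfalls is the "true cost of acting rational" as defined above.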