Marshall

Comments

Yes Tim, I deleted my account. Eliezer explained to me that I was not "ready" to comment on LW but that I was welcome to continue reading. It is thus a little insulting when Eliezer now says I was "karma farming". I was contributing as best I could and was consequently down-voted. I asked several times why I was down-voted, and Eliezer himself answered by asking everyone to down-vote me - just deserts for "vagueness". Several of the articles I posted were never commented on. This does not sound like farming to me, and why on earth would an adult man wish to collect pixel points? Eliezer's answer here reveals a moral weakness. And that is bad karma.

My first comment is a month old (as is Eliezer's original and now edited post). A month is a long time, and in that time I was "hounded" out of LW. I think I was the first to experience Eliezer's idea of policing the walled garden against vandalism and entropy. To an outsider, however, this looks more like a recipe for political correctness. In other words, the uniformity of thought on LW is rather high and the place does seem rather boring. Motivated rationalists searching for a raison d'être - be it nursery rhymes, losing weight, procrastination or missionary behaviour with daring programmes for eradicating others' irrationality. In a sense it all rather seems like one big Multi-Player Game in Language: scoring points and distributing hits with rather transparent strategies for how to proceed. In my worst moments I sometimes think that Eliezer has opened "the box" and the "thing" needs an army of obedient servants. LW would be the first place to start recruiting.

I definitely agree that LW's structure encourages participation (I have rarely contributed to OB), and the to and fro of comments gives valuable information on who people are in this "rationalist community" and where you stand.

However "the first comment you encounter is going to be something highly intelligent" is sales-talk and highly ridiculous. The first coment you encounter is just as likely to be runaway conformity. I would suggest that the pressure to conform is high and much of the intelligence is being used to signal logical dexterity on things with very little practical benefit.

It is my impression that LW is a tight community with little tolerance for what falls outside Eliezer's definition of rationality and of how rational people express themselves. I do not think this description will be accepted by Eliezer or the other contributors (and it would never be one of the first comments you met on a thread), but maybe they are Just Wrong.

Tim:-"would anyone else like to share what they think their utility function is?" Seem to have missed this question the first time around - and it looks like a good question. My timid answer is thus: To maximise the quality of (my) time. This is no trivial task and requires a balance between achieving things in the world, acquiring new information (thanks OB) and achieving new things in the world, peppered with a little bit of good biological feelings. Repeat.

Michael Vassar:- maybe you chose to work in an area where you had to lie to survive. Perhaps Eli works in an area where the discovery of lying carries a higher price (in destroyed reputation) than sticking to the inconvenient truth. But unfortunately I think it is easier to discount a truth-sayer (he is after all an alien) than a randomised liar (he is one of us). In other words, it is easier to buy the mix of truth-and-untruth than the truth and nothing but the truth. But the social result seems to be the same - untruth wins.

I think you are very productive, Eliezer. Human Rationality is surely not tortured wheels squeakily running every second of the day - producing producing producing.

Human rationality should not and cannot be made into an assembly line.

Not Getting Things Done in a balance with GTD is important. Productivity is one of the big American lies.
