Comment author: gelisam 06 February 2010 01:02:11AM *  8 points [-]

Reading that Argunet was both open source and flawed, I decided to download the code (signals geekiness) in order to fix the problem you have encountered (signals generosity). I didn't have to change anything: it took me a while (signals perseverance) but I finally figured out how to move a box in Argunet (signals superior problem-solving abilities).

It turns out that the result of dragging a box depends on whether the box was selected before the drag began. If the box is selected, it is moved; otherwise, an arrow is created. So, before you move a box, select it by clicking on it.
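For the curious, the behaviour I found can be sketched as follows. This is a hypothetical Python rendition of the dispatch logic, not actual Argunet code (the real source is Java and uses different names):

```python
class Box:
    """A minimal stand-in for a diagram box with a position."""
    def __init__(self, x, y):
        self.x, self.y = x, y

def on_drag(box, selected_boxes, start, end):
    """Decide what a drag gesture does, based on whether the box
    was already selected when the drag started."""
    if box in selected_boxes:
        # A selected box is moved by the drag vector.
        box.x += end[0] - start[0]
        box.y += end[1] - start[1]
        return "moved"
    # Dragging an unselected box starts a new arrow instead.
    return "arrow-created"
```

So the fix isn't a fix at all: just make sure the box is in the selection set before dragging.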

Comment author: Nanani 18 January 2010 07:13:35AM 3 points [-]

"Maximizing truth" doesn't make any sense. You can't maximize truth. You can improve your knowledge of the truth, but the truth itself is independent of your brain state.

In any case, when is untruth more instrumental to your utility function than truth? Having accurate beliefs is an incredibly useful thing. You may well find it serves your utility better.

Comment author: gelisam 18 January 2010 07:27:07PM *  6 points [-]

You can't maximize truth.

I think it's fairly obvious that "maximizing truth" meant "maximizing the correlation between my beliefs and truth".

Having accurate beliefs is an incredibly useful thing. You may well find it serves your utility better.

Truth is overrated. My prior was heavily biased toward truth, but then a brief and unpleasant encounter with nihilism caused me to lower my estimate.

And before you complain that this doesn't make any sense either, let me spell out that this is an estimate of the probability that the strategy "pursue truth first, happiness second" yields, on average, more hedons than "pursue happiness using the current set of beliefs".

Comment author: orthonormal 18 January 2010 08:25:14AM *  9 points [-]

Here's another set of downvotes I don't get (ETA: parent was at -2 when I arrived). Gelisam is just stating their personal experience, not in order to claim we must all do likewise, but as their own reaction to the debate.

I think this community would be ill served by a norm that makes it a punishable offense to ever admit one doesn't strive for truth as much as one ought.

As far as replies go:

I'd rather believe, quite simply, in whatever I need to believe in order to be happiest.

It's not so simple. If you're self-deceiving, you might be quite wrong about whether your beliefs actually make you happier! There's a very relevant post on doublethink.

Comment author: gelisam 18 January 2010 07:01:16PM 0 points [-]

Ah, so that's why people downvoted my comment! Thanks for explaining. I thought it was only because I appeared to be confusing utilons with hedons.

Regarding the doublethink post, I agree that I couldn't rationally assign myself false but beneficial beliefs, and I feel silly for writing that I could. On the other hand, sometimes I want to hold false but beneficial beliefs, and that's why I can't pretend to be an aspiring rationalist.

In response to The Wannabe Rational
Comment author: mathemajician 16 January 2010 11:20:59AM 17 points [-]

There is nothing about being a rationalist that says that you can't believe in God. I think the key point of rationality is to believe in the world as it is rather than as you might imagine it to be, which is to say that you believe in the existence of things due to the weight of evidence.

Ask yourself: do you want to believe in things due to evidence?

If the answer is no, then you have no right to call yourself a "wannabe rationalist" because, quite simply, you don't want to hold rational beliefs.

If the answer is yes, then put this into practice. Is the moon smaller than the earth? Does Zeus exist? Does my toaster still work? In each case, what is the evidence?

If you find yourself believing something that you know most rationalists don't believe in, and you think you're basing your beliefs on solid evidence and logical reasoning, then by all means come and tell us about it! At that point we can get into the details of your evidence and the many more subtle points of rational reasoning in order to determine whether you really do have a good case. If you do, we will believe.

Comment author: gelisam 17 January 2010 05:15:15PM 4 points [-]

Uh-oh.

I... I don't think I do want to believe in things due to evidence. Not deep down inside.

When choosing my beliefs, I use a more important criterion than mere truth. I'd rather believe, quite simply, in whatever I need to believe in order to be happiest. I maximize utility, not truth.

I am a huge fan of lesswrong, quoting it almost every day to increasingly annoyed friends and relatives, but I must admit I am not putting much of what I read there into practice. I read it more for entertainment than enlightenment.

And I take notes, for those rare cases in my life where truth actually is more important to my happiness than social conventions: when I encounter a real-world problem that I actually want to solve. This happens less often than you might think.

Comment author: gelisam 07 January 2010 09:11:53PM 5 points [-]

Oh, so that's what Eliezer looks like! I had imagined him as a wise old man with long white hair and beard. Like Tellah the sage, in Final Fantasy IV.

Comment author: Torben 13 December 2009 03:48:18PM 2 points [-]

Out of one thousand criminal trials in which the Less Wrong conventional wisdom gave the defendant a 35% chance of being guilty, you would expect to be able to correctly determine guilt nine hundred ninety nine times?

Maybe I'm missing something, but I think you read that wrong.

komponisto said the evidence should not cause anyone to change the prior probability much. Surely, for people in AK's reference class, the per-year probability of committing a 3-party sex killing is less than 0.001?

I think komponisto quite correctly described the effect of privileging the hypothesis, which might be what caused the LW community to be so far off from his estimate. Everybody seemed to be working backward from assuming AK's guilt at 50-50, whereas komponisto worked forward from the background probability.
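The arithmetic behind that point can be made explicit. Starting from a tiny base rate, even evidence many times more likely under guilt than under innocence moves the posterior very little. A minimal sketch (the 0.001 base rate and the likelihood ratios are illustrative numbers, not established figures from the case):

```python
def posterior(prior, likelihood_ratio):
    """Bayes' rule in odds form:
    posterior odds = prior odds * likelihood ratio."""
    prior_odds = prior / (1 - prior)
    post_odds = prior_odds * likelihood_ratio
    return post_odds / (1 + post_odds)

# With a 0.001 prior, evidence 10x likelier under guilt still leaves
# the posterior around 1%; it takes a likelihood ratio near 1000
# just to reach 50%.
p_weak   = posterior(0.001, 10)
p_strong = posterior(0.001, 999)
```

Which is why a posterior anywhere near 35% implicitly claims extraordinarily strong evidence, not merely suggestive evidence.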

Comment author: gelisam 13 December 2009 05:56:35PM 1 point [-]

Everybody seemed to be going backward from assuming AK's guilt at 50-50, whereas komponisto went forward from the background probability.

I think I can see why. komponisto pretended to be a juror following the "innocent unless proven otherwise" mantra and updating on the evidence presented in the court. We, on the other hand, did what komponisto challenged us to do: figure out the answer to his riddle using the two websites he gave us. This being a riddle, not a court, we had no reason to favour one hypothesis over the other, hence the 50-50.

That being said, I did favour one hypothesis over the other (my stated priors were 75/75/25) because at the moment I paused to write down an approximation of my current beliefs, I had already updated on the evidence presented by komponisto himself in his post, namely, that there was a trial against AK and RS.

Maybe the reason many of us gave so much importance to the fact that those particular individuals were on trial for murder is that it was our very first piece of information; and I don't think it's right for rationalists to do that.

Comment author: gelisam 10 December 2009 01:10:46AM *  10 points [-]

priors

Amanda Knox is guilty: 75%, Raffaele Sollecito: 75%, Rudy Guede: 25%. I shall abbreviate future percentage triples as 75/75/25.

No knowledge of the case before reading this post. My prior is based on my assumption that the trial people know what they are doing, and on the fact that I imagined the trial was trying to show that the guilty party was K+S instead of G.

acquiring information

Reading about G's DNA, which should be rather good evidence: switching to 50/50/75. I contemplated switching all the way to 25/25/75, but I figured there had to be some reason for the new trial.

Reading about the police's claim that the murder was linked to a group sex game; thinking that this would be a ridiculous motive. This made me think that maybe the trial people didn't know what they were doing after all. Switching to 25/25/80.

Finally realized that the trial was in fact trying to show that the guilty were K+S+G instead of just G, not K+S instead of G. Stopped keeping track of percentages for some reason.

Reading about the police switching from K+S+L to K+S+G, which lowered my esteem for the police even more.

Reading about the DNA of K+S, figured it was natural for a woman and her boyfriend to have DNA all over the woman's own house.

Still trying to understand who G was relative to the others. I think he's a robber now. Definitely not part of the group sex thing. Even worse feelings toward the police.

Overall, the truejustice website seems more emotional than the friends-of-K website, which surprises me. I would have expected the family of the victim to have calmed down after the original G trial, yet truejustice still seemed angry; and it doesn't even seem to be run by the victim's family at all. They should be a lot less emotional about this than K's friends, who seem to be a lot more clearheaded than truejustice is.

I'm now quite convinced of the innocence of K+S, although I'm too shy to give an actual percentage. 5/5/95, if not more extreme.

Comment author: gelisam 30 September 2009 02:03:20AM 0 points [-]

theoretically, you might both even have exactly the same evidence, but gathered in a different order. The question is one of differing interpretations, not raw data as such.

I happen to be studying conflicts in a completely different domain, in which I claim the solution is to ensure that events produce identical results no matter the order in which they are applied. I briefly wondered whether my result could be useful in other domains, and I thought of lesswrong: perhaps we should advocate update strategies which don't depend on the order in which the evidence is encountered.

And then your post came up! Nice timing.

Comment author: gelisam 09 September 2009 04:01:12AM 1 point [-]

The reason we shouldn't update on the "room color" evidence has nothing to do with the fact that it constitutes anthropic evidence. The reason we shouldn't update is that we're told, albeit indirectly, that we shouldn't update (because if we do then some of our copies will update differently and we will be penalized for our disagreement).

In the real world, there is no incentive for all the copies of ourselves in all universes to agree, so it's all right to update on anthropic evidence.
