Comment author: Eliezer_Yudkowsky 23 July 2009 05:15:31PM 20 points [-]

I would still be better off just giving the beggar the money!

Increasing the payoff of begging just increases the number of beggars until the average wage once again falls to the point where it is just marginally unattractive - or so I've heard. I think someone once suggested that if you visit a poor country and want to help, do not under any circumstances give money to beggars; find someone who seems to be doing something productive, and give the money to them.

Comment author: Marcello 23 July 2009 06:10:28PM 7 points [-]

I recently visited Los Angeles with a friend. Whenever we got lost wandering around the city, he would find the nearest homeless person, ask them for directions and pay them a dollar. (Homeless people tend to know the street layout and bus routes of their city like the backs of their hands.)

Comment author: Roko 15 July 2009 05:16:29PM 1 point [-]

Not only do we have an absolute denial macro, but it is a programmable absolute denial macro and there are things much like computer viruses which use it and spread through human population. That is, if you modulated your voice in a certain way at someone, it would cause them (and you) to acquire a brand new self deception, and start transmitting it to others.

Yes, we have a name for this: Religion

Comment author: Marcello 16 July 2009 04:31:18AM 6 points [-]

Yes, we have a name for this: Religion

Agreed, but the fact that religion exists makes the prospect of similar things whose existence we are not aware of all the scarier. Imagine, for example, if there were something like a religion one of whose tenets is that you have to fool yourself into thinking that the religion doesn't exist most of the time.

Comment author: Marcello 15 July 2009 04:29:57PM 39 points [-]
  • We actually live in hyperspace: our universe really has four spatial dimensions. However, our bodies are fully four-dimensional; we are not wafer-thin slices à la Flatland. We don't perceive there to be four dimensions because our visual cortices have a defect somewhat like that of people who can't notice anything on the right side of their visual field.

  • Not only do we have an absolute denial macro, but it is a programmable absolute denial macro and there are things much like computer viruses which use it and spread through human population. That is, if you modulated your voice in a certain way at someone, it would cause them (and you) to acquire a brand new self deception, and start transmitting it to others.

  • Some of the people you believe are dead are actually alive, but no matter how hard they try to get other people to notice them, their actions are immediately forgotten and any changes caused by those actions are rationalized away.

  • There are transparent contradictions inherent in all current mathematical systems for reasoning about real numbers, but no human mathematician/physicist can notice them because they rely heavily on visuospatial reasoning to construct real analysis proofs.

Comment author: Psychohistorian 03 July 2009 12:05:04AM *  0 points [-]

I'm not sure this really counts as a bias. It seems quite rational, unless you will actually suffer immediate and significant consequences if you are wrong about string theory.

The cost of privately held beliefs (especially about abstract truths) is quite low. If I believe the earth is a flat disk on the back of an infinite stack of tortoises, and if I'm, say, a car mechanic, I will not suffer at all for this belief. Unless, of course, I mention it to my friends, because they will judge me for it; unless, of course, they all believe the same thing, in which case I'll probably proclaim this belief loudly and often and possibly meet up with them to discuss it on an appointed day of the week. I may suffer because I fail at epistemology, but it isn't clear how trusting the wrong source on one marginal occasion will corrupt my epistemology (doubly so if I'm refined enough to have a concept of my own epistemology). Taking epistemology as exogenous, there's really no cost to a marginal false belief (one that does not affect me directly).

Having a false belief about some fact that has no direct bearing on your life is way, way, way cheaper than publicly expressing belief in such a fact and being refuted. There seems to be nothing irrational about investing more energy fact-checking in the latter scenario.

Edit: Two things.

First, the turtles example was poorly chosen, as it blurs the line between belief and epistemology too much. Better examples would include, say, wrongly believing celebrity gossip, or holding incorrect beliefs about science with no practical bearing due to a lack of information or a misunderstanding of the alternatives. If the car mechanic believed Newton was totally right (because he hadn't seen evidence to the contrary), this would be a very, very low-cost false belief. Interestingly, "Barack Obama is a Muslim" probably falls under this category, though it blurs the epistemological line a bit more.

Second, it's also quite possible that you care more about misleading others than you do being wrong. It's easy enough to change your mind if you see contradictory evidence or reread the article and understand it better. It's rather harder to change other people's minds who have been convinced, and you'll feel like you've let them down as an authority, since they trusted you.

Comment author: Marcello 03 July 2009 12:56:54AM 4 points [-]

I'm not sure the cost of privately held false beliefs is as low as you think it is. The universe is heavily Causally Entangled. Even if, in your example, the shape of the earth isn't causally entangled with anything our mechanic cares about, that doesn't get you off the hook. A false belief can shoot you in the foot in at least two ways. First, you might explicitly use it to reason about the value of some other variable in your causal graph. Second, your intuition might draw on it as an analogy when you are reasoning about something else.

If our car mechanic thinks his planet is a disc supported atop an infinite pile of turtles, when this is in fact not the case, then isn't he more likely to conclude that other things which he may actually come into more interaction with (such as a complex device embedded inside a car which could be understood by our mechanic, if he took it apart and then took the pieces apart about five times) might also be "turtles all the way down"? If I actually lived on a disc on top of infinitely many turtles, then I would be nowhere near as reluctant to conclude that I had a genuine fractal device on my hands. If I actually lived in a world which was turtles all the way down, I would also be much more disturbed by paradoxes involving backward supertasks.

To sum up: False beliefs don't contaminate your belief pool via the real links in the causal network in reality; they contaminate your belief pool via the associations in your mind.

Comment author: Eliezer_Yudkowsky 02 July 2009 10:09:35PM 11 points [-]

All truth is not, indeed, of equal importance; but if little violations are allowed, every violation will in time be thought little.

-- Samuel Johnson

Comment author: Marcello 02 July 2009 11:14:31PM 14 points [-]

Anyone who doesn't take truth seriously in small matters cannot be trusted in large ones either.

-- Albert Einstein

Comment author: Marcello 02 July 2009 10:16:22PM 22 points [-]

Those who can make you believe absurdities can make you commit atrocities.

-- Voltaire

Comment author: PhilGoetz 12 March 2009 06:06:49AM 5 points [-]

Err... I actually toss around endorsements of "spirituality" in those contexts where doing so seems likely to have positive effects. Naive realism is a supernatural belief system anyway, just a subtler-than-average one. I'll invoke Einstein, Hume and Spinoza as precedents if you wish. Who do you think, by the way, is more likely to convince a theist to sign up for cryonics: a person who says "god is a stupid idea, this is the only way to survive death", or a person who says "I believe in god too, but I also believe in taking advantage of the best available medical technologies"? I'd accept a double-blind study showing that the former worked better, but it's not how I'd bet.
More importantly, I think that the canary function is more valuable than any harm caused by moderate Christianity, especially if combined with a possible vaccine function. Also, Sam Harris DOES talk about spirituality, and Dennett about free will. Finally, for what it's worth, we only have one data point for a scientific civilization rising, and it was in the religious West not the relatively secular China. Weak evidence, but still evidence.

Comment author: Marcello 12 March 2009 07:43:27AM 8 points [-]

Incidentally, I agree that using the term "spirituality" is not necessarily bad. Though, I'm careful to try to use it to refer to the general emotion of awe/wonder/curiosity about the universe. To me the word means something quite opposed to religion. I mean the emotion I felt years ago when I watched Carl Sagan's "Cosmos".... To me religion looks like what happens when spirituality is snuffed out by an answer which isn't as wonderfully strange and satisfyingly true as it could have been.

It's a word with positive connotations, and we might want to steal it. It would certainly help counteract the Vulcan stereotype.

Comment author: Marcello 12 March 2009 07:10:30AM 9 points [-]

Michael Vassar said:

Naive realism is a supernatural belief system anyway

What exactly do you mean by "supernatural" in this context? Naive realism doesn't seem to be anthropomorphizing any ontologically fundamental things, which is what I mean when I say "supernatural".

Now of course naive realism does make the assumption that certain assumptions about reality which are encoded in our brains from the get go are right, or at least probably right, in short, that we have an epistemic gift. However, that can't be what you meant by "supernatural", because any theory that doesn't make that assumption gives us no way to deduce anything at all about reality.

Now, granted, some interpretations of naive realism may wrongly posit some portion of the gift to be true, when in fact, by means of evidence plus other parts of the gift, we end up pretty sure that it's wrong. But I don't think this sort of wrongness makes an idea supernatural. Believing that Newtonian physics is absolutely true, regardless of how fast objects move is a wrong belief, but I wouldn't call it a supernatural belief.

So, what exactly did you mean?

Comment author: Baughn 11 March 2009 01:42:13PM *  2 points [-]

That's a bug of sorts in the script, and easily fixable. In fact, I've already done so; have an updated version.

Comment author: Marcello 11 March 2009 07:06:27PM 1 point [-]

Your version is now the official version 0.3. However, the one thing you changed (possibly unintentionally) was to make Kibitzing default to on. I changed it back to defaulting to off, because it's easy to click the button if you're curious, but impossible to un-see who wrote all the comments, if you didn't want to look.

Comment author: Eliezer_Yudkowsky 09 March 2009 09:10:28PM 1 point [-]

Well, to kibitz the anti-kibitzing, it looks to me like:

by <a href="http://lesswrong.com/user/Marcello"><strong>Marcello</strong></a>

would match pretty easily against something that looked for

by <a href="http://lesswrong.com/user/(.+)"><strong>(.+)</strong></a>

and deleted it; similarly on the Recent Posts, but without the <strong>. Checking that the two matched strings are identical is optional (I forget how to do this offhand with REs).

Comment author: Marcello 09 March 2009 09:46:37PM 3 points [-]

That particular hack looks like a bad idea. What if somebody actually put a bold-face link into a post or comment? However, your original suggestion wasn't as bad. All non-relative links to user pages get blocked by the anti-kibitzer. (Links in "Top contributors" and links inside comments seem to be turned into relative links if they point inside LW.) It's gross, but it works.

Version 0.2 is now up. It hides everything except the point-counts on the recent posts (there was no tag around those.) (Incidentally, I don't have regular expressions because by the time my script gets its hands on the data, it's not a string at all, but a DOM-tree. So, you'd have to specify it in XPath.)

I think trying to do any more at this point would be pointless. Most of the effort involved in getting something like this to be perfect would be gruesome reverse engineering, which would all break the minute the site maintainers change something. The right thing to do(TM) would be to get the people at Tricycle to implement the feature (I hereby put the code I wrote into the public domain, yada yada.) Then we don't have to worry about having to detect which part of the page something belongs to because the server actually knows.
