XiXiDu comments on A belief propagation graph - Less Wrong

8 Post author: Dmytry 10 May 2012 04:23AM




Comment author: XiXiDu 12 April 2012 11:39:07AM 0 points

I do take some ideas very seriously. If we had a method of rationality for computationally bounded agents, it would surely do the same. Do you think I've taken the wrong ideas too seriously, or have spent too much time thinking about ideas generally? Why?

This comment of yours, among others, gave me the impression that you take ideas too seriously.

You wrote:

According to the article, the AGI was almost completed, and the main reason his effort failed was that the company ran out of money due to the bursting of the bubble. Together with the anthropic principle, this seems to imply that Ben is the person responsible for the stock market crash of 2000.

This is fascinating, for sure. But if you have a lot of confidence in such reasoning, then I believe you do take ideas too seriously.

I agree with the rest of your comment and recognize that my perception of you was probably flawed.

Comment author: Wei_Dai 12 April 2012 05:48:12PM 6 points

Yeah, that was supposed to be a joke. I usually use smiley faces when I'm not being serious, but thought the effect of that one would be enhanced if I "kept a straight face". Sorry for the confusion!

Comment author: XiXiDu 12 April 2012 06:51:49PM * 1 point

Yeah, that was supposed to be a joke.

I see, my bad. Until now I believed I was usually pretty good at detecting when someone is joking. But given what I have encountered on Less Wrong in the past, including serious treatments and discussions of the subject, I thought you actually meant what you wrote there. Although now I am not so sure anymore whether people were actually serious on those other occasions :-)

I am going to send you a PM with an example.

Under normal circumstances I would actually regard the following statements by Ben Goertzel as sarcasm:

Of course, this faith placed in me and my team by strangers was flattering. But I felt it was largely justified. We really did have a better idea about how to make computers think. We really did know how to predict the markets using the news.

or

We AI folk were talking so enthusiastically, even the businesspeople in the company were starting to get excited. This AI engine that had been absorbing so much time and money, now it was about to bear fruit and burst forth upon the world!

I guess what I have encountered here skewed my judgement by making me suppress the absurdity heuristic too far.

Comment author: Wei_Dai 12 April 2012 11:06:43PM 4 points

But given what I have encountered on Less Wrong in the past, including serious treatments and discussions of the subject, I thought you were actually meaning what you wrote there.

The absurd part was supposed to be that Ben actually came close to building an AGI in 2000. I thought it would be obvious that I was making fun of him for being grossly overconfident.

BTW, I think some people around here do take ideas too seriously, and the reports of nightmares probably weren't jokes. But then I probably take ideas more seriously than the average person, and I don't know on what grounds I can say that they take ideas too seriously whereas I take them just seriously enough.

Comment author: steven0461 14 April 2012 12:46:00AM * 1 point

some people around here do take ideas too seriously ... I don't know on what grounds I can say that

If you ever gain a better understanding of the grounds on which you're saying it, I'd definitely be interested. It seems to me that insofar as there are negative mental health consequences for people who take ideas seriously, these would be mitigated (and amplified, but more mitigated than amplified) if such people talked to each other more. That, however, is made more difficult by the risk that some XiXiDu type will latch onto something they say and cause damage by responding with hysteria.

One could construct a general argument of the form, "As soon as you give me an argument for why I shouldn't take ideas seriously, I can just include that argument in my list of ideas to take seriously." It's unlikely to be quite that simple for humans, but it is still worth stating.

Comment author: cousin_it 12 April 2012 12:10:26PM 4 points

I'm pretty sure the bit about the stock market crash was a joke.

Comment author: wedrifid 12 April 2012 04:10:18PM 3 points

To be fair, I think Wei_Dai was being rather whimsical with respect to the anthropic tangent!