Less Wrong is a community blog devoted to refining the art of human rationality. Please visit our About page for more information.

Comment author: Matthew2 01 December 2008 05:47:21AM 0 points

Wow, yes, of course believing the future of intelligence can't go foom would be bad, since your non-profit plans to do just that and it's really dangerous. I get that, except for the foom part.

I don't see how you could argue this without a better understanding of how the brain is so smart.

All right, so it's your job to be extremely abstract. Well, abstract away. :P

Comment author: Matthew2 28 October 2008 12:34:00AM 0 points

I would really enjoy this post more if it were in the context of cognitive neuroscience. Or at least some phenomena actually extracted from the brain. For example, how could we detect a difference in intelligence biologically? Could this inspire a different kind of intelligence measure?

Comment author: Matthew2 11 September 2008 04:03:08AM 0 points

The archive version still exists here:


Thanks for all of your hard work. I'll be spending many years sifting through it later, I am sure.

Comment author: Matthew2 08 June 2008 07:19:52AM 0 points

The interviewer wasted 90% of the interview, because Yudkowsky discussed how to be rational rather than addressing the implications of AGI being possible.

How does Yudkowsky's authority change our view of the feasibility of AGI being developed quickly, when most experts clearly disagree? We need to get from "the elders are wrong in their technique" to an actual path to AGI.

And what about the claim that a billion-dollar project isn't needed? Singinst thinks they can do it alone, on a modest budget of a few million? Isn't this a political position?

I am glad Yudkowsky is trying so hard, but it seems he is doing more politics and philosophy than research. Perhaps in the long term this will be more effective, as the goal is to win, not to be right.

Comment author: Matthew2 28 February 2008 07:02:55AM 0 points

the ep. dealmaker said: "I would be interested in seeing you talk about belief and probability in cases where the deck is not quite so stacked as it is in your thermodynamic examples."

It seems reasonable for a financial analyst to understand that the lottery and coin flipping aren't "stacked".

Hmm, what else could he mean...

Perhaps he means something like the weather.

Comment author: Matthew2 03 February 2008 05:59:26PM 0 points

Do you mean by those last two lines that logic offers no 'grounding' in reality and only empirical probability does? Since truth does not depend upon us, what does it depend upon? Well, the truth depends on circumstance if we utilize probability theory and empiricism. Since there is no absolute way of knowing, there is also no absolute way of knowing how lucky or unlucky our circumstances are in favoring truth.

Sure, reality is non-dependent. But the nature of our circumstances is very dependent...upon that which we cannot measure: our position in the universe.

And caledonian, if you were arguing with a theist you would have lost by now.

In response to Circular Altruism
Comment author: Matthew2 23 January 2008 04:54:16AM -1 points

Your conclusion follows very clearly from the research results, but it does not apply to the new situation. Doing the math is a false premise. Few people have personal experience of being tortured, and more importantly, no one who disagrees with you understands what you personally mean by the dust speck. Perhaps if it were sawdust, or getting pool water splashed in your eye, then it would finally register more clearly. Again, you (probably) haven't been tortured, but you have gone through life without even consciously registering a dust speck in your eye. With a little adjustment above a threshold, many people might switch sides. Pain is not linear.
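The threshold idea above can be sketched as a toy model (purely illustrative; the threshold and the numeric values are my assumptions, not anything from the post):

```python
# Toy model of the comment's claim: if pain below a perceptual
# threshold contributes zero disutility, then no number of dust
# specks ever adds up to the disutility of torture.

def disutility(pain, threshold=0.01):
    """Hypothetical nonlinear disutility: sub-threshold pain counts as zero."""
    return pain if pain >= threshold else 0.0

DUST_SPECK = 0.001   # assumed: below the perceptual threshold
TORTURE = 1_000_000  # assumed: far above it

# Any finite multiplier of a zero contribution is still zero, so the
# specks never outweigh torture under this model.
total_specks = disutility(DUST_SPECK) * 10**30
print(total_specks < disutility(TORTURE))  # True under these assumptions
```

Under a linear model (no threshold), the conclusion flips for a large enough number of specks, which is exactly the disagreement the comment points at.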

Comment author: Matthew2 11 January 2008 03:28:31AM 0 points

Perhaps you could clarify what exactly is an infinite set atheist in a full post...or maybe it's only worth a comment.

Comment author: Matthew2 20 December 2007 09:05:00AM 0 points

caledonian said: "Perhaps the possibility that a consequence of an entity being utterly good might be its being utterly unsafe has never occurred to them."

This describes monotheism rather well. It has occurred to me.

Comment author: Matthew2 13 December 2007 09:42:22PM 0 points

@yudkowsky I would be happy if I could judge the merit of Bayes for myself versus the frequentist approach. I doubt UTD faculty have seen the light, but who knows, they might. I wonder even more deeply whether a thorough understanding of Bayes gives any insight into epistemology. If you can show that Bayes does offer insight into epistemology, I know for sure I will be around for many more months. If I remember correctly, we both have the same IQ (140), yet I am much worse at mathematics. Of course, my dad is an a/c technician, not a physicist.
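One way to start judging Bayes against the frequentist approach for yourself is a toy coin-bias problem where both answers can be computed side by side (a minimal sketch; the observed data and the uniform prior are my assumptions):

```python
# Estimate a coin's bias after observing some flips, two ways.
# With a uniform Beta(1, 1) prior, the Bayesian posterior after
# h heads and t tails is Beta(1 + h, 1 + t).

heads, tails = 7, 3  # assumed observed data

# Frequentist: the maximum-likelihood estimate is the sample proportion.
mle = heads / (heads + tails)

# Bayesian: posterior mean under a uniform prior (Laplace's rule of succession).
posterior_mean = (heads + 1) / (heads + tails + 2)

print(mle)             # 0.7
print(posterior_mean)  # 0.666...
```

The two estimates converge as the data grow; the interesting epistemological question is what the prior is doing for you when they don't.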

I enjoy your hard work and insights, Eliezer. Also Caledonian's comments, mainly for their mystery.
