Comment author: Eliezer_Yudkowsky 20 August 2010 03:24:38AM 26 points [-]

Er... I can't help but notice a certain humor in the idea that it's terrible if I'm self-deluded about my own importance because that means I might destroy the world.

Comment author: John_Baez 20 August 2010 11:02:50AM 5 points [-]

It's some sort of mutant version of "just because you're paranoid doesn't mean they're not out to get you".

Comment author: cousin_it 17 August 2010 07:55:41PM *  12 points [-]

That was... surprisingly surprising. Thank you.

For reasons like those you listed, and also out of some unverbalized frustration, in the last week I've been thinking pretty seriously whether I should leave LW and start hanging out somewhere else online. I'm not really interested in the Singularity, existential risks, cognitive biases, cryonics, un/Friendly AI, quantum physics or even decision theory. But I do like the quality of discussions here sometimes, and the mathematical interests of LW overlap a little with mine: people around here enjoy game theory and computability theory, though sadly not nearly as much as I do.

What other places on the Net are there for someone like me? Hacker News and Reddit look like dumbed-down versions of LW, so let's not talk about those. I solved a good bit of Project Euler once; the place is tremendously enjoyable but quite narrowly focused. The n-Category Cafe is, sadly, coming to a halt. Math Overflow looks wonderful and this question by Scott Aaronson nearly convinced me to drop everything and move there permanently. The Polymath blog is another fascinating place that is so high above LW that I feel completely underqualified to join. Unfortunately, none of these are really conducive to posting new results, and moving into academia IRL is not something I'd like to do (I've been there, thanks).

Any other links? Any advice? And please, please, nobody take this comment as a denigration of LW or a foot-stomping threat. I love you all.

Comment author: John_Baez 19 August 2010 07:58:44AM *  15 points [-]

My new blog "Azimuth" may not be mathy enough for you, but if you like the n-Category Cafe, it's possible you may like this one too. It's more focused on technology, environmental issues, and the future. Someday soon you'll see an interview with Eliezer! And at some point we'll probably get into decision theory as applied to real-world problems. We haven't yet.

(I don't think the n-Category Cafe is "coming to a halt", just slowing down - my change in interests means I'm posting a lot less there, and Urs Schreiber is spending most of his time developing the nLab.)

Comment author: Sebastian_Hagen 15 March 2009 05:01:40PM *  2 points [-]

When you meet a god, how can you be sure it's not a hallucination?

Assuming the entity in question is cooperative, try this:

Ask it if P=NP is true, and for a proof of its answer in a form that you can easily understand. There are three possible outcomes:

  • It doesn't comply. Time to get suspicious about its claims to godhood.
  • It hands you a correct proof, beautifully elegant and easy to grasp.
  • It hands you a lump of nonsense, which your mind is too damaged to distinguish from a proof.

If you get something that appears like an elegant proof, memorize it and recheck it every now and then. If your mind is sufficiently malfunctioning that it can't distinguish an elegant proof for P=NP from something that isn't, you may not be able to notice that from inside. There's still a chance whatever is afflicting you will get better over time; hence, do periodic rechecks, and pay particular attention to any nagging doubts about the proof you get while performing those.

In the meantime, interpret the fact that you've gotten an apparent proof as significant evidence for the entity in question being real and very powerful.
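The test above trades on the asymmetry that makes P vs NP a natural probe: checking a purported solution to an NP problem is fast, even when finding one may not be. A minimal sketch of that asymmetry (the function name and example formula are my own illustration, not from the thread):

```python
# Toy illustration of NP verification: a candidate satisfying assignment
# for a Boolean formula in CNF can be checked in time linear in the
# formula's size, even though *finding* such an assignment may be hard.

def verify_sat(clauses, assignment):
    """Check whether `assignment` satisfies every clause.

    Each clause is a list of ints: a positive i means variable i,
    a negative i means the negation of variable i. `assignment` maps
    variable numbers to booleans.
    """
    return all(
        any((lit > 0) == assignment[abs(lit)] for lit in clause)
        for clause in clauses
    )

# (x1 OR NOT x2) AND (x2 OR x3)
clauses = [[1, -2], [2, 3]]
certificate = {1: True, 2: False, 3: True}
print(verify_sat(clauses, certificate))  # True: easy to check, hard to find
```

In the same spirit, a proof of P=NP (or its negation) handed to you by the entity plays the role of a short certificate: cheap to recheck periodically, even if you could never have produced it yourself.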

Comment author: John_Baez 03 May 2010 12:03:38AM 6 points [-]

Or: it says "This is undecidable in Zermelo-Fraenkel set theory plus the axiom of choice." In the case of P=NP, I might believe it.

Ask again, with another famously unsolved math problem. Repeat until it stops saying that or you run out of problems you know.

I would not believe a purported god if it said all the remaining Clay math prize problems are undecidable.

Comment author: Aurini 19 March 2009 05:06:10AM 0 points [-]

I apologize for banging on about the railroad question, but I think the way you phrased it does an excellent job of illustrating (and has helped me isolate) why I've always been vaguely uncomfortable with Utilitarianism. There is a sharp moral contrast, which the question doesn't innately recognize, between the patients entering into a voluntary lottery and the forced sacrifice of the wandering traveller.

Unbridled Utilitarianism, taken to the extreme, would mandate some form of forced Socialism. I think it was you who commented on OvercomingBias that one of the risks associated with cryonics is waking up in a society where you are not permitted to auto-euthanize. Utilitarianism might argue that the utility of your own diminished suffering would be less than the utility other people derive from valuing your continued life.

While Utilitarianism is excellent for considering consequences, I think it's a mistake to try and raise it as a moral principle. I lean towards a somewhat Objectivist viewpoint: namely, that the first principle we ought to start with is that each person has the right to their own person and property, and that it is immoral to try and take it from them for any cause.

Following from this, let me address your third question: I'd argue that this type of wealth transfer not only undermines the long-term economic development of the African country (empirical, I could be proved wrong), not only prevents me from spending money on quality products & investing in practical businesses (once again, empirical), but on a deeper level undermines the individuality which I value in the human condition. Asking which produces greater happiness & material wealth, Communism or Capitalism, is an empirical question: Omega could come down and tell me that Communism will produce 10x the happiness, or 100x, or whatever. But the idea of slamming everybody into the same mass-produced box to maximize happiness utility sounds suspiciously like Orgasmium.

I don't see how you can compromise on these principles. Either each person has full ownership of themselves (so long as they don't infringe on others), or they have zero ownership. Morality (as I would define it) demands that we fight to protect others' freedom, but it says nothing about ensuring their welfare. Giving something for 'free' is just another form of enslavement - even if it's only survival and dependence in exchange for a smug sense of superiority.

On a side note, you did a brilliant job of deconstructing 'morality based on empiricism.'

Comment author: John_Baez 02 May 2010 11:59:27PM 16 points [-]

Unbridled Utilitarianism, taken to the extreme, would mandate some form of forced Socialism.

So maybe some form of forced socialism is right. But you don't seem interested in considering that possibility. Why not?

While Utilitarianism is excellent for considering consequences, I think it's a mistake to try and raise it as a moral principle.

Why not?

It seems like you have some pre-established moral principles which you are using in your arguments against utilitarianism. Right?

I don't see how you can compromise on these principles. Either each person has full ownership of themselves (so long as they don't infringe on others), or they have zero ownership.

To me it seems that most people making difficult moral decisions make complicated compromises between competing principles.

View more: Prev