Comment author: Vaniver 02 December 2013 08:12:55PM *  8 points [-]

MORPHEUS: You take the blue pill and the story ends. You wake in your bed and you believe whatever you want to believe.

MORPHEUS: You take the red pill and you stay in Wonderland and I show you how deep the rabbit-hole goes.

From The Matrix Original Script (the wording is slightly different in the movie).

Comment author: Lethalmud 06 December 2013 03:40:25PM 16 points [-]

As a side note, never take pills from strange people in empty warehouses who found you on the internet.

Comment author: Shield 28 September 2013 11:22:18AM 6 points [-]

Are you sure that "anti placebo effect" is a good name though? The placebo effect refers exclusively to medical treatment if I'm not entirely mistaken, and this seems to have much broader implications in basically any sort of training. It's still basically the same effect if someone refuses to notice the progress they made with, say, tutoring, but it has nothing to do with medicine or treatment.

Seems a bit misleading.

Comment author: Lethalmud 30 September 2013 12:33:17PM 0 points [-]
In response to comment by ciphergoth on Where are we?
Comment author: steven0461 03 April 2009 12:21:10PM 0 points [-]

Post in this thread if you live in the Netherlands.

In response to comment by steven0461 on Where are we?
Comment author: Lethalmud 04 July 2013 03:08:35PM 0 points [-]

Present!

Comment author: Eliezer_Yudkowsky 14 September 2007 11:39:31PM 8 points [-]

Um, there are readers of this blog, and there are people who enjoy the "happiness of stupidity" (which is not the same as just having a low IQ; it involves other personality traits as well). I don't think there's much overlap between those two groups. But they are far from being the only two groups in the world, and there is no dichotomy between them.

Comment author: Lethalmud 04 July 2013 10:34:57AM 2 points [-]

This is interesting. When I first discovered LW, I was reading The Praise of Folly by Erasmus. He argues, among other things, that all emotions and feelings that make life worthwhile are inherently embedded in stupidity. Love, friendship, optimism, and happiness require foolishness to work. Now it is very hard to compare a sixteenth-century satirical piece with a current rational argument, but I have observed that intelligence and stupidity don't seem to be mutually exclusive. From where comes your assumption that intelligent, rational people can't be stupid? Emotions don't tend to be rational, and in the force of a strong one like love even the most intelligent and rational person can turn into an optimistic fool, sure that their loved one is infinitely more trustworthy than the average human, and that statistics on adultery don't apply in this case. Should you try to overcome the bias of strong emotions? Can you overcome it at all? I have never seen someone immune to it. So maybe the happiness of stupidity is still available to all of us.

In response to Changing Emotions
Comment author: Doug_S. 05 January 2009 06:09:38AM 6 points [-]

frelkins: Well, Ranma isn't Tiresias. The Ranma 1/2 manga was written by a woman, if that changes anything.

Here's a little bit of silliness. Inquest Gamer magazine once ran a poll asking people to choose between various (silly) options of which horrible fate they would prefer to endure. One was a choice between "Randomly change the Magic rules each time you create a killer deck" and "Randomly change your gender each time you go to sleep." "Gender" won by a large margin.

In response to comment by Doug_S. on Changing Emotions
Comment author: Lethalmud 28 June 2013 08:46:58AM -1 points [-]

That is an awful fate. RIP mana burn deck...

Comment author: Eliezer_Yudkowsky 02 September 2007 11:01:30PM 9 points [-]

I say "must" in the Worship option. It is irony.

But if there is an infinite regress of causality, I should find that highly curious, and would like to hear Explained why it is allowed, and why this infinite regress exists rather than some other one.

Comment author: Lethalmud 27 June 2013 11:05:20AM 0 points [-]

I don't understand why you assign a lower probability to the possibility of an infinite regress of causality than to the possibility of a non-causal event or a causal loop.

Comment author: Will_Newsome 21 June 2012 10:17:15PM 8 points [-]

Furthermore at least one person I know (er, myself) picks up on any sort of test-like or game-like or we're-judging-you-so-you-better-not-screw-up-like context and starts acting in extremely confusing/uninformative/atypical/misleading ways so as not to be seen as the kind of person who is easily manipulable (there are probably other motivations involved too). Any incentive structure I'm put under thus has to somehow take this into account, even e.g. the LessWrong karma system. Explicitly manipulative socially mediated praise/M&Ms would strike my brain as outright evil and would stand some chance of being inverted entirely. That said I don't get the impression this sort of defense mechanism is very common.

Comment author: Lethalmud 26 June 2013 02:45:51PM *  4 points [-]

So you are saying that, to change your mode of behavior, all one has to do is create a judging context? That would actually make you very easy to manipulate...

Comment author: Luke_A_Somers 18 June 2013 04:56:41PM 3 points [-]

I think that it's mainly that time travel plots seem a lot harder to write than they are.

Comment author: Lethalmud 26 June 2013 09:06:31AM 6 points [-]

Or, that because most time travel in popular media makes no sense whatsoever, people assume it must be very difficult.

In response to Infinite Certainty
Comment author: Paul_Gowder 09 January 2008 08:19:15AM 4 points [-]

If you get past that one, I'll offer you another.

"There is some entity [even if only a simulation] that is having this thought." Surely you assign a probability of 1 to that. Or you're going to have to answer to Descartes's upload, yo.

Comment author: Lethalmud 25 June 2013 12:54:51PM 0 points [-]

Well, maybe you fell asleep halfway through that thought, and thought the second half after you woke, without noticing you had slept.

Comment author: Eliezer_Yudkowsky 28 September 2007 12:38:57AM 29 points [-]

At any rate, if the former is true, 2+2=4 is outside the province of empirical science, and applying empirical reasoning to evaluate its 'truth' is wrong.

When I imagine putting two apples next to two apples, I can predict what will actually happen when I put two earplugs next to two earplugs, and indeed, my mind can store the result in a generalized fashion which makes predictions in many specific instances. If you do not call this useful abstract belief "2 + 2 = 4", I should like to know what you call it. If the belief is outside the province of empirical science, I would like to know why it makes such good predictions.

To apply the same reasoning the other way, if you aren't a Christian, what would be a situation which would convince you of the truth of Christianity?

You'd have to fix all the problems in belief, one by one, by reversing the evidence that originally convinced me of the beliefs' negations. If the Sun stopped in the sky for a day, and then Earth's rotation restarted without apparent damage, that would convince me there was one heck of a powerful entity in the neighborhood. It wouldn't show the entity was God, which would be much more complicated, but it's an example of how one small piece of my model could be flipped from the negation of Christianity (in that facet) to the non-negation.

Getting all the pieces of the factual model (including the parts I was previously convinced were logically self-contradictory) to align with Christianity's factual model, would still leave all the ethical problems. So the actual end result would be to convince me that the universe was in the hands of a monstrously insane and vicious God. But then there does not need to be any observable situation which convinces me that it is morally acceptable to murder the first-born children of Egyptians - morality does not come from environmental entanglement.

Comment author: Lethalmud 11 June 2013 01:42:18PM *  0 points [-]

Eliezer, do you believe that someday humans could create an AI and put that AI in a simulated environment that accurately simulated all the observations humanity has made until now?

If you do, what further observations would that AI have to make to arrive at the belief that they were created by an intelligent entity?
