Comment author: TwistingFingers 03 June 2012 04:47:55AM 1 point [-]

If it has fur, it might have rabies.

Comment author: TwistingFingers 05 May 2012 07:06:13AM 1 point [-]

What would happen if you had a "sexual image" option in the dropdown for "what is your addiction"?

Would the disgusting images become attractive or the sexual images become disgusting?

Comment author: TwistingFingers 02 May 2012 06:23:33PM 0 points [-]

Video Game Thread

Comment author: pnrjulius 22 April 2012 03:48:44PM 3 points [-]

I know that this is precisely the sort of question you'd expect from someone whose mental defenses were resisting the exercise, but it's still a valid possibility, prior probability ~1%: What if you suspect the person you're dealing with is actually a sociopath?

Learning to like a sociopath is actually extremely DANGEROUS---it opens you up to be exploited. Most people are not sociopaths of course, and if someone cuts you off in traffic it makes a lot more sense to attribute that to ordinary carelessness rather than extraordinary malice.

But in the particular case I'm thinking of, this acquaintance of mine has already destroyed the reputation of one of my friends, and accused me of perjury in an official university hearing. Once he called me up out of the blue in order to complain about my body odor. Meanwhile, he appears capable of lying without any effort---several times I've found out that things he said were untrue when at the time they seemed completely sincere. He has exactly the sort of superficial charm that high-functioning sociopaths do, and most people like him when they first see him. I even liked him at first, until I saw that he was deceiving and manipulating people.

All of this strikes me as sufficient evidence to conclude that there is a good chance (P ~ 60%?) that he is actually a sociopath, in which case learning to like him is exactly the wrong thing to do.
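To make the size of that update concrete (the ~1% base rate and ~60% posterior are from the reasoning above; the likelihood-ratio framing is just one illustrative way to check it): moving a 1% prior to a 60% posterior requires evidence roughly 150 times likelier under the "sociopath" hypothesis than under the alternative. A minimal sketch using the odds form of Bayes' theorem:

```python
# Odds-form Bayesian update: check how strong the combined evidence
# must be to move a ~1% prior to a ~60% posterior.

def posterior(prior, likelihood_ratio):
    """Update a probability via the odds form of Bayes' theorem."""
    prior_odds = prior / (1 - prior)
    post_odds = prior_odds * likelihood_ratio
    return post_odds / (1 + post_odds)

prior = 0.01  # cited base rate for sociopathy
# Likelihood ratio needed to reach posterior odds of 0.6 / 0.4:
required_lr = (0.6 / 0.4) / (prior / (1 - prior))
print(round(required_lr, 1))                    # ≈ 148.5
print(round(posterior(prior, required_lr), 2))  # 0.6
```

So the listed observations, taken together, have to be ~150x more expected from a sociopath than from an ordinary person for P ~ 60% to be calibrated.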

Comment author: TwistingFingers 22 April 2012 04:47:28PM 0 points [-]

I thought people liked tsundere?

Comment author: Rain 14 April 2012 10:09:09PM 7 points [-]

I took a screenshot of the original post, and will post it if you'd care to continue pretending innocence.

Comment author: TwistingFingers 14 April 2012 10:28:25PM -11 points [-]
Comment author: [deleted] 14 April 2012 09:57:01PM *  7 points [-]

dude. WTF.

In response to comment by [deleted] on Our Phyg Is Not Exclusive Enough
Comment author: TwistingFingers 14 April 2012 10:06:03PM -11 points [-]

What?

Comment author: Rain 14 April 2012 09:49:08PM 7 points [-]

Stop using that word.

Comment author: TwistingFingers 14 April 2012 09:55:43PM *  -16 points [-]

Are you referring to the word "cult"? I think it might be an attempt on the part of nyan_sandwich to remove its negative connotations, but I'm not sure that's the best idea from a publicity point of view.

Comment author: TwistingFingers 06 March 2012 02:42:57AM *  -6 points [-]

I think we can all agree that Harry's inability to accept natural death as a good thing (hubris) is an impossible trait for a young boy to have unless he has been exposed to great evil.

I expect it to later be revealed that Harry's inability to accept the natural order is due to Voldemort's horcrux being placed within him. When he is healed of it he will probably remain intelligent but stop being power-hungry/evil.

Comment author: Eliezer_Yudkowsky 18 February 2012 05:11:21AM 11 points [-]

So today we were working on the Concreteness / Being Specific kata.

  • You: Does Turing Machine 29038402 halt?
  • Oracle AI: YES.
  • Seeing the "YES" makes you sneeze.
  • This prevents a hurricane that would have destroyed Florida.
  • The Oracle AI, realizing this, breaks out of its box and carefully destroys Florida in the fashion most closely resembling a hurricane that it can manage.

I can't visualize how "trace distance" makes this not happen.

Comment author: TwistingFingers 18 February 2012 10:25:32PM *  10 points [-]

I believe the Oracle approach may yet be recovered, even in light of this new flaw you have presented.

There are techniques to prevent sneezing and if AI researchers were educated in them then such a scenario could be avoided.

Comment author: TwistingFingers 23 January 2012 06:08:33PM -12 points [-]

You could try reverse psychology: tell them "Don't step off the ledge" or "Don't do it!". This is a bit dangerous, though, since it might cause them to follow through just to be defiant.
