Comment author: Eliezer_Yudkowsky 02 January 2011 07:50:37PM 13 points

> General AI will not be made in 2011. Confidence: 90%.

I hope to hell you're underconfident about that.

Comment author: Matt_Stevenson 04 January 2011 08:58:43AM 2 points

Would you classify MC-AIXI as a General AI?

Comment author: Perplexed 23 October 2010 07:48:59PM 5 points

But physicists don't ignore friction when performing experiments; they do so only in teaching. If philosophers used trolley problems only to teach ethics ("Push one fat philosopher onto the tracks, to save two drug addicts.") or to teach metaethics ("An adherent of virtue ethics probably wouldn't push"), then I doubt that lionhearted would complain.

But we have psychologists using trolley problems to perform experiments (or, if from Harvard, to publish papers in which they claim to have conducted experiments). That is what I understand lionhearted to be objecting to.

Comment author: Matt_Stevenson 23 October 2010 10:54:42PM 0 points

I think a better example than frictionless surfaces and no air resistance would be idealized symmetries. Once something like Coulomb's Law was postulated, physicists would imagine the implications of charges on infinite wires and planes to make interesting predictions.

We use the trolley problem and its variations as thought experiments in order to make predictions we can test further with MRIs and the like.

So a publication on interesting trolley problem results would be like a theoretical physics paper showing that relativity predicts some property of black holes.

Comment author: lionhearted 23 October 2010 06:53:30AM *  2 points

> I think you are looking at the Trolley Problem out of context.

I understand the supposed purpose of trolley problems, but I think they're conducive to poor quality thinking nonetheless.

> They don't offer solutions to moral questions, they highlight the problems.

Right, but I think there are better ways of going about it. I wanted to keep the post brief and information-dense so I didn't list alternative problems, but there are a number you could use based in real history. For instance, a city is about to be lost in war, and the military commander is going through his options. Do you order some men to stay behind and fight to the death to cover the retreat of the others, ask for volunteers to do it, or draw lots? Do you try to have everyone retreat, even though you think there's a larger chance your whole force could be destroyed? If some defenders stay, does the commander lead the sacrificial defensive force himself or lead the retreat? Etc, etc.

That sort of example would include imperfect information, secondary effects, human nature, and many different options. I think trolley problems are constructed so poorly that they're conducive to poor quality thought. There's plenty of examples you could use to discuss hard choices that don't suffer from those problems.

Comment author: Matt_Stevenson 23 October 2010 07:22:36AM *  9 points

I would compare the trolley problem to a hypothetical physics problem. Just like a physicist will assume a frictionless surface and no air resistance, the trolley problem is important because it discards everything else. It is a reductionist attempt at exploring moral thought.

Comment author: Matt_Stevenson 23 October 2010 06:30:22AM 8 points

I think you are looking at the Trolley Problem out of context.

The Trolley Problem isn't supposed to represent a real-world situation. It's a simplified thought experiment designed to illustrate the variability of morality in slightly differing scenarios. Trolley problems don't offer solutions to moral questions; they highlight the problems.

Comment author: Vladimir_Nesov 08 October 2010 10:26:44AM 6 points

He refuses to teach the advanced kind, and expects she can't perform the regular kind in any case (which she tried). Draco couldn't learn the regular Patronus because, being a Slytherin, he never made an honest effort.

Comment author: Matt_Stevenson 09 October 2010 12:57:08AM 0 points

Didn't Harry also swear to keep what he and Draco experimented with secret? That's why he never told her about the magic gene either, unless I am misremembering things.

Comment author: knb 08 October 2010 12:18:43AM *  1 point

What was the Interdict of Merlin again? I googled but none of the links were defining, just referencing.

If Vold. is in his head, it isn't visible; Quirrell no longer wears the turban he used to hide it in canon.

Comment author: Matt_Stevenson 08 October 2010 04:58:11AM 4 points

From Ch. 23

> There's something called the Interdict of Merlin which stops anyone from getting knowledge of powerful spells out of books, even if you find and read a powerful wizard's notes they won't make sense to you, it has to go from one living mind to another

Comment author: cousin_it 21 April 2010 09:25:17AM *  30 points

What Thomas Schelling would do. Partly tongue-in-cheek.

The Clumsy Game-Player: agree to the deal, then perform an identical "finger slip" several turns later.

The Lazy Student, The Grieving Student, The Sports Fan: make the deadline for reports a curve instead of a cliff. Each day of delay costs some percentage of the grade.

The Murderous Husband: if you really don't want these things to happen, make the wife partially responsible for the murder in such cases, by law. (Or the lover, if the husband chooses to murder the wife.)

The Bellicose Dictator: publicly threaten sanctions unless the invading army withdraws immediately. Do this before any negotiations.

The Peyote-Popping Native, The Well-Disguised Atheist: when the native first comes to you, offer to balance out the permission to smoke peyote with some sanction against the Native American church. Then the atheists won't bother asking for a free lunch.

Comment author: Matt_Stevenson 22 April 2010 11:54:38PM *  5 points

> The Lazy Student, The Grieving Student, The Sports Fan: make the deadline for reports a curve instead of a cliff. Each day of delay costs some percentage of the grade.

I've always liked the "drop the n lowest scores" strategy. For example, 10 assignments given with the lowest 2 scores ignored.

You are pre-committing to a set of rules under which any given excuse has a much lower probability of being true. Since the two lowest scores are already forgiven, a student asking for a further exception effectively needs three excuses, and combining the probabilities of all three being genuine will likely bring the total under your acceptable threshold. Basically, it lowers the likelihood that you will want to violate the rules.

You can also look at it like this. Your model of people predicts that they are scoundrels and will try to violate the rules, maximizing their utility at your expense. So build a system where procrastinators can maximize their utility at no expense to you.
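A minimal sketch of that "drop the n lowest" policy (the function name and the scores are hypothetical, just for illustration):

```python
def final_grade(scores, n_dropped=2):
    """Average the scores after discarding the n_dropped lowest."""
    if n_dropped >= len(scores):
        raise ValueError("cannot drop every score")
    kept = sorted(scores)[n_dropped:]  # discard the n_dropped lowest
    return sum(kept) / len(kept)

# A student who misses two of ten assignments outright loses nothing:
scores = [90, 85, 0, 88, 92, 0, 79, 95, 84, 91]
print(final_grade(scores))  # 88.0, the average of the best 8 scores

# Three independent excuses, each (say) 30% likely to be genuine,
# are jointly only ~2.7% likely to all be true:
print(round(0.3 ** 3, 3))
```

The pre-commitment point falls out of the last two lines: the rule forgives two misses automatically, and the chance that a third miss is backed by a genuinely true excuse shrinks multiplicatively.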

Comment author: Matt_Stevenson 20 April 2010 01:46:50AM *  6 points

Hi, I'm Matt Stevenson. 24 yr old computer scientist. I work on AI, machine learning, and motor control at a small robotics company.

I was hooked when I read Eliezer's posts on OvercomingBias about AGI, Friendly AI, the Singularity, etc.

I'd like to comment (or post) more, but I would need to revisit a few of the older posts on decision theory to feel like I'm making an actual contribution (as opposed to guessing the karma password). A few more hours in the day would be helpful.

Comment author: FAWS 11 March 2010 02:56:59AM *  8 points

It's not obvious to me that your friend was actually lying in the strict sense. He may have made that statement because it "felt right" without really considering its truth. He might have heard a lot of news about executions in the USA but proportionally much less about executions in China, based the opinion that there are more executions in the USA on that emotional impression without ever actually thinking it through, and then in the heat of the debate the statement just sort of slipped out of him. Of course I don't know whether anything like this was the case, but it strikes me as more likely than blatant lying.

I assume you haven't actually asked him why he was lying or you would have mentioned it in the post.

Comment author: Matt_Stevenson 11 March 2010 03:34:34AM 1 point

Even if it is a gut feeling and not an explicit lie, he is still showing that his facts are weak since he's resorting to emotions.

Comment author: Matt_Stevenson 06 March 2010 07:38:37PM *  0 points

I think this is a problem that applies to a lot of people who are socially dysfunctional, not just those with high intelligence. Generalizing from one example?
