Comment author: mattnewport 11 January 2010 12:55:54AM 4 points [-]

I might be wrong in my beliefs about their best interests, but that is a separate issue.

It's not a separate issue, it's the issue.

You want me to take as given the assumption that undergoing the treatment is in everyone's best interests, but we're debating whether that makes it legitimate to force the treatment on people who are refusing it. Most of them are presumably refusing the treatment because they don't believe it is in their best interests. That fact should make you question your original assumption that the treatment is in everyone's best interests; otherwise you have to bite the bullet and say that you are right, they are wrong, and as a result their opinions on the matter can simply be ignored.

Comment author: Fredrik 11 January 2010 02:17:57AM 1 point [-]

Just out of curiosity, are you for or against the Friendly AI project? I tend to think that it might go against the will, expressed beforehand, of a lot of people, who would rather watch The Simpsons and have sex than have their lives radically transformed by some oversized toaster.

Comment author: mattnewport 10 January 2010 10:31:25PM 6 points [-]

If you can raise the floor for everyone, so that we're all just better, what's not to like about giving everybody that treatment?

The same that's not to like about forcing anything on someone against their will because despite their protestations you believe it's in their own best interests. You can justify an awful lot of evil with that line of argument.

Part of the problem is that reality tends not to be as simple as most thought experiments. The premise here is that you have some magic treatment that everyone can be 100% certain is safe and effective. That kind of situation does not arise in the real world. It takes a generally unjustifiable certainty in the correctness of your own beliefs to force something on someone else against their wishes because you think it is in their best interests.

Comment author: Fredrik 11 January 2010 12:48:08AM -2 points [-]

I might be wrong in my beliefs about their best interests, but that is a separate issue.

Given the assumption that undergoing the treatment is in everyone's best interests, wouldn't it be rational to forgo autonomous choice? Can we agree that it would be?

Comment author: JulianMorrison 10 January 2010 06:43:44PM 1 point [-]

So, we could decompile humans, and do FAI to them. Or we could just do FAI. Isn't the latter strictly simpler?

Comment author: Fredrik 11 January 2010 12:41:10AM 5 points [-]

Well, the attention of those capable of solving FAI should be undivided. Those who aren't equipped to work on FAI, and who could potentially make progress on intelligence-enhancing therapies, should do so.

Comment author: timtyler 10 January 2010 01:56:27PM *  -1 points [-]

Culture is what separates us from cavemen. They often killed their enemies and ate their brains. Clearly culture can be responsible for a great deal of change in the domain of moral behaviour.

Comment author: Fredrik 10 January 2010 10:30:20PM 1 point [-]

Culture has also produced radical Islam. Just look at http://www.youtube.com/watch?v=xuAAK032kCA to get a bit more pessimistic about the natural evolution of the moral zeitgeist in culture.

Comment author: mattnewport 10 January 2010 08:41:55PM 3 points [-]

If I were convinced of the safety and efficacy of an intelligence-enhancing treatment, I would be inclined to take it and use my enhanced intelligence to combat any government attempts to mandate such treatment.

Comment author: Fredrik 10 January 2010 10:23:47PM 0 points [-]

So individual autonomy is more important? I just don't get that. It's what's behind the wheel of each autonomous individual that matters. It's a hedonic equation. The risk that unaltered humans pose to the happiness and progress of all other individuals might just work out to "way too fracking high".

It's everyone's happiness and progress that matters. If you can raise the floor for everyone, so that we're all just better, what's not to like about giving everybody that treatment?

Comment author: billswift 10 January 2010 01:20:25PM 2 points [-]

Not just "rotten eggs" either. If there is one thing I could nearly guarantee to bring on serious opposition from independent and extremely intelligent people (that is, to convince people with brains to become "criminals"), it is mandating government meddling with their brains. I, for example, don't use alcohol or any other recreational drug, and I don't use any painkiller stronger than ibuprofen short of excruciating (shingles or major-abscess level) pain; most of the more intelligent people I know feel much the same, and I am a libertarian. Do you really think I would let people I despise mess around with my mind?

Comment author: Fredrik 10 January 2010 05:26:31PM *  -1 points [-]

You don't have to trust the government, you just have to trust the scientists who developed the drug or gene therapy. They are the ones who would be responsible for the drug working as advertised and having negligible side-effects.

But yes, I sympathize with you; I'm actually just like that myself. Some people wouldn't be able to appreciate the usefulness of the drug, no matter how hard you tried to explain that it's safe, helpful and actually globally risk-alleviating. Those who were memetically sealed off from believing that, or just weren't capable of grasping it, would oppose it strongly, possibly enough to wage war on the rest of the world over it.

It would also take time to reach the whole population with a governmentally mandated treatment. There isn't even a world government right now. We are weak and slow. And one comparatively insane man on the run is one too many.

Assuming an effective treatment for human stupidity could be developed (and assuming that would be a rational solution to our predicament), the right thing to do would be to deliver it in the manner causing the least social upheaval and opposition. That would most definitely be a covert dispersal: a globally coordinated release of a weaponized retrovirus, for example.

We still have some time before even that can be accomplished, though. And once that tech gets here, we face the hugely increasing risk of bioterrorism, or of accidental catastrophes at the hands of some clumsy research assistant, before we have a chance even to properly prototype and test our perfect smart drug.

Comment author: Roko 10 January 2010 12:52:17AM *  1 point [-]

This is borderline impossible in a liberal democracy... Now imagine actually trying to convince people that the government should be allowed to mess around with their brains

In the Q&A at 15:30, he opines that it will take the first technologically enabled act of mass terrorism to persuade people. I agree: I don't think anything will get done on x-risks until there's a gigadeath event.

Comment author: Fredrik 10 January 2010 03:44:07AM *  0 points [-]

Even in such a scenario, some rotten eggs would probably refuse the smart-drug treatment or the gene-therapy injection. Perhaps exactly those who would be the instigators of extinction events? Or at least the two groups would overlap somewhat, I fear.

I'm starting to think it would be rational to disperse our world-saving drug of choice by means of an engineered virus of our own, or something equally radically effective. But don't quote me on that. Or whatever, go ahead.

Comment author: Fredrik 10 January 2010 03:37:28AM 1 point [-]

X-risk-alleviating AGI just has to be days late to the party for a supervirus created by a terrorist cell to have crashed it. I guess I'd judge against putting all our eggs in the AI basket.

In response to Regular NYC Meetups
Comment author: Fredrik 02 October 2009 07:31:00PM 1 point [-]

I wonder how many Swedish readers there are. A meetup in Stockholm or Gothenburg would be kind of nice.

Comment author: Eliezer_Yudkowsky 19 August 2009 03:33:33PM 1 point [-]

I didn't. Nice.

Comment author: Fredrik 11 September 2009 10:46:38PM 0 points [-]

So you haven't read his Sweet Dreams: Philosophical Obstacles to a Science of Consciousness?
