ChristianKl comments on Savulescu: "Genetically enhance humanity or face extinction" - Less Wrong

4 [deleted] 10 January 2010 12:26AM




Comment author: ChristianKl 15 January 2010 03:59:03PM 1 point

The notion that higher IQ means more money will be allocated to solving FAI is idealistic. Reality is complex: the reasons money gets allocated are often political in nature and depend on whether institutions function properly. Even if individuals have a high IQ, that doesn't mean they don't fall into the groupthink of their institution.

Real-world feedback, however, helps people see problems regardless of their intelligence. Real-world feedback provides truth, whereas a high IQ can just mean you are better at stacking ideas on top of each other.

Comment deleted 15 January 2010 05:20:59PM
Comment author: ChristianKl 16 January 2010 12:20:35AM 0 points

Some sub-ideas of an FAI theory might be put to the test in artificial intelligence that isn't smart enough to improve itself.

Comment author: Morendil 15 January 2010 05:41:32PM 0 points

"Editing the mental states of ems" sounds ominous. We would (at some point) be dealing with conscious beings, and performing virtual brain surgery on them has ethical implications.

Moreover, it's not clear that controlled experiments on ems, assuming we get past the ethical issues, would yield radically more insight into the structure of intelligence than current brain science does.

It's a little like being able to observe a program by running it under a debugger, versus examining its binary code (plus manual testing). Yes, this is a much better situation, but it's still far more cumbersome than looking at the source code; and that in turn is vastly inferior to having a theory of how to write similar programs.

When you say you advocate intelligence augmentation (this really needs a more searchable acronym), do you mean only through genetic means, or also through technological "add-ons"? (By that I mean devices that plug you into Wikipedia or give you access to advanced math skills in the same way that a calculator boosts your arithmetic.)

Comment deleted 15 January 2010 05:58:00PM
Comment author: Vladimir_Nesov 15 January 2010 09:38:45PM 2 points

To whoever downvoted Roko's comment -- check out the distinction between these ideas:

Comment author: ciphergoth 16 January 2010 11:20:15AM 1 point

I'd volunteer, and I'm sure I'm not the only one here.

Comment author: AdeleneDawner 16 January 2010 11:47:06AM 0 points

You're not, though I'm not sure I'd be an especially useful data source.

Comment author: RobinZ 16 January 2010 03:38:33PM 0 points

I've met at least one person who would like a synesthesia on-off switch for their brain - that would make your data useful right there.

Comment author: AdeleneDawner 17 January 2010 05:49:12AM 1 point

Looks to me like that'd be one of the more complicated things to pull off, unfortunately. Too bad; I know a few people who'd like that, too.

Comment author: Morendil 15 January 2010 07:59:57PM 0 points

Please expand on what "the end" means in this case. What do you expect we would gain from perfecting whole-brain emulation (of humans, I assume)? How does that get us out of our current mess, exactly?

Comment deleted 15 January 2010 05:55:33PM
Comment author: Vladimir_Nesov 15 January 2010 09:34:49PM 0 points

I worry that these modified ems won't share our values to a sufficient extent.

Comment deleted 15 January 2010 10:43:37PM
Comment author: Vladimir_Nesov 15 January 2010 11:03:21PM 1 point

Possibly. But I'd rather use selected human geniuses with the right ideas, copied and sped up, and wait for them to crack FAI before going further (even if FAI doesn't yield a powerful intelligence explosion -- then FAI is simply the formalization and preservation of preference, rather than the power to enact that preference).