whpearson comments on Existential Risk and Public Relations - Less Wrong

36 Post author: multifoliaterose 15 August 2010 07:16AM


Comment author: whpearson 15 August 2010 11:02:19PM 2 points [-]

When I read good popular science books, the researchers tend to come up with some idea. Then they test the idea to destruction, poking and prodding at it until it really can't be anything but what they say it is.

I want to get the same feeling from the group studying intelligence as I do from that type of research. They don't need to be running foomable AIs, but truth is entangled, so they should be able to figure out the nature of intelligence from other facets of the world, including physics and the biological examples.

Questions I hope they would be asking:

Is the g factor related to the ability to absorb cultural information? That is, is the increased problem-solving ability of people with high g due to their being able to extract more information about solving problems from cultural information sources?

If it weren't, that would be further evidence for something special in one intelligence over another, and it might make sense to call one more intelligent, rather than just having a different initial skill set.

If SIAI had the ethos I'd like, we'd be going over and kicking every one of the supporting arguments for the likelihood of fooming and the nature of intelligence to make sure they were sound, performing experiments where necessary. However, people have forgotten those arguments and moved on to decision theory and the like.

Comment author: NancyLebovitz 15 August 2010 11:13:30PM 4 points [-]

Interesting points. Speaking only for myself, it doesn't feel as though most of my problem-solving or idea-generating approaches were picked up from the culture, but I could be kidding myself.

For a different angle, here's an old theory of Michael Vassar's-- I don't know whether he still holds it. Talent consists of happening to have a reward system which happens to make doing the right thing feel good.

Comment author: Jonathan_Graehl 16 August 2010 11:02:10PM *  0 points [-]

Talent consists of happening to have a reward system which happens to make doing the right thing feel good.

Definitely not just that. Knowing what the right thing is, and being able to do it before it's too late, are also required. And talent implies a greater innate capacity for learning to do so. (I'm sure he meant in prospect, not retrospect).

It's fair to say that some of what we identify as "talent" in people actually lies in their motivations as well as in their abilities.

Comment author: Perplexed 15 August 2010 11:35:48PM 2 points [-]

If SIAI had the ethos I'd like, we'd be going over and kicking every one of the supporting arguments for the likelihood of fooming and the nature of intelligence to make sure they were sound.

And then, hypothetically, if they found that fooming is not likely at all, and that dangerous fooming can be rendered nearly impossible by some easily enforced precautions/regulations, what then? If they found that the SIAI has no particular unique expertise to contribute to the development of FAI? An organization with an ethos you would like: what would it do then? To make it a bit more interesting, suppose they find themselves sitting on a substantial endowment when they reason their way to their own obsolescence?

How often in human history have organizations announced, "Mission accomplished - now we will release our employees to go out and do something else"?

Comment author: timtyler 16 August 2010 06:09:37AM *  1 point [-]

It doesn't seem likely. The paranoid can usually find something scary to worry about. If something turns out to be not really frightening, fear-mongers can just move on to the next-most frightening thing in line. People have been concerned about losing their jobs to machines for over a century now. Machines are a big and scary enough domain to keep generating fear for a long time.

Comment author: ciphergoth 16 August 2010 08:59:27AM 2 points [-]

I think that what SIAI works on is real and urgent, but if I'm wrong and what you describe here does come to pass, the world gets yet another organisation campaigning about something no one sane should care about. It doesn't seem like a disastrous outcome.

Comment author: NancyLebovitz 16 August 2010 08:06:36AM 1 point [-]

From a less cynical angle, building organizations is hard. If an organization has fulfilled its purpose, or that purpose turns out to be a mistake, it isn't awful to look for something useful for the organization to do rather than dissolving it.

Comment author: Perplexed 17 August 2010 03:19:13AM 2 points [-]

The American charity organization The March of Dimes was originally created to combat polio. Now it is involved with birth defects and other infant health issues.

Since it is the one case I know of (other than ad hoc disaster relief efforts) in which an organized charity accomplished its mission, I don't begrudge it a few additional decades of corporate existence.

Comment author: JamesAndrix 15 August 2010 11:53:58PM 0 points [-]

Then they will test the idea to destruction.

I like this concept.

Assume your theory will fail in some places, and keep pressing it until it does, or you run out of ways to test it.