XiXiDu comments on $295 bounty for new Singularity Institute logo design (crowd-sourced competition) - Less Wrong

10 Post author: Louie 28 January 2011 06:01AM


Comment author: XiXiDu 28 January 2011 01:06:56PM *  5 points [-]

Any suggestions?

  • Institute for Artificial Intelligence
  • Institute for Artificial Ethics
  • Institute of Formal Ethics
  • Institute for AI Ethics
  • Institute for the Conservation of Friendliness
  • The Friendliness Institute

Any of those names would increase its esteem dramatically. Right now you have to take the Singularity hurdle before you can start talking about risks from AI seriously.

Comment author: [deleted] 28 January 2011 01:45:44PM 3 points [-]

Right now you have to take the Singularity hurdle before you can start talking about risks from AI seriously.

That's definitely true, but it would be dishonest if SIAI weren't up front about its views on the Singularity.

Comment author: ata 28 January 2011 08:19:30PM *  4 points [-]

That's definitely true, but it would be dishonest if SIAI wasn't up front about its views on the Singularity.

There's enough alternative terminology (intelligence explosion, hard takeoff, etc.) that they could (and often do (e.g. the AI Risk paper)) manage to talk about the Singularity just fine without talking about the "Singularity". I don't think anyone's suggesting that they not be upfront about their actual positions on those issues; some people would just prefer to avoid the "Singularity" terminology now that it's turned into a bloated mutant futurist meme complex. (So really, the name change suggestions are about being less accidentally misleading; they wouldn't have to spend as much time explaining that they aren't Ray Kurzweil. And hey, if they abandon the "artificial intelligence" terminology, maybe they won't have to spend so much time explaining that they aren't building Skynet or Terminators or HAL or the Matrix, either! ...but probably not.)

Comment author: Perplexed 28 January 2011 02:48:38PM *  2 points [-]

I have the impression that the Singularity Summit and other association with "big-tent" singularitarianism provide some PR and fund-raising advantages. Even if SIAI research focuses on AI and Friendliness, Thiel probably prefers that they remain on speaking terms with futurism and transhumanism more generally.

Comment author: XiXiDu 28 January 2011 04:42:24PM *  4 points [-]

Big money is probably the best argument. But you also have to ask who you need to convince to mitigate risks from AI, e.g. who you want to implement your formal definition of friendliness. The answer to this question is likely not Thiel but some AGI researcher working on an academic project, some big corporation like IBM, or a government. In most cases, having Singularity in your name or talking about transhumanism will make them delete your e-mails. This has nothing to do with being dishonest; it is social engineering: being aware of public perception and using a euphemism where possible.

Comment author: Perplexed 28 January 2011 05:25:17PM 8 points [-]

If the SI were known for two things - both organizing the annual Singularity Summit and also sponsoring a peer-reviewed Electronic Journal of Friendly AI Research (EJFAIR), then I suspect it would have the best of both worlds. Neither activity is so disreputable as to tarnish the good reputation derived from the other (within relevant subcultures).

To my mind, the real threat to SI (or SIAI) credibility is the perception that it is an isolated intellectual subculture, speaking only to itself, and not engaged in an open and critical dialog with other AI researchers. Choosing the right euphemism for use in the name of the organization is not the most important task in this PR battle.

Comment author: XiXiDu 28 January 2011 08:18:35PM *  1 point [-]

Choosing the right euphemism for use in the name of the organization is not the most important task in this PR battle.

Absolutely, I was just bringing it up because Jack asked about a possible name change. It may also be the case that the term Singularity (technological) will gain general acceptance in future, as current progress towards an academic analysis suggests.

Comment author: timtyler 29 January 2011 02:04:20AM *  -1 points [-]

It may also be the case that the term Singularity (technological) will gain general acceptance in future, as current progress towards an academic analysis suggests.

It would be unprecedented terminology.

Evolutionary changes have been called "revolutions" (industrial), "explosions" (Cambrian), "takeovers" (genetic), "catastrophes" (oxygen), or "geneses" (abiogenesis).

There aren't any "singularities" on record, though.

I prefer "Technology Explosion". I think it is more descriptive and more accurate. There are also the terms "Digital Revolution" and "Memesis" - which I approve of.

Comment author: TheOtherDave 29 January 2011 02:24:52AM 3 points [-]

Well, presumably if there was more than one of it, it wouldn't be a singularity.

Comment author: timtyler 29 January 2011 11:35:52AM *  3 points [-]

Well, presumably if there was more than one of it, it wouldn't be a singularity.

Note that that isn't what the people who use the "singularity" term to describe the hypothetical discontinuity in the middle of black holes seem to think.

Comment author: false_vacuum 31 January 2011 01:16:20PM *  3 points [-]

Somebody should point out that TheOtherDave's comment was a joke. (At least, I'm 95% confident it was so intended, and also I found it amusing.)

The Singularity is more analogous to the event horizon of a black hole; that used to be called the Schwarzschild singularity, but since its singular behaviour was an artifact of co-ordinate systems (as was first clearly shown by David Finkelstein), this terminology has fallen out of use. One imagines that eventually historical Singularities (which are really just prediction horizons) cease to be so called after they are past.

The singularity at the center of a black hole is presumably equally illusory; general relativity simply breaks down there, giving nonsensical answers (infinities). Singularities are in the map, not the territory. (Am I the first to say that?)

Comment author: Jack 28 January 2011 06:00:24PM 1 point [-]

I mean, yeah, you call the organization whatever the hell Thiel wants you to call it. But I don't see in particular why those connections couldn't be maintained alongside a name change that made the organization more palatable to non-futurists.

Comment author: Jack 28 January 2011 05:57:21PM *  4 points [-]

Center for Machine Ethics (or Institute for...)

Institute for Reducing Existential Risk

Center for the Reduction of Existential Risk

"Machine ethics" is, as far as I can tell, the actual name in the literature (insofar as there is any). I also really like how it sounds. The general "Reducing existential risk" angle probably requires the organization to get broader in it's focus and focus on other risks (or else be seen as deceptive). But that might not be a bad thing and it may be the way to getting a lot more money and being seen as a mainstream charity.

Also, I'm not sure how sold I am on "Institute"; it's got a 'we're pretending we're associated with academia' feel to it. I think "Center for..." sounds a lot better.

Comment author: Normal_Anomaly 28 January 2011 08:22:53PM 3 points [-]

I like "Center for Machine Ethics".

Also, I'm not sure how sold I am on "Institute"; it's got a 'we're pretending we're associated with academia' feel to it. I think "Center for..." sounds a lot better.

That's a good point. "Center for ..." sounds more like a non-profit and less like an ivory tower.

Comment author: ata 28 January 2011 08:35:35PM 2 points [-]

I'm not sure about including "Machine Ethics"; given that there already is such a field, and given that (AFAICT) it does not generally involve precision-grade philosophy suitable for (let alone intended for) the construction of a Benevolent Really Powerful Optimization Process, it may be misleading to appropriate that name.

I do like the suggestion to use "Center for...", though "Institute" doesn't necessarily sound like a connection to academia is being implied (at least to me — do you think this is incorrect?).

Comment author: Jack 28 January 2011 08:56:20PM 1 point [-]

I'm not sure about including "Machine Ethics"; given that there already is such a field, and given that (AFAICT) it does not generally involve precision-grade philosophy suitable for (let alone intended for) the construction of a Benevolent Really Powerful Optimization Process, it may be misleading to appropriate that name.

Hard to say. I feel like the Friendliness question is a natural fit for the field; in fact, it seems plausible to me that it is the machine ethics equivalent of a unified field theory. You're right, though, that the field mostly deals with minor, less rigorous issues. I don't know; my criterion for the name issue is basically "Could I tell family and friends I was working at a place with this name without being laughed at or getting strange looks?"

I do like the suggestion to use "Center for...", though "Institute" doesn't necessarily sound like a connection to academia is being implied (at least to me — do you think this is incorrect?).

Not strictly speaking, no. Center is less pretentious, though.

Center for Technology and Existential Risk?

Comment author: timtyler 29 January 2011 01:55:26AM 0 points [-]

"Machine ethics" is, as far as I can tell, the actual name in the literature

"Machine morality" has better alliteration.

Comment author: NancyLebovitz 28 January 2011 05:50:18PM 4 points [-]

Institute for AI Ethics strikes me as the clearest of the bunch.

Would The Institute for AI Friendliness come off as too weird?

Comment author: Normal_Anomaly 28 January 2011 08:21:33PM 4 points [-]

Would The Institute for AI Friendliness come off as too weird?

I think it would, yes.

Comment author: nazgulnarsil 28 January 2011 01:37:55PM 1 point [-]

I like Institute of Formal Ethics.

Comment author: ata 28 January 2011 08:28:08PM *  0 points [-]

I like "Formal Ethics" or "AI Ethics". Including "Friendliness" in the name would probably be a mistake.

I was thinking something like "the Optimization Institute", or something else with the word "optimization" — since it's relevant to their research (and distinguishes their focus on precisely-understood optimization algorithms from other AGI projects that focus on often-nebulous and often-anthropomorphic views of intelligence), and doesn't have significant preexisting connotations, while still hopefully sounding a bit intriguing to people who haven't heard of it before.