I've held off on posting the next rerun for a few days in case anybody else suggested something new in the discussion of how to run the AI FOOM Debate. After looking at the comments (with associated votes), and after looking through the sequence myself, I've decided to rerun one post a day. Most of these posts will be from Robin Hanson and Eliezer Yudkowsky, but a few posts from Carl Shulman and James Miller will be included as well. This process will start tomorrow with "Abstraction, Not Analogy" by Robin Hanson.

Meanwhile, there are several posts written by Robin Hanson in the week or so leading up to the debate that provide a bit of background to his perspectives, which I have linked to below. They're all fairly short and relatively straightforward, so I don't think that they each merit a full-blown individual discussion.

Fund UberTool?

Engelbart As UberTool?

Friendly Teams

Friendliness Factors

Setting the Stage

 


From "Friendly Teams" ...

Small teams have at times suddenly acquired disproportionate power, and I’m sure their associates who anticipated this possibility used the usual human ways to consider that team’s “friendliness.” But I can’t recall a time when such sudden small team power came from an UberTool scenario of rapidly mutually improving tools.

In August 1945, an UberTool was demonstrated twice in Japan.

But the United States was not a "small team".

Relatedly, there have been numerous instances of individual humans taking over entire national governments by exploiting high-leverage, usually unethical, opportunities. A typical case involves a military general staging a coup or an elected leader legislating unlimited power unto himself. None of these instances look anything like firms competing for resources. Instead, we have single actors who are intelligent and opportunistic, unethical or morally atypical, and risk-tolerant enough to accept the consequences of a failed coup.

A UFAI looks much more like a dictator than a firm.

All those require at least the implicit cooperation of a lot of other people, e.g., the general's army.

The distinction between "tool" and "agent" appears to me to be one of how well a human can comprehend it. As I noted here (prompted by this thread), a moderately complex Puppet-run system can already be a bit spooky, even when I know all the bits of what it does.

Imagine you're a small but growing firm. You can choose whether to reinvest your profits in:

  • growth
  • productivity

There are various factors which might affect which of these is the better buy:

  • if you've already captured a large share of the market, you get diminishing returns from investing in growth
  • if your business processes are already pretty efficient, you get diminishing returns from investing in productivity
  • if you're larger, researching how to improve productivity becomes more attractive: it costs the same to research the next productivity trick, but you can roll it out across a larger firm
  • if your business processes are extremely unusual, researching productivity tricks might be more attractive, as there is more likely to be low-hanging fruit that hasn't been discovered by other people

So I imagine for most firms, their ratio of growth to productivity investment lies within a particular range. We expect the UberTool caricature to fail because it is investing far too heavily in productivity instead of growth (and it's just too small to be able to do the necessary research).
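
To make the trade-off concrete, here's a crude toy simulation of it (entirely my own sketch: the functional forms, parameter values, and the simulate helper are made up for illustration, not anything from the comment above). Growth investment runs into diminishing returns as the firm approaches the whole market, while each productivity "trick" costs a fixed amount to research but multiplies output across the entire firm once found, so larger firms get more out of it.

```python
# Toy model (illustrative assumptions only): a firm splits each period's
# profit between growth and productivity research.

def simulate(split_to_growth, periods=20, market_cap=1000.0, research_cost=10.0):
    """Return profit per period after `periods` rounds of reinvestment.

    split_to_growth: fraction of profit reinvested in growth;
                     the rest funds productivity research.
    """
    size = 10.0          # revenue-generating capacity of the firm
    productivity = 1.0   # profit generated per unit of size
    research_fund = 0.0  # spending accumulated toward the next productivity trick

    for _ in range(periods):
        profit = size * productivity

        # Growth: diminishing returns as the firm approaches the whole market.
        growth_spend = split_to_growth * profit
        size += growth_spend * (1 - size / market_cap)

        # Productivity: each trick costs the same to research, but once found
        # it is rolled out across the entire firm.
        research_fund += (1 - split_to_growth) * profit
        while research_fund >= research_cost:
            research_fund -= research_cost
            productivity *= 1.05   # each trick gives a 5% firm-wide boost

    return size * productivity

for split in (0.9, 0.5, 0.1):
    print(f"{split:.0%} to growth -> final profit/period: {simulate(split):.1f}")
```

Varying market_cap, research_cost, and the starting size shifts which kind of reinvestment is the better buy, which is all the list of factors above is claiming.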

So how might an AGI be different from UberTool?

  • It might actually be doing a lot of growth too ("take over the internet")
  • Its business processes are extremely unusual, in particular in having no humans in the loop

What other factors would be relevant?

Just realised I've been blurring the distinction between tools and intangibles. If a firm wants to increase its efficiency, it could either design better chairs for its employees or design a better recruitment process, and I've been treating these as the same kind of thing. I think the main difference is that intangibles are harder to buy and sell than tools are, and this may be relevant.