Shulman on Superorgs

Best to read the link first and my comments later.

I have little to say on the topic itself, but I do find it odd that Robin takes such a confrontational stance, beginning with the first sentence, "It has come to my attention that some think that by now I should have commented on Carl Shulman's em paper", and culminating in a harsh analysis not only of Carl's conclusions, but also of what (Robin believes) made him want to reach those conclusions, as well as of SIAI's mission statement in general. There is negative framing, such as "obsession with making a god to rule us all (well)", that I wouldn't expect from someone trying to honestly represent the other side. It's not that I don't share some of those concerns, but to psychoanalyse (who you seem to have identified as) your opponent, in an obvious effort to discredit them, is at the very least unfair. I was generally aware that there was some tension between the former dynamic duo of Hanson and Yudkowsky, but it seems to have become full-blown hostility.

Robin does seem to find the courage to say he's glad others are looking into emulations, but the overall vibe I get is of someone protective of a research field they believe they uniquely 'get' - someone who feels others should either get in line or get out of the ring. It's a vibe not uncommon in academia.

Comments (9)

I didn't mean to be harsh or confrontational. Surely it isn't a mere coincidence that in the context of an institute devoted to figuring out how to create a good singleton someone wrote a paper about how ems would result in singletons. That isn't psychology, but social context. I'd love for more folks to study future em society, even if they disagree with me.

to psychoanalyse (who you seem to have identified as) your opponent, in an obvious effort to discredit them, is at the very least unfair.

[...]

Robin does seem to find the courage to say he's glad others are looking into emulations, but the overall vibe I get is of someone protective of a research field they believe they uniquely 'get' - someone who feels others should either get in line or get out of the ring. It's a vibe not uncommon in academia.

This conjunction almost gave me whiplash.

I do not agree with Robin's argument that:

When the parts of large systems evolve independently, to adapt to differing local circumstances, their values may also evolve independently.

Light travels fast, and information is cheap. The parts of large systems seem unlikely to "evolve independently" in the future - rather they will communicate and pool their knowledge, since they will have been built to do so.

In particular, colonised regions will do R&D and then beam energy and information to the settlers on the colonisation front - where it is needed - while the settlers will send back news of the environments they are encountering.

By "independently" I do not mean no interdependence. I mean not completely interdependent. You can't predict one perfectly just by looking at the others.

Perfect prediction seems like a rather demanding criterion. I can't predict the behaviour of one of my cells perfectly - but that has more to do with the difficulty of establishing initial conditions, a lack of knowledge of the laws of physics, and computational intractability than it does with a lack of shared heritable material.

Also: some value drift within a single large organism may be seen as permissible. The galactic federation may tolerate a few rebels - the point is more that it exhibits large-scale unity and is not threatened by the rebels, because they are too few or too weak.

Serious disharmony would have to be something larger - disagreement about which side of the galaxy will launch an intergalactic colonisation mission, for instance.

It is also far from obvious that values in generic large minds can easily be separated from other large mind parts.

This point alone would, if true, undermine the argument of Shulman's that Hanson discusses. It deserves more emphasis. And it seems correct.

Both Carl and Robin presume that brain emulations will be economically significant - which seems very unlikely. We may eventually be able to back up minds, but by then we will have intelligent machines by other means that will be doing most of civilisation's cognitive work.

Carl's section about "unrestrained Malthusian competition" seems rather paranoid. It exhibits Cold War-era thinking. Advanced organisms are much more likely to trade with one another - which seems relatively unlikely to result in existential risk.

Both parties avoid discussion of what will probably determine whether such a single large entity forms or not - a cosmic monopolies and mergers commission. Robin presumably thinks such a thing will be unnecessary - on the grounds that coordination is so hard. Why Carl doesn't discuss it is less obvious. Perhaps he thinks it is obviously a product of a primitive political system. Today, some political thinking suggests that preventing large-scale cooperation is desirable, and that deliberate fragmentation is the way to go. If that perspective continues to dominate, it seems relatively unlikely that a unified system will arise - since the monopolies and mergers commission would destroy any such unity.

I don't presume that brain emulation will come first and be significant, and indeed think that it probably won't. The paper explored some issues relevant conditional on that turning out to happen anyway, including some that can be generalized to non-brain-emulation scenarios.

Regarding Malthusian competition, check out "burning the cosmic commons".

The monopolies commission you describe would be a singleton under Bostrom's account, capable of overcoming any local challenge to its authority.

I don't presume that brain emulation will come first and be significant, and indeed think that it probably won't.

OK, good to know. The idea in the paper is attributed to "Many scientists".

Regarding Malthusian competition, check out "burning the cosmic commons".

Yes, I am familiar with that. If you don't like what natural selection offers, one wonders just how slow, bloated, and inefficient a civilisation would have to be to count as desirable - and how much of it would survive eventual contact with aliens.

The monopolies commission you describe would be a singleton under Bostrom's account, capable of overcoming any local challenge to its authority.

Yes, that is true.