Interesting. After reading a bunch of papers that more or less presumed uploads remaining separate individuals (e.g. Robin Hanson's If Uploads Come First, Carl Shulman's Whole Brain Emulation and the Evolution of Superorganisms), as well as a bunch of fiction also presuming it (Greg Egan's stuff, Eclipse Phase, etc.), the notion of mind coalescence being a more likely long-term (and possibly even short-term) outcome was somewhat of a viewquake for me.
I always figured it was a deliberate break from reality, either for repeatability or because it instantly leads to superhuman intelligence and makes predictions meaningless, and that nobody pointed it out as unrealistic because it's so obviously plot magic that it didn't need to be.
This kind of thing makes me wish even harder that I could write and tell stories, although I'm starting to think that might be futile: perhaps the very same alienness that gives me something to say is what makes me unable to say it. Like being able to think of myself as "alien"...
http://www.xuenay.net/Papers/CoalescingMinds.pdf
Like my other draft, this is for the special issue on mind uploading in the International Journal of Machine Consciousness. The deadline is Oct 1st, so comments will have to come quickly for me to take them into account.
This one is co-authored with Harri Valpola.
EDIT: Improved paper on the basis of feedback; see this comment for the changelog.