A superorganism is a group of individual organisms that acts as a single organism. In the context of biology, this refers to a colony of eusocial animals, such as ants. In the context of LessWrong, this usually refers to a collection of uploaded minds. These minds could be individual scans of many different humans or many copies of a single scan.
The dangers of a superorganism of uploads were discussed by Carl Shulman in "Whole Brain Emulation and the Evolution of Superorganisms". He describes several enormous advantages that would favor the evolution of superorganisms, mostly stemming from the ease of duplicating uploads and of preserving individuals with beneficial traits. Since uploads are electronic data, they can be copied and sent to other computers at almost no cost. This could lead to a superorganism of trillions of uploads, all sharing a common goal and common knowledge. If a subgroup of these uploads were modified so that they would willingly sacrifice themselves for the benefit of the group, that subgroup would quickly come to dominate the population.
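As a toy illustration of why such a subgroup would dominate, consider a minimal simulation (all parameters and dynamics here are illustrative assumptions, not taken from Shulman's paper). "Selfless" uploads permit the group to delete the less productive among them and reduplicate the more productive, so their average productivity ratchets upward each round; "self-concerned" uploads refuse deletion and so undergo no selection:

```python
import random

random.seed(0)
selfless = [1.0] * 100          # productivity of each selfless upload
self_concerned = [1.0] * 900    # productivity of each self-concerned upload

for generation in range(30):
    # Selfless uploads accept deletion for group benefit: mutate copies,
    # keep only the most productive half, then reduplicate the survivors.
    mutated = [p * random.uniform(0.8, 1.2) for p in selfless]
    mutated.sort(reverse=True)
    survivors = mutated[: len(mutated) // 2]
    selfless = survivors + survivors  # duplicate the winners

    # Self-concerned uploads undergo no selection; their productivity is flat.
    total = sum(selfless) + sum(self_concerned)
    print(f"gen {generation:2d}: selfless share of output = {sum(selfless) / total:.2f}")
```

The mutations average out to zero gain on their own; only the delete-and-reduplicate step drives improvement, yet the selfless subgroup's share of total output climbs steadily despite starting at a tenth of the population.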
This superorganism would then rapidly out-compete biological humans as well as more self-concerned uploads. For instance, since members of a superorganism do not mind being deleted for the good of the group, experiments could be run that create many modified copies, determine which performs best on a given task, delete the rest, and then mass-produce copies of the winner. Because whichever goal structures best promote the superorganism's growth would be selected for, the uploads' goal structures may drift far from those of current humans. This possible result of uploading can therefore be considered an existential risk.
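The copy-test-delete-reduplicate procedure described above is essentially a (1, λ) evolutionary algorithm applied to minds. A minimal sketch, where a single numeric "skill" trait and a simple task score stand in for an actual emulation and its evaluation (both are illustrative assumptions):

```python
import random

random.seed(1)

def task_score(skill: float) -> float:
    """Stand-in for evaluating an upload on a task (illustrative only)."""
    return -(skill - 7.0) ** 2  # in this toy task, a skill of 7.0 is optimal

skill = 0.0  # trait of the current template upload
for round_ in range(20):
    # Create many modified copies of the template...
    copies = [skill + random.gauss(0.0, 0.5) for _ in range(100)]
    # ...test them all, keep only the best, delete the rest...
    skill = max(copies, key=task_score)
    # ...and the winner becomes the template for mass duplication.
    print(f"round {round_:2d}: best skill = {skill:.3f}, score = {task_score(skill):.3f}")
```

Each round discards 99 of 100 copies, yet the surviving lineage closes in on the optimum within a few dozen rounds; a population that permits this kind of selection improves far faster than one whose members insist on surviving.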