In a conceivable future, humans gain the technology to eliminate physical suffering and to create interfaces between their own brains and computing devices, interfaces sufficiently advanced that the border between brain and computer practically vanishes.  Humans are able to access all public knowledge as if they 'knew' it themselves, and they can also upload their own experiences to this 'web' in real time.  The members of this network would lose part of their individuality, since an individual's unique set of skills and experiences is a foundational component of identity.

However, although knowledge can be shared at low cost, computing power will remain bounded and valuable.  Even if all other psychological needs are satisfied, humans will probably still compete for access to computing power.

But what other elements of identity might still remain?  Is it reasonable to say that individuality in such a hive mind would reduce to differing preferences for the use of computational power?

Humans are able to access all public knowledge as if they 'knew' it themselves

I don't think this power can be added to humans without extensive modifications to brain architecture, and I don't think an entity which had this power could be considered human in any meaningful sense.

That's not to say we won't have extremely fast, brain-implant-aided information retrieval. But the information retrieved will still have to come in through the optic and auditory nerves, at a limited rate, and it will still be processed by human minds in roughly the same manner we process information now. If we discard any of those premises, it's more like engineering a new mind than modifying a human one.

[anonymous]

I don't think this power can be added to humans without extensive modifications to brain architecture, and I don't think an entity which had this power could be considered human in any meaningful sense.

Agreed--this would require a qualitative difference, not a quantitative one.

If someone integrates that closely with computers, what happens when they encounter malware? It would be as bad as mind reading, and possibly as bad as mind control. And if it's anything like computing today, a simple search warrant would be sufficient to read and seize the computerized portion of anyone's mind, even if seizing it leaves them crippled.