Comments
CSalmon · 10

Time-accelerated research with forking seems like the only safe (sane) thing one could do with WBEs. The human brain breaks easily if anything too fancy is attempted. Moreover, unlike UFAIs, which tile the universe with paperclips or something equally inane, insane neuromorphs might do something even worse, à la "I Have No Mouth, and I Must Scream". The problem of using fiction as evidence is evident, but since most fictional UFAIs are basically human minds with a thin covering of what the author thinks AIs work like, I think the failure mode isn't so overwhelming here; unlike a "true" AGI, a badly designed neuromorph might very well feel resentment over low tribal status, for example. The risks of having a negative utilitarian or a deep ecologist as an emulation whose mind might be severely affected by unknown factors are something Captain Obvious would be very enthusiastic about.

Even simple time acceleration with reasonable sensory input might have unforeseen problems when the rest of the world runs in slow motion. Multiply that risk if there is only one mind instead of several that can still have some kind of social life with each other. Modify the brain to remove the need for human interaction and you're back in the previous paragraph.

Now, considering the obvious advantages of copying the best AI researchers and running them many times faster than real time, it would be quite reasonable to expect AGI to follow quickly after accelerated WBE. This could mean that WBE might never become widespread: the organization that had the first fast EMs would probably focus on stacking supercomputing power to take over the future light cone rather than releasing the tech to take over the IT / human-enhancement sector.

If the organization is sane, it could significantly increase the chance of FAI, since the responsible researchers would have both a speed and a numbers advantage over the non-WBE scenario. On the other hand, if the first emulators don't focus on creating human-compatible AGI, it is the chance of failure that would grow massively.

CSalmon · 20

Scandinavian countries top the indexes on metrics other than taxation, government spending, and "labour freedom", while their monarchs (and arguably, their churches) are mainly if not solely symbolic. If labels are ignored, I think "socially permissive, high taxes, major redistribution of wealth" describes these countries very well.

CSalmon · 110

My desire and wish is that the things I start with should be so obvious that you wonder why I spend my time stating them. This is what I aim at because the point of philosophy is to start with something so simple as not to seem worth stating, and to end with something so paradoxical that no one will believe it.

-- Bertrand Russell, The Philosophy of Logical Atomism

CSalmon · 310

Rin: What are clouds? I always thought they were thoughts of the sky or something like that. Because you can't touch them.

[ . . . ]

Hisao: Clouds are water. Evaporated water. You know they say that almost all of the water in the world will at some point of its existence be a part of a cloud. Every drop of tears and blood and sweat that comes out of you, it'll be a cloud. All the water inside your body too, it goes up there some time after you die. It might take a while though.

Rin: Your explanation is better than any of mine.

Hisao: Because it's true.

Rin: That must be it.

-- Katawa Shoujo