Erlja Jkdf.


Sidles closer

Have you heard of... the philosophy of universal norms?

Perhaps the human experience thus far is more representative than the present?

Perhaps... we can expect to get a little closer to it as we push further out?

Perhaps... things might get a little more universal in this here reality-cluttered world.

So for a start...

Maybe people are right to expect things will get cool...

I think that's a bad bet to rely on, any way you slice it. If you're imagining, say, GPT-X giving us some extremely capable AI, then it's hands-on enough that you've just handed humans too much power. If we're talking AGI, I agree with Yudkowsky: we're far more likely to get it wrong than to get it right.

If you have a different take I'm curious to hear it, but I don't see any way to read this that's reassuring.

IMO we honestly need a technological twist of some kind to avoid AI. Even if we get it right, life with a God just takes a lot of the fun out of it.

There's a problem I bet you haven't considered.

Language and storytelling are hand-me-downs from times full of bastards. The bulk of language, and the more basic and traditional mass of stories, are going to follow more brutal patterns.

The deeper you dig, the more likely you end up with a genius in the shape of an ancient asshole.

And the other problem: all these smarter intelligences running around, simply by dint of their intelligence, have the potential to make life a real headache. Everything could end up so complicated.

One more bullet we have to dodge, really.

Is this perhaps because the top end is simply not high enough yet?

The point is that it's a near-term risk, building only on what they can already simulate.

They would be smarter at birth. Either you gene-edit your kids or you pass that advantage up. Yes, some people would do it; and yes, you'd get genius proliferation. But so long as you've got enough hidebound naturists, fully committed, you would always have some eco-warriors around.

There's no such thing as a million fully committed naturists, and that's why the planet is cooking and the endangered list keeps growing.

We're very good at generating existential risks. Given indefinite technological progression at our current pace, we are likely to get ourselves killed.

A technological plateau is strictly necessary. To give the simplest example: we lucked out on nukes. The next decade alone contains the potential for several existential threats (readily made bioweapons, miniaturized drones, AI abuse) that I question our ability to adapt to consistently, particularly one after another.

We might get it, if our tech jumps thanks to exponential progress.