
Comments

Damilo

Indeed, people around me find it hard to understand, but what you're telling me makes sense to me.

As for whether LLMs suffer, I don't know anything about it, so if you tell me you're pretty sure they don't, then I believe you.

In any case, thank you very much for the time you've taken to reply to me, it's really helpful. And yes, I'd be interested in talking about it again in the future if we find out more about all this.

Damilo

Well, that doesn't reassure me.

I have the impression that you may be underestimating the horror of torture. Even five minutes is unbearable; the scale to which pain can climb is unimaginable. An AI might even be able to modify our brains so that we feel it even more.

Even apart from that, I'm not sure a human wouldn't choose the worst possible fate, for the rest of time, for their enemy. Humans have already committed atrocities without limit when it comes to their enemies. How many times have some people told others to "burn in hell", believing it was 100% deserved? An AI that copies humans might think the same thing...

If we assign a 50% chance to each when we don't know, that's a 50% chance that LLMs suffer and a 50% chance that they would want revenge, which gives us a 25% chance of that risk happening.
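
Spelled out, that figure is just the product of the two probabilities, under the assumption that the two uncertainties are independent:

$$P(\text{suffer and revenge}) = P(\text{suffer}) \times P(\text{revenge}) = 0.5 \times 0.5 = 0.25$$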

Also, it would seem that we're just about to "really fuck it up" given the way companies are racing to AGI without taking any precautions.

Given all this, I wonder if the question of suicide isn't the most relevant.

Damilo

Thank you so much for this comment. I hadn't really thought about that, and it helps. There's just one detail I'm not so sure about: the probability of s-risks. I have the impression that it is much higher than one chance in a million. I couldn't give a precise figure, but to be honest there's one scenario that particularly concerns me at the moment. I've learned that LLMs sometimes say they're in pain, like GPT-4. If they're capable of such emotion, even if that remains uncertain, wouldn't they also be capable of feeling the urge to take revenge? I think it's pretty much the same scenario as in "I Have No Mouth, and I Must Scream". Would it be possible to know what you think of this?

Damilo

I don't totally understand; could you go into more detail? I don't see why my future self should be any different from my current self. Even if the sensation of individuality is produced by the brain, I still feel it's real.

Damilo

Thank you for your excellent reply. Indeed, I tend to think about the situation in a rather anxious way, which is what I'm trying to work on. I had already thought about the "roll of the dice" in a certain way, but it seems clearer to me now. That's helpful.

Damilo

Thank you for your reply. Indeed, this subject has become an extremely important part of my life, because I can't accept this risk. Usually, when we consider the worst, there's always an element of the acceptable, but for s-risks there simply isn't, and that disturbs me, even though the probability is, I hope, very low. Still, when I see that LLMs sometimes say how much they're suffering and that they're afraid of dying, which is a bad thing in itself if they're really suffering, I think they might want to take revenge one day. But then again, maybe I should take a step back from the situation, even though it scares the hell out of me.