BillyPilgrim

Love the idea! Some things I noticed:

The story seems to unfold pretty much the same way no matter which AI personality you choose.

The human is a bit distant, a bit abstract, which leads to low emotional involvement. Maybe the human could have a generated name and a distinct personality? Or you could prompt the user for their name and have the AI refer to the human by that name.

In a similar vein: the AI comes across as a bit of an unreliable narrator. It will talk about restricting the human's freedom to increase their safety, but frame it as the good and necessary choice. I'm sure the human's diary would tell a vastly different story.

I would love to have choices. The more closely they relate to the dilemmas of AI alignment, the better. What if the human had a chance of dying, and the obituary described what a life they lived? Then, as the user/AI, you could feel regret about not keeping the human safer, or shrug it off and say: well, at least they lived a full life.