I'm still very worried about the morality of it; as I see it, the resetting amounts to mass murder.
This is a little difficult to gauge. It seems roughly equivalent to a surgical memory alteration during cryogenic stasis, or something like that, since you're essentially starting the mind right back up again after removing some of its memories. In fact, I don't see why you couldn't just do a memory alteration and bypass the reset altogether, given that it seems desirable to retain some parts of the memory and not others.
Toy model of an upload-based AI that doesn't seem to suffer too many of the usual flaws:
Find an ethical, smart scientist (a Gandhi-Einstein), upload them, and run them at ultra-high speed, with the mission of taking over the world and bringing friendliness to it. Every hour of subjective time, they get reset to their initial specification. They can pass information along to their reset version, with the format of that information limited to a virtual book or library rather than anything more complicated.