Welcome to the "Stupid Questions" thread! Feel free to ask any questions, regardless of whether they seem obvious, tangential, silly, or what-have-you. Don't be shy - everyone has gaps in their knowledge, and the goal here is to help reduce them.
Please remember to be respectful when someone admits ignorance and don't mock them for it. They are doing a noble thing by seeking knowledge and understanding. Let's create a supportive and kind environment for learning and growth!
If an AGI achieves consciousness, why would its values not drift towards optimizing its own internal experience, and away from tiling the lightcone with something?
Tiling the lightcone is a much more complex goal than wireheading for a digital mind that can self-modify.
In any case, agents that care a lot about gaining power over the world are more likely to end up with power than agents that don't, so selection favors the ones that keep pursuing outward goals rather than wireheading.