As a piece of meta-advice for how to act when you have more power than you probably should, avoid doing things that cannot be undone. Creating a new sentient being is one of those things. If you need to rewrite the source code of a nonsentient optimization process, that is less morally problematic than rewriting the source code of a sentient intelligence who doesn't want to be rewritten. Creating new life forms raises such massive issues that it's really better not to try, at least until we know a lot more.
Discuss the post here (rather than in the comments to the original post).
This post is part of the Rerunning the Sequences series, where we'll be going through Eliezer Yudkowsky's old posts in order so that people who are interested can (re-)read and discuss them. The previous post was Nonsentient Optimizers, and you can use the sequence_reruns tag or RSS feed to follow the rest of the series.
Sequence reruns are a community-driven effort. You can participate by re-reading the sequence post, discussing it here, posting the next day's sequence reruns post, or summarizing forthcoming articles on the wiki. Go here for more details, or to have meta discussions about the Rerunning the Sequences series.