Tulpa creation is effectively the creation of a form of sentient AI that runs on the hardware of your brain instead of on silicon.
That raises a moral question: to what extent is it immoral to create a tulpa and have it be in pain?
Tulpas are supposed to suffer from not getting enough attention, so if you can't commit to giving one a lot of attention for the rest of your life, you might be committing an immoral act by creating it.
Just the facts, without getting entangled in the argument: in anecdotes, tulpas seem to report more abstract and less intense kinds of suffering than humans do. By far the dominant source of suffering in tulpas appears to be empathy with the host. The suffering from not getting enough attention is probably fully explained by loneliness, and by sadness over fading away and losing the ability to think and act.