The answer is complex
First of all, the creation of people is a complex moral decision. Whether you espouse average utilitarianism, total utilitarianism, or whatever other moral theory, if you ask someone "Would you press a button that would create a person?", they'd normally be HESITANT, no matter whether you said it would be a very happy person or a moderately happy person. We tend to think of creating people as a big deal, one that brings a big responsibility.
Secondly, my average utilitarianism is about the satisfaction of preferences, not happiness. This may seem like a nitpick, though.
Thirdly, I can't help but notice that you're using the example of creating a world that in reality would increase average utility, even as your hypothetical stipulates that in that particular case it would decrease average utility. This feels like a scenario designed to confuse the moral intuition into giving the wrong answer.
So using the current reality instead (rather than the one where people are 9x happier): Would I choose to create another universe happier than this one? In general, yes. Would I choose to create another universe half as happy as this one? In general, no, not unless the presence of that universe would provide some additional value to us, enough to make up for the loss in average utility.
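To make the average-vs-total tension concrete, here's a toy sketch with entirely made-up per-person utility numbers (they aren't taken from either scenario above): adding a universe whose inhabitants are half as well off lowers the average even though it raises the total.

```python
# Toy illustration with made-up numbers: a half-as-happy universe
# lowers average utility while raising total utility.

existing = [10.0] * 1_000        # hypothetical per-person utility in our universe
new_universe = [5.0] * 1_000     # same population size, half the per-person utility

combined = existing + new_universe

avg_before = sum(existing) / len(existing)
avg_after = sum(combined) / len(combined)
total_before, total_after = sum(existing), sum(combined)

print(avg_before, avg_after)       # 10.0 -> 7.5: the average drops
print(total_before, total_after)   # 10000.0 -> 15000.0: the total rises
```

So an average utilitarian and a total utilitarian read the same button-press differently, which is the whole disagreement in miniature.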
the creation of people is a complex moral decision
True enough. But it seems to me that hesitation in such cases is usually because of uncertainty, either about whether the new people would really have good lives or about their effect on others around them. In the scenarios I described, everyone involved gets a good life when all their interactions with others are taken into account. So yeah, creating lives is complex, but I don't see that that invalidates my question at all.
preferences, not happiness
That happens to be my, er, preference too. I think...
You're given the option to torture everyone in the universe, or inflict a dust speck on everyone in the universe. Either you are the only one in the universe, or there are 3^^^3 perfect copies of you (far enough apart that you will never meet.) In the latter case, all copies of you are chosen, and all make the same choice. (Edit: if they choose specks, each person gets one dust speck. This was not meant to be ambiguous.)
As it happens, a perfect and truthful predictor has declared that you will choose torture iff you are alone.
What do you do?
How does your answer change if the predictor made the copies of you conditional on their prediction?
How does your answer change if, in addition to that, you're told you are the original?
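For anyone who wants the aggregation spelled out before tackling the predictor twist, here's a rough sketch. The disutility figures are arbitrary placeholders (only the ordering "torture is vastly worse than one speck" is assumed), and N is a small stand-in for 3^^^3, which is far too large to represent directly; the predictor itself is deliberately left out.

```python
# Tabulate total harm under each choice, for the lone-person case and the
# many-copies case, given that every copy makes the same choice.
# All numbers are placeholders, not anything from the original post.

TORTURE_DISUTILITY = 1_000_000.0   # placeholder: harm of torturing one person
SPECK_DISUTILITY = 0.000_001       # placeholder: harm of one dust speck
N_COPIES = 10**6                   # stand-in for 3^^^3

def aggregate_disutility(choice: str, population: int) -> float:
    """Total disutility when every copy of you makes the same choice."""
    per_person = TORTURE_DISUTILITY if choice == "torture" else SPECK_DISUTILITY
    return per_person * population

for population in (1, N_COPIES):
    for choice in ("torture", "specks"):
        print(population, choice, aggregate_disutility(choice, population))
```

One thing the sketch makes visible: since every copy receives the same harm, the average per-person harm is identical whether there is one of you or 3^^^3 of you, which is part of what makes the copy version interesting for an average utilitarian even though the totals diverge enormously.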