Arguably, many consequentialists already fall into this category. If you are unsettled by the image of a universe composed entirely of undifferentiated orgasmium, then it's a fair bet that happiness is not your (only) terminal value. To return to a common sentiment: "I don't want to maximize my happiness, I want to maximize my awesomeness."
That said, happiness usually holds some value for students of ethics. Even a world governed by a system that assigned it zero terminal value could conceivably still be pretty happy for instrumental reasons, since happiness makes humans more efficient in pursuing most of the things we can expect to be valued. Once you start creating non-human entities from the ground up, though, you would expect happiness to become rare, although not necessarily to be replaced with misery. (The paperclipper is such an entity.)
If it's worth saying, but not worth its own post (even in Discussion), then it goes here.
Notes for future OT posters:
1. Please add the 'open_thread' tag.
2. Check if there is an active Open Thread before posting a new one.
3. Open Threads should be posted in Discussion, and not Main.
4. Open Threads should start on Monday and end on Sunday.