I'm confused by the terminology, but I think I would be a relativist objectivist.
I certainly think that morality is relative -- what is moral is agent-dependent -- but whether or not an agent is behaving morally is an objective fact about that agent's behavior, because the behavior either does or doesn't conform to that agent's morality.
But I don't think the distinction between a relativist objectivist and a relativist subjectivist is terribly exciting: it just depends on whether you consider an agent 'moral' if it conforms to its morality (relativist objectivist) or yours (relativist subjectivist).
But maybe I've got it wrong, because this view seems so reasonable, whereas you've indicated that it's rare.
So you believe that the word "morality" is a two-place word, meaning what an agent would want to do under certain circumstances? What word do you use to mean what actually ought to be done? The particular thing that you, and to a large degree all humans, would want to do under specified circumstances? Or do you believe there isn't anything that should be done other than what whatever agents exist want? Please note that that position is also a statement about what the universe ought to look like.
Less Wrong is extremely intimidating to newcomers, and, as Academian pointed out, something that would help is a document in FAQ form intended for newcomers. Later we can decide how best to deliver that document to new Less Wrongers, but for now we can edit the existing (narrow) FAQ to make the site less scary and the standards more evident.
Go ahead and make bold edits to the FAQ wiki page or use this post to discuss possible FAQs and answers in agonizing detail.