I don't think it would be reasonable to develop such a roadmap at this point, since doing so would require relatively high certainty in a specific plan. It's not yet clear whether the best idea is to try to be the first to develop FAI, to pursue one of the proposals listed in the OP, or to do something else entirely, and it's not even clear how long it will take to figure that out. Given all that, such specific roadmaps seem impossible.
And this vagueness pattern-matched all too well to various failed undertakings, hence my skepticism.
Related Posts: A cynical explanation for why rationalists worry about FAI, A belief propagation graph
Lately I've been pondering the fact that while there are many critics of SIAI and its plan to form a team to build FAI, few of us seem to agree on what SIAI (or the rest of us) should do instead. Here are some of the alternative suggestions offered so far: