'singleton', as I've seen it used, seems to refer to one possible Singularity in which a single AI absorbs everyone and everything into itself as one colossal entity. We'd probably consider it a Bad Ending.
See Nick Bostrom (2006), "What is a Singleton?"
A singleton is a more general concept than an intelligence explosion. The specific case of a benevolent AGI singleton, a.k.a. FAI, is not a bad ending. Think of it as Nature 2.0, a supervised universe, rather than a dictator.
We have a sample of only one modern human civilization, but there are some hints about how likely it was to happen.
Major types of hints are:
Data for:
Data against:
To me it looks like life, animals with nervous systems, Upper-Paleolithic-style Homo, language, and behavioral modernity were all extremely unlikely events (notice how long ago they occurred: very roughly ~3.5 bln, ~600 mln, ~3 mln, ~200k or ~600k, and ~50k years ago), except that language and behavioral modernity might have been linked with each other, if language was relatively late (Homo sapiens only) and behavioral modernity more gradual (with its apparent suddenness an artifact). Once we have behavioral modernity, modern civilization seems almost inevitable. Your interpretation might vary, of course, but at least now you have a lot of data to argue for your position, in a convenient format.
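For anyone who wants to poke at this quantitatively: below is a rough Monte Carlo sketch of a Carter/Hanson-style "hard steps" model, which is one way to formalize the "extremely unlikely events" reading. All the numbers in it (5 steps, a ~4.5 Gyr window, a 10 Gyr mean waiting time per step) are my own illustrative assumptions, not anything from the comment above. The point it illustrates: if each transition is individually improbable within the available window, then conditional on all of them happening in time, their completion times tend to land roughly evenly spaced across the window, which is at least consistent with the spread of dates listed above.

```python
import random

# Rough Monte Carlo of a "hard steps" model (Carter/Hanson style).
# All numbers here are illustrative assumptions, not data from the comment:
#   n_steps   -- how many "extremely unlikely" transitions have to happen
#   window    -- habitable window in Gyr within which they all must finish
#   mean_step -- expected waiting time for a single step, deliberately longer
#                than the window, so each step is individually improbable
# Conditional on all steps finishing inside the window, their completion
# times come out roughly evenly spaced.

def conditional_hard_step_times(n_steps=5, window=4.5, mean_step=10.0,
                                n_trials=1_000_000, seed=0):
    rng = random.Random(seed)
    successes = []
    for _ in range(n_trials):
        t, times = 0.0, []
        for _ in range(n_steps):
            t += rng.expovariate(1.0 / mean_step)  # exponential waiting time
            if t > window:
                break                              # ran out of habitable window
            times.append(t)
        if len(times) == n_steps:                  # all steps made it in time
            successes.append(times)
    return successes

if __name__ == "__main__":
    runs = conditional_hard_step_times()
    print(f"successful histories: {len(runs)} out of 1,000,000")
    if runs:
        avg = [sum(r[i] for r in runs) / len(runs) for i in range(len(runs[0]))]
        print("average step completion times (Gyr):",
              [round(t, 2) for t in avg])
        # When each step is much slower than the window, these should approach
        # window * k / (n_steps + 1), i.e. roughly even spacing.
```

Running it prints the handful of "successful histories" out of a million trials and their average step times, which should cluster near 0.75, 1.5, 2.25, 3.0, and 3.75 Gyr, i.e. roughly even spacing, despite each step taking ~10 Gyr on average in an unconditioned world.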