Presumably pain works in some specific way (or some relatively narrow distribution of ways), so there's probably a maximum amount of pain that could be experienced in any circumstance. Real-life animals can and do die of shock, which might be what exceeding some maximum 'pain' threshold looks like.
But suffering seems much more general than pain. Creating (e.g. simulating) a consciousness or mind and torturing it increases global suffering. Creating multiple minds and torturing them would increase suffering further.
What seems to be different about suffering (to at least some degree – real-life beings also seem to suffer sympathetically from others' pain) is that additional suffering can, in effect, be created simply by informing other minds of suffering they weren't previously aware of. Some suffering is created by knowledge or belief, i.e. it spreads via information. (This post has a good perspective one can adopt to avoid being 'mugged' by this kind of information.)
The creation or simulation of minds is presumably bounded by physical constraints, so there's probably some maximum amount of possible suffering.
Are there possible minds that can experience an infinite amount of pain or suffering? I think not. At a 'gears' level, it doesn't seem like pain or suffering could literally ever be infinite, even over an infinite span of time, tho I admit that intuition rests on further claims that also seem true, e.g. that there's a finite amount of matter in the universe, and that minds cannot exist for an infinite amount of time (e.g. because of the eventual heat death of the universe).
But even assuming minds can exist for an infinite amount of time, or that minds could be arbitrarily 'large', I'd still expect the amount of pain or suffering any one mind could experience at any given moment to be finite. Under those same assumptions (or similar ones), though, the total amount of pain or suffering experienced (summed over minds and over time) could be infinite – e.g. a mind experiencing even a small, constant amount of pain forever would accumulate an unbounded total.
My point about 'capping' the (dis)utility of pain was that one – a person or mind that isn't itself a malevolent (super-)intelligence – wouldn't want to be 'held hostage' by something like a malevolent super-intelligence that controls some other mind capable of experiencing 'infinite pain'. You probably wouldn't want to sacrifice everything for a tiny chance of preventing the torture of a single being, even if that being were capable of experiencing infinite pain.
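To make the 'capping' point concrete, here's a toy expected-value sketch (the specific numbers and the cap U_max are mine, purely for illustration): if disutility is capped at some finite U_max, then a 10^-9 chance of a single being suffering maximal torture contributes at most 10^-9 × U_max of expected disutility, which ordinary considerations can outweigh. If instead that being's pain can be literally infinite, the expected disutility is 10^-9 × ∞, which swamps every other term no matter how small the probability – i.e. you've been 'mugged'.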
I don't think it's possible, or even makes sense, for a mind to experience an infinite amount/level/degree of pain (or suffering). Infinite pain might be possible over an infinite amount of time, but that seems (at least somewhat) implausible, e.g. given that the universe doesn't seem to be infinite, seems to contain a finite amount of matter and energy, and seems likely to suffer an eventual heat death (and thus be unable to support life or computation indefinitely).
Even assuming that a super-intelligence could rewire human minds just to increase the amount of pain they can experience, a reasonable generalization is a super-intelligence creating (e.g. simulating) new minds (human or otherwise) for the same purpose. That seems to me to be the same (general) moral/ethical catastrophe as your hypothetical(s).
But I don't think these hypotheticals really alter the moral/ethical calculus with respect to our decisions; i.e. the possibility of torturing minds that can experience infinite pain doesn't automatically imply that we should avoid developing AGI or super-intelligences entirely. (For one, if infinite pain is possible, infinite joy/happiness/satisfaction might be too.)