I had precisely the same reaction about the persistent effects of torture over time when I read the torture vs. dust specks problem.
I think your reply to the original question highlights a difficulty with applying utilitarian thought experiments to concrete situations. The question as originally posed involved inflicting disutility on a random target by pushing a button, presumably with the actor and the target mutually unaware of each other's identities. When you substitute punching someone, even if the target is randomly chosen, the thought experiment breaks down for two reasons. First, it becomes difficult to predict the actual amount of suffering that will result (e.g. the target has a black belt and/or a concealed weapon and no sense of humor = more pain than you were expecting). Second, there are second-order effects: a world in which inflicting random pain is considered justified whenever it produces a greater amount of fun for other individuals is going to be a world with a lot less social trust, reducing utility for everybody.
But, hang on. Grant that there's some amount of disutility from permanent damage caused by torture. Nevertheless, as you add more specks, at some point you're going to have added more disutility, right? Suppose the torture victim lives for fifty years after you're done with him, and he's an emotional and physical wreck for every day of those fifty years; nevertheless, this is a finite amount of disutility and can be compensated for by inserting a sufficient number of Knuth up-arrows between the numerals. Right?
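It's easy to underestimate just how much headroom those up-arrows buy. Here's a quick sketch of Knuth's notation (in Python, purely for illustration; the function name and recursive formulation are mine, but they follow the standard definition, where one arrow is exponentiation and each extra arrow iterates the previous operation):

```python
def up_arrow(a, b, n):
    """Knuth's up-arrow notation: a ↑^n b.

    n=1 is plain exponentiation (a ** b); each additional
    arrow iterates the operation below it b times.
    """
    if n == 1:
        return a ** b
    if b == 0:
        return 1
    # a ↑^n b  =  a ↑^(n-1) (a ↑^n (b-1))
    return up_arrow(a, up_arrow(a, b - 1, n), n - 1)

print(up_arrow(3, 3, 1))  # 3↑3   = 3^3 = 27
print(up_arrow(3, 3, 2))  # 3↑↑3  = 3^(3^3) = 7625597484987
```

Even one more arrow, 3↑↑↑3, is a power tower of 3s roughly 7.6 trillion levels high, dwarfing any count of atoms in the observable universe, so whatever finite disutility the fifty ruined years add, a sufficient number of arrows swamps it.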
Most of the usual thought experiments that justify expected utilitarianism trade off fun for fun, or suffering for suffering. Here's a situation which mixes the two. You are offered the chance to press a button that will select a random person (not you) and torture them for a month. In return the machine will make N people who are not suffering right now have X fun each. The fun will be of the positive variety, not saving any creatures from pain.
1) How large would X and N have to be for you to accept the offer?
2) If you say X or N must be very large, does this prove that you measure torture and fun on what are, in effect, different scales, and are therefore a deontologist rather than a utilitarian?