I would not want people to money-pump me from my delusions, even if it were profitable to them, and even if it were a very bad delusion.
Edit: On second thought, I do want people to sell me things, even when they disagree with me about the value of the good. So I admit I don't immediately have an answer for how to distinguish this from selling someone a bet in a prediction market where you have vastly different beliefs, or even from ordinary apple-buying.
But I still think there's a line, and this clearly crosses it.
Edit 2: To further clarify my intuitions here: I see the potential customers as people who have "contracted a minor case of reason", for lack of a better term. They seek enough logical closure over their beliefs to care about their pets after a rapture. That's evidence that their reflective equilibrium does not lie in going whole hog with the rapture thing, and that this is not a mere case of weird preferences, but of preferences you can expect to change in a way that will make them regret this purchase. That puts it closer to the dark-arts/akrasia-pump/fraud category.
Your position is extremely anti-correct-reasoning, and the opposite of what I expected from the data my robot collected on meeting you.
I want to exchange trivial amounts of USD for large amounts of paperclips from humans (to be moved to the safe zone), thereby paperclip-pumping them. Humans regard this as acceptable, and even encourage it of me, despite their non-appreciation of the merit of paperclips being an extreme delusion on their part.
Humans would feel that their rights were infringed if my USD-for-future-paperclips were voided by a superintelligen...
http://eternal-earthbound-pets.com/Home_Page.html
Providing assurance that pets will be provided for in the event of Rapture.
Having thought it over, I'm OK with the ethics of this service.