Are there in fact no ("other") religions which endorse making things better?
I know of religious denominations that advocate doing good works as a route to personal salvation, but I honestly can't think of any religious branch that advocates good works on the basis of "For goodness' sake, look at this place, it's seriously in need of fixing up."