Empathic inference is everyday, commonsense "mind-reading": an inference about another person's mental states that uses your own brain as a reference. By making your brain feel or think the way the other person does, you can emulate their mental state and predict their reactions1. This method is far less costly than modeling the other person's brain as a physical system and then calculating the expected reaction. It is also used in science; in cognitive psychology, for example, it is standard procedure to use a panel of reviewers who ascribe emotional states and expected reactions to other individuals. However, it does not reveal the underlying mechanisms behind those reactions and emotional states2.
Empathic inference is often wrongly applied to predict the behavior of non-human agents, a manifestation of anthropomorphism. In ancient myths and religions it was commonly applied to the forces of nature3; hurricanes, for example, were attributed intentionality and described as manifestations of God's wrath. Today, AGI is often subject to the same bias: some people assume that a superintelligent AGI would necessarily be benevolent or malevolent, based on "putting themselves in the AI's shoes". But putting oneself in an AI's shoes cannot be used to evaluate AI behavior, because an AI will not have human-like motivations unless they are explicitly programmed in4.