What is the best mathematical, intuitive explanation of why entropy is maximized by the uniform distribution? I'm looking for a short proof using the most elementary mathematics possible.
Please, no explanations like "because entropy was designed this way", etc.
https://en.wikipedia.org/wiki/Entropy_%28information_theory%29#Definition
A nonuniform distribution has two outcomes with different probabilities, say $p_i \neq p_j$. Replacing both with their average $(p_i + p_j)/2$ keeps a valid probability distribution and strictly increases the entropy, so no nonuniform distribution can be a maximizer.
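A minimal sketch of that averaging step, writing $H(p) = -\sum_k p_k \log p_k$ as in the linked definition and $f(x) = -x \log x$ (with the convention $0 \log 0 = 0$): only the two terms involving $p_i$ and $p_j$ change, and by the strict concavity of $f$ (its second derivative is $-1/x < 0$),

$$2\,f\!\left(\frac{p_i + p_j}{2}\right) > f(p_i) + f(p_j) \qquad \text{whenever } p_i \neq p_j,$$

so the replacement strictly increases $H$. Since $H$ is continuous on the compact probability simplex, a maximizer exists; by the above it cannot have two unequal probabilities, hence it must be the uniform distribution.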