- Entropy in a system is maximized when all possible states are equally likely.
- Conversely, entropy is minimized when one outcome is certain to occur.
- Moving toward uniformity increases entropy, while moving away from uniformity decreases it.
- The entropy of a system with probabilities pi is H = -Σ pi log2(pi) (see the sketch after this list).
- If one probability is 1 and the rest are 0, the entropy is 0.
- When all probabilities are equal (pi = 1/n), the entropy is log2(n), its maximum value.
- Because the entropy function is concave, entropy decreases monotonically as the distribution moves away from uniformity.
- Conversely, entropy increases monotonically as the distribution approaches the uniform distribution, the global maximum.
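
To make these properties concrete, here is a minimal Python sketch (the `entropy_bits` helper is hypothetical, introduced only for illustration) that computes Shannon entropy in bits and checks the two extremes: a certain outcome yields 0, and a uniform distribution over n states yields log2(n).

```python
import math

def entropy_bits(probs):
    """Shannon entropy H = -sum(p * log2(p)), skipping zero probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Certain outcome: one probability is 1, the rest are 0 -> entropy is 0.
print(entropy_bits([1.0, 0.0, 0.0, 0.0]))      # 0.0

# Uniform over n = 4 states -> entropy is log2(4) = 2, the maximum.
print(entropy_bits([0.25, 0.25, 0.25, 0.25]))  # 2.0

# A non-uniform distribution falls strictly between the two extremes.
print(entropy_bits([0.4, 0.3, 0.2, 0.1]))      # ~1.846, below log2(4)
```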