Given a random variable $X$ with probability distribution $P$, its entropy is given by

$$H(X) \equiv -\sum_{x} P(x) \log P(x).$$

From the definition of mutual information, one can say that the entropy is a measure of the self-information of $X$, since $H(X) = I(X; X)$.
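As a minimal sketch (not part of the original text), the entropy formula above can be evaluated directly for a finite distribution given as a list of probabilities; the function name `entropy` here is illustrative:

```python
import math

def entropy(p, base=2):
    """Shannon entropy H(X) = -sum_x P(x) log P(x).

    `p` is a sequence of probabilities summing to 1; terms with
    P(x) = 0 contribute nothing, since x log x -> 0 as x -> 0.
    """
    return -sum(px * math.log(px, base) for px in p if px > 0)

# A fair coin is maximally uncertain over two outcomes: 1 bit.
print(entropy([0.5, 0.5]))

# A deterministic outcome carries no information: 0 bits.
print(entropy([1.0]))
```

With `base=2` the result is in bits; using the natural logarithm (`base=math.e`) gives nats instead.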