In information theory and machine learning, information gain is a synonym for Kullback-Leibler divergence. However, in the context of decision trees, the term is sometimes used synonymously with mutual information, which is the expected value of the Kullback-Leibler divergence of the conditional probability distribution of one variable given the other from its unconditional (marginal) distribution.
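As an illustration, the following is a minimal sketch in Python (with hypothetical variable and function names, assuming discrete class labels and a discrete split attribute) of how the information gain of a candidate decision-tree split can be computed as the mutual information between the attribute and the label, i.e. the expected reduction in entropy:

```python
import numpy as np


def entropy(labels):
    """Shannon entropy (in bits) of a 1-D array of class labels."""
    _, counts = np.unique(labels, return_counts=True)
    probs = counts / counts.sum()
    return -np.sum(probs * np.log2(probs))


def information_gain(labels, attribute):
    """Information gain (mutual information) of splitting `labels` by `attribute`.

    Equals H(labels) minus the weighted average entropy of each branch,
    which is the expected KL divergence of the conditional label
    distribution (within a branch) from the prior label distribution.
    """
    total = len(labels)
    gain = entropy(labels)
    for value in np.unique(attribute):
        branch = labels[attribute == value]
        gain -= (len(branch) / total) * entropy(branch)
    return gain


# Hypothetical toy data: does "outlook" tell us anything about "play"?
play = np.array(["yes", "yes", "no", "no", "yes", "no"])
outlook = np.array(["sunny", "sunny", "rain", "rain", "overcast", "rain"])
print(information_gain(play, outlook))  # positive: the split is informative
```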