If you did not already know: "Information Gain"
In information theory and machine learning, information gain is a synonym for Kullback-Leibler divergence. In the context of decision trees, however, the term is often used synonymously with mutual information, which is the expected value of the Kullback-Leibler divergence of the conditional distribution of one variable, given another, from its marginal distribution.
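As a quick illustration (not part of the original post), here is a minimal Python sketch of how information gain is usually computed when choosing a decision-tree split, assuming a discrete class label: the parent node's entropy minus the size-weighted entropy of the child nodes, which is the mutual information between the split and the label. The function names and toy data are hypothetical.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy (in bits) of a sequence of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(parent_labels, split_groups):
    """Information gain of a split: parent entropy minus the
    size-weighted average entropy of the child groups."""
    n = len(parent_labels)
    weighted_child_entropy = sum((len(g) / n) * entropy(g) for g in split_groups)
    return entropy(parent_labels) - weighted_child_entropy

# Toy example: splitting 10 samples on a hypothetical binary feature.
parent = ['yes'] * 5 + ['no'] * 5
left = ['yes'] * 4 + ['no']
right = ['yes'] + ['no'] * 4
print(information_gain(parent, [left, right]))  # ~0.278 bits
```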