Table 1. Data for skill, skill index, ROC information, information from a random ROC, Kullback-Leibler divergence [3], and Jensen-Shannon divergence [4]. The latter two quantities measure the distance in information space between the ROC derived from the filtered data and a random ROC curve (the diagonal line on the ROC diagram). While both divergences measure the difference in entropy between the two distributions, only the Jensen-Shannon divergence represents a true metric. The top four rows of the table are from the California data, whereas the bottom row is from the simulation discussed in the text.
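The two divergences named in the caption can be sketched as follows. This is a minimal illustration, not the authors' computation: the example distributions `p` and `q` are hypothetical stand-ins for a probability distribution derived from a skillful ROC curve and the uniform distribution corresponding to the random (diagonal) ROC.

```python
import numpy as np

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) in bits.

    Asymmetric: D(p || q) != D(q || p) in general, so it is not a metric.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0  # terms with p_i = 0 contribute zero
    return float(np.sum(p[mask] * np.log2(p[mask] / q[mask])))

def js_divergence(p, q):
    """Jensen-Shannon divergence: the symmetrized, smoothed KL divergence.

    Computed as the average KL divergence of p and q from their mixture m.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    m = 0.5 * (p + q)
    return 0.5 * kl_divergence(p, m) + 0.5 * kl_divergence(q, m)

# Hypothetical example: a non-uniform distribution (standing in for a
# skillful ROC) versus the uniform distribution (random, diagonal ROC).
p = np.array([0.4, 0.3, 0.2, 0.1])
q = np.array([0.25, 0.25, 0.25, 0.25])
print(kl_divergence(p, q))
print(js_divergence(p, q))
```

Unlike the KL divergence, the Jensen-Shannon divergence is symmetric and bounded, which is why it is the preferred choice when a true distance between distributions is required.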