This article discusses the use of Jensen-Shannon divergence (JS divergence) in drift monitoring for machine learning systems. It explains the metric's grounding in information theory and how it compares to related metrics such as Kullback-Leibler (KL) divergence, which, unlike JS divergence, is asymmetric and unbounded. It also covers the practical advantages of using JS divergence for drift analysis in production ML systems.
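
As a quick illustration of that comparison, the sketch below contrasts the two metrics on a pair of hypothetical binned distributions (a reference histogram from training data and a production histogram); it assumes NumPy and SciPy are available and uses `scipy.spatial.distance.jensenshannon`, which returns the JS distance (the square root of the divergence).

```python
import numpy as np
from scipy.spatial.distance import jensenshannon

def kl_divergence(p, q):
    """KL divergence D_KL(p || q) in bits; asymmetric, and unbounded when q -> 0 where p > 0."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log2(p[mask] / q[mask])))

# Hypothetical reference (training) and production histograms over the same bins.
reference = np.array([0.30, 0.40, 0.20, 0.10])
production = np.array([0.10, 0.20, 0.40, 0.30])

# KL divergence is asymmetric: the two directions generally disagree.
print(kl_divergence(reference, production), kl_divergence(production, reference))

# jensenshannon returns the JS *distance*, so square it to get the divergence;
# with base=2 the divergence is symmetric and bounded in [0, 1].
js_div = jensenshannon(reference, production, base=2) ** 2
print(js_div)
```

The boundedness shown here is one reason JS divergence is convenient as a drift score: a fixed [0, 1] scale makes it easier to set alert thresholds across features than an unbounded KL value.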
