Jensen-Shannon divergence as a measure of distinguishability between mixed quantum states

Abstract
We discuss an alternative to relative entropy as a measure of distance between mixed quantum states. The proposed quantity extends the Jensen-Shannon divergence (JSD) between probability distributions to the realm of quantum theory. The JSD has several interesting properties: it arises naturally in information theory and, unlike the Kullback-Leibler divergence, it is symmetric, always well-defined, and bounded. We show that the quantum JSD shares with the relative entropy most of the physically relevant properties, in particular those required of a “good” quantum distinguishability measure. We relate it to other known quantum distances and suggest possible applications in the field of quantum information theory.
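As a minimal illustrative sketch (not taken from the paper), the classical JSD and its quantum extension can be computed from the standard definitions, JSD(p, q) = H((p+q)/2) − (H(p) + H(q))/2 with H the Shannon entropy, and its quantum analogue with H replaced by the von Neumann entropy S(ρ) = −Tr(ρ log ρ). Function names below are illustrative; entropies are in nats.

```python
import numpy as np

def shannon_entropy(p):
    # H(p) = -sum_i p_i log p_i (nats); 0 * log 0 is treated as 0
    p = np.asarray(p, dtype=float)
    nz = p[p > 0]
    return float(-np.sum(nz * np.log(nz)))

def classical_jsd(p, q):
    # JSD(p, q) = H((p + q)/2) - (H(p) + H(q))/2
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    m = 0.5 * (p + q)
    return shannon_entropy(m) - 0.5 * (shannon_entropy(p) + shannon_entropy(q))

def von_neumann_entropy(rho):
    # S(rho) = -Tr(rho log rho), evaluated from the eigenvalues of rho
    w = np.linalg.eigvalsh(rho)
    w = w[w > 1e-12]          # drop numerically-zero eigenvalues
    return float(-np.sum(w * np.log(w)))

def quantum_jsd(rho, sigma):
    # Quantum JSD: S((rho + sigma)/2) - (S(rho) + S(sigma))/2
    m = 0.5 * (rho + sigma)
    return von_neumann_entropy(m) - 0.5 * (von_neumann_entropy(rho)
                                           + von_neumann_entropy(sigma))
```

The quantity is manifestly symmetric in its arguments, vanishes when the two states coincide, and for two orthogonal pure states it attains its maximum value of log 2, illustrating the boundedness that the Kullback-Leibler divergence lacks.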
