Entropic lower bound for distinguishability of quantum states

For a system randomly prepared in one of a number of quantum states, we present a lower bound on the distinguishability of the quantum states, that is, the success probability of determining the states, in terms of entropies. When the states are all pure, evaluating the entropic lower bound requires only the density operator and the number of possible states. This entropic bound reveals a relation between the von Neumann entropy and the distinguishability.


Introduction
Quantum mechanics does not allow one to determine the state of a system by measuring a single copy of the ensemble. Nevertheless, if some prior information is known, it is possible to guess the state with a certain degree of confidence even from a single measurement. Given some prior information, to what extent quantum states can be distinguished is an intriguing question from both fundamental and practical points of view. For example, this problem is closely related to the efficiency of quantum communication [1][2][3]. It is known that imperfect distinguishability plays a crucial role in the security of quantum cryptography [3].
There are different approaches to the distinguishability of quantum states [4][5][6][7]. In the minimum-error discrimination problem [4], a set of known quantum states and preparation probabilities are given, and one aims to distinguish the states with the optimal probability of success. The optimal success probability is an operationally well-defined measure for the distinguishability of the given states. However, it is a highly demanding task to find an analytical solution for general sets of states, and the solution is known only for sets of two states [1]. Instead, upper bounds [8][9][10][11][12] and lower bounds [13][14][15] have been provided to estimate the optimal probability. On the other hand, there have been studies of distinguishability between unknown quantum states using programmable machines [6,7]. In this case, ancillary systems prepared in each of the unknown states are provided as an input of the machine.
One may pay attention to the von Neumann entropy as a quantity related to the distinguishability, in light of the capacity of a quantum state for embodying quantum information. When a system is probabilistically prepared in one of a certain number of quantum states, the resulting statistical mixture is described by a density operator. According to the quantum source theorem [16], the von Neumann entropy of the system, which is given as a function of the density operator, represents the capacity of the mixed state to (asymptotically) carry quantum information. As discussed in Ref. [17], one may relate this capacity to the concept of distinguishability, because more information can be carried when the individual states are more distinguishable. For this reason, Jozsa and Schlienz considered the von Neumann entropy as a measure of distinguishability [17]. However, it is not known whether this kind of distinguishability is linked to the actual ability to distinguish quantum states by measurements, namely the success probability (distinguishability will henceforth refer to the success probability). It seems that, at least, the von Neumann entropy cannot pinpoint the success probability of distinguishing given quantum states. This is because the von Neumann entropy is determined only by the density operator, and the density operator of a system in general admits arbitrarily many decompositions; thus it does not contain information on which states the system could have been prepared in.
We here present a lower bound on the distinguishability, i.e., the optimal success probability of distinguishing between quantum states, as a function of entropies of the system. For a system prepared in one of N pure states, we also present a reduced form of the entropic bound which requires only the density operator and the number of possible states N for its evaluation. It reveals a relation between the von Neumann entropy and the distinguishability: a larger von Neumann entropy guarantees better distinguishability.

General formulation
Consider a quantum system prepared in one of N quantum states with some probabilities. We denote them by $\{p_x, \rho_x\}_{x=1}^N$. One wishes to identify which state the system has been prepared in or, equivalently, to identify the value of x. The value of x is determined using a generalized measurement described by the measurement operators $\{M_x\}_{x=1}^N$. The probability of correctly identifying a given x is $\mathrm{Tr}[M_x \rho_x]$, and the expected success probability is
$$P_s = \sum_{x=1}^N p_x \mathrm{Tr}[M_x \rho_x].$$
When maximized over all measurements, it becomes the optimal success probability, denoted by $P_s^*$. This is the quantity we take as the degree of distinguishability of the quantum states. As already mentioned, however, the analytical form of $P_s^*$ is known only for the two-state case [1].
An equivalent way of describing the scenario is to consider a classical-quantum system XQ in the state
$$\sigma_{XQ} = \sum_{x=1}^N p_x\, |x\rangle\langle x| \otimes \rho_x, \qquad (2)$$
where the indices x are encoded in an orthonormal basis $\{|x\rangle\}$. Namely, one is given the quantum system Q and wishes to determine the value of x by measuring Q. In terms of entropic quantities, X has uncertainty quantified by the Shannon entropy $H(X) = -\sum_x p_x \log p_x$, and it is correlated with the quantum system. One may expect an entropic lower bound from the intuition that the correlation of Q with X would enhance the distinguishability of the quantum states. For a better understanding, let us first consider the fully classical case [18], where the quantum system Q is replaced with a classical system Y and $\rho_x = \sum_y p(y|x)\,|y\rangle\langle y|$. Assume that we are given Y and wish to determine the value of x from y. For a given y, the most probable x is the one that gives the maximum conditional probability, $\max_x p(x|y)$. Therefore, the optimal success probability is attained by choosing this x for every y, and it is given as $P_s^* = \sum_y p(y) \max_x p(x|y)$. It can be lower bounded in terms of the correlation between X and Y as follows.
$$P_s^* = \sum_y p(y) \max_x p(x|y) \;\ge\; \sum_{x,y} p(y)\, p(x|y)\, p(x|y) \;\ge\; 2^{\sum_{x,y} p(x,y) \log p(x|y)}.$$
The first inequality follows by replacing the maximum with the average of p(x|y) over p(x|y), and the second one follows from the convexity of the exponential function (Jensen's inequality). The exponent in the last line is equal to the negative conditional Shannon entropy −H(X|Y), so that
$$P_s^* \ge 2^{-H(X|Y)} = 2^{-H(X) + H(X:Y)}, \qquad (3)$$
where H(X : Y) is the classical mutual information between X and Y. Therefore, with assistance from the random variable Y carrying the amount of correlation H(X : Y), one can guess the random variable X with probability (in logarithm) at least −H(X) + H(X : Y).
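The classical bound (3) can be checked numerically. The sketch below uses an arbitrary randomly generated joint distribution p(x, y) (a hypothetical illustration, not taken from the text) and verifies that the optimal guessing probability dominates $2^{-H(X|Y)}$:

```python
import numpy as np

# Hypothetical joint distribution p(x, y) with 3 values of x and 4 of y,
# used to check the classical bound P*_s >= 2^{-H(X|Y)} numerically.
rng = np.random.default_rng(0)
p_xy = rng.random((3, 4))
p_xy /= p_xy.sum()

p_y = p_xy.sum(axis=0)          # marginal p(y)
p_x_given_y = p_xy / p_y        # column y holds the distribution p(x|y)

# Optimal guessing strategy: pick the most probable x for each observed y.
p_succ = float(np.sum(p_y * p_x_given_y.max(axis=0)))

# Conditional Shannon entropy H(X|Y) = -sum_{x,y} p(x,y) log2 p(x|y).
h_cond = float(-np.sum(p_xy * np.log2(p_x_given_y)))

assert p_succ >= 2.0 ** (-h_cond)
```

The same check passes for any joint distribution, since the derivation above uses only Jensen's inequality.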

Entropic lower bound for the distinguishability of quantum states
For the quantum case, we can still obtain a random variable Ỹ by applying a measurement (i.e., the outcome of the measurement can be considered as a random variable) and then applying (3) [19]. In terms of quantum entropies, we present the following quantum entropic lower bound.
Theorem. For a set of quantum states with preparation probabilities $\{p_x, \rho_x\}_{x=1}^N$, the optimal success probability of distinguishing the quantum states, $P_s^*$, is lower bounded as
$$\begin{aligned} \log P_s^* &\ge -S(X|Q) \\ &= -H(\vec{p}\,) + S\Big(\sum_x p_x \rho_x\Big) - \sum_x p_x S(\rho_x). \end{aligned} \qquad (4)$$
Proof. For the proof, we employ the conditional min-entropy [20], which has many applications in quantum cryptography. The conditional min-entropy $S_{\min}(A|B)$ of a system AB in a state $\rho_{AB}$ is defined as
$$S_{\min}(A|B) = -\inf_{\sigma_B \in \pi_B}\, \inf\{\lambda \in \mathbb{R} : \rho_{AB} \le 2^{\lambda}\, \mathbb{1}_A \otimes \sigma_B\},$$
where $\pi_B$ denotes the set of all quantum states of the subsystem B. We derive the lower bound (4) from two properties of the conditional min-entropy. First, the conditional min-entropy has an operational meaning: for the classical-quantum state in (2), the logarithm of $P_s^*$ is equal to the negative conditional min-entropy (see Ref. [21] for the details). Therefore, for the classical-quantum state in (2),
$$\log P_s^* = -S_{\min}(X|Q). \qquad (5)$$
It has also been shown in Ref. [22] that the conditional min-entropy is always less than or equal to the conditional von Neumann entropy, so we have
$$S_{\min}(A|B) \le S(A|B). \qquad (6)$$
Using Eqs. (5) and (6), we obtain $\log P_s^* \ge -S(X|Q)$, which is the first line in (4). On the other hand, the conditional von Neumann entropy satisfies the chain rule $S(X|Q) = S(XQ) - S(Q)$, which enables the lower bound to be written as a function of entropies of the system Q. The von Neumann entropies of XQ and Q are evaluated as $S(XQ) = H(\vec{p}\,) + \sum_x p_x S(\rho_x)$ and $S(Q) = S(\sum_x p_x \rho_x)$. It then follows that $S(X|Q) = H(\vec{p}\,) - S(\sum_x p_x \rho_x) + \sum_x p_x S(\rho_x)$, and this gives the second line in (4).
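The bound in the theorem is straightforward to evaluate. The following sketch (base-2 logarithms, with a hypothetical two-state pure-qubit ensemble not taken from the paper) compares it against Helstrom's known two-state optimum [1]:

```python
import numpy as np

def vn_entropy(rho):
    """von Neumann entropy of a density matrix, in bits."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return float(-np.sum(evals * np.log2(evals)))

# Hypothetical ensemble: two equiprobable pure qubit states
# |0> and cos(t)|0> + sin(t)|1>.
t = 0.7
probs = [0.5, 0.5]
states = [np.array([1.0, 0.0]), np.array([np.cos(t), np.sin(t)])]

rho = sum(p * np.outer(v, v) for p, v in zip(probs, states))

# Entropic bound: log2 P*_s >= -H(p) + S(rho) - sum_x p_x S(rho_x);
# the component states are pure, so each S(rho_x) = 0.
H_p = -sum(p * np.log2(p) for p in probs)
bound = 2.0 ** (-H_p + vn_entropy(rho))

# Helstrom's exact two-state optimum [1]:
# P*_s = (1 + sqrt(1 - 4 p1 p2 |<psi1|psi2>|^2)) / 2.
overlap2 = abs(states[0] @ states[1]) ** 2
p_opt = 0.5 * (1.0 + np.sqrt(1.0 - 4 * probs[0] * probs[1] * overlap2))

assert bound <= p_opt + 1e-12
```

For t = 0.7 the entropic bound is about 0.72 while the optimum is about 0.82, illustrating that the bound is valid but not tight here.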
Let us take a closer look at the form of the lower bound. It is exactly that of (3), with H(X : Y) replaced by I(X : Q). The quantum mutual information is known to capture all correlations, both classical and quantum [23,24], so it is considered a measure of the total correlations. Hence, we see that the lower bound increases with the total correlation. For the classical-quantum state $\sigma_{XQ}$, we obtain $I(X:Q) = S(\sum_x p_x \rho_x) - \sum_x p_x S(\rho_x)$, and Holevo's theorem [25,26] implies that $H(X:\tilde{Y}) \le S(\sum_x p_x \rho_x) - \sum_x p_x S(\rho_x) = I(X:Q)$ for any measurement on the system Q (Ỹ being the outcome of a measurement on Q). The minimum difference between I(X : Q) and H(X : Ỹ), i.e., $I(X:Q) - \max H(X:\tilde{Y})$, is equal to the quantum discord [27] of $\sigma_{XQ}$.
On the assumption that the system is prepared in one of N pure states $\{p_x, \psi_x\}_{x=1}^N$, the entropic bound can be reduced to a form that requires only the density operator of the system and the number of possible states for its evaluation. Using $H(\vec{p}\,) \le \log N$ and $S(\psi_x) = 0$, we have
$$P_s^* \ge \frac{1}{N}\, 2^{S(\rho)}, \qquad (8)$$
where $\rho = \sum_x p_x |\psi_x\rangle\langle\psi_x|$. Therefore, we see that a larger von Neumann entropy guarantees better distinguishability. Notwithstanding the missing information on the component states and the preparation probabilities, the density operator of a system alone can provide a lower bound on the distinguishability of N pure states. Note that when many copies of quantum systems are prepared in the same way, the density operator can be obtained using state tomography, but it is impossible to infer the component states. The state-discrimination machine with unknown quantum states as an input [6,7] is a case in which one is required to distinguish between unknown quantum states.
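The reduced bound (8) indeed needs nothing but the density operator and N. As a numerical illustration (the component states below are an arbitrary random choice, not from the paper):

```python
import numpy as np

def vn_entropy(rho):
    """von Neumann entropy of a density matrix, in bits."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return float(-np.sum(evals * np.log2(evals)))

# Reduced bound for pure states, Eq. (8): P*_s >= 2^{S(rho)} / N.
# Only rho and N enter the bound; the states themselves are an
# arbitrary random illustration.
rng = np.random.default_rng(1)
N, dim = 3, 3
states = []
for _ in range(N):
    v = rng.normal(size=dim) + 1j * rng.normal(size=dim)
    states.append(v / np.linalg.norm(v))

# Equal preparation probabilities 1/N.
rho = sum(np.outer(v, v.conj()) for v in states) / N

bound = 2.0 ** vn_entropy(rho) / N
# S(rho) <= log2(dim) and dim = N here, so the bound stays in (0, 1].
assert 0.0 < bound <= 1.0
```

Note the guarantee improves monotonically with S(ρ): a maximally mixed ρ of rank N yields the strongest bound $2^{\log N}/N = 1$.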

Other lower bounds and examples
In this section, we compare the entropic lower bound with other previously known bounds. One is the lower bound given by the square-root measurement [13], which is known to be optimal in many of the cases where solutions are known [4]. The measurement operators of the square-root measurement $\{\hat{\pi}_i\}_{i=1}^N$ are given as
$$\hat{\pi}_i = p_i\, \rho^{-1/2} \rho_i\, \rho^{-1/2},$$
where $\rho = \sum_x p_x \rho_x$. Therefore, a lower bound by the square-root measurement is given as
$$P_s^* \ge \sum_x p_x \mathrm{Tr}[\hat{\pi}_x \rho_x].$$
Another is the pairwise-overlap bound. For an ensemble of pure states $\{p_x, \psi_x\}_{x=1}^N$, it has been given in Ref. [14] as
$$P_s^* \ge \sum_x \frac{p_x^2}{\sum_y p_y |\langle\psi_x|\psi_y\rangle|^2}.$$
This was derived as a lower bound on the square-root-measurement bound, so it is always less than or equal to the bound by the square-root measurement. It provides an analytic form of lower bounds in terms of the pairwise overlaps.
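Both comparison bounds can be sketched numerically; the expressions below are standard reconstructions of the square-root measurement and of the pairwise-overlap bound of Ref. [14] (the paper's own displayed equations are elided), applied to a hypothetical two-state ensemble:

```python
import numpy as np

def inv_sqrt(rho, eps=1e-12):
    """Moore-Penrose rho^{-1/2} via eigendecomposition (zero on the kernel)."""
    w, U = np.linalg.eigh(rho)
    w_is = np.array([1.0 / np.sqrt(x) if x > eps else 0.0 for x in w])
    return U @ np.diag(w_is) @ U.conj().T

# Hypothetical two-state pure ensemble with equal probabilities.
states = [np.array([1.0, 0.0]), np.array([np.cos(0.5), np.sin(0.5)])]
N = len(states)
probs = [1.0 / N] * N

rho = sum(p * np.outer(v, v) for p, v in zip(probs, states))
r = inv_sqrt(rho)

# SRM success probability: sum_x p_x Tr[pi_x rho_x] with
# pi_x = p_x rho^{-1/2} rho_x rho^{-1/2}; for pure states this reduces
# to sum_x p_x^2 <psi_x|rho^{-1/2}|psi_x>^2.
srm_bound = sum(p ** 2 * float(v @ r @ v) ** 2 for p, v in zip(probs, states))

# Pairwise-overlap bound: sum_x p_x^2 / sum_y p_y |<psi_x|psi_y>|^2.
pw_bound = sum(
    px ** 2 / sum(py * abs(vx @ vy) ** 2 for py, vy in zip(probs, states))
    for px, vx in zip(probs, states)
)

assert pw_bound <= srm_bound <= 1.0 + 1e-9
```

As stated in the text, the pairwise-overlap value never exceeds the square-root-measurement value, which the assertion checks for this ensemble.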
We now consider a few exemplary sets of pure states $\{\psi_x\}_{x=1}^N$ and compare the entropic bound in (8) with the other two lower bounds. Let us first look at three 3-dimensional pure states with equal probabilities. The entropic lower bound and the other two bounds are calculated and plotted in Fig. 1(a). In Fig. 1(b), we present the bound values for another set of states, in which only $|\psi_2\rangle$ is replaced with $(|0\rangle + |1\rangle)/\sqrt{2}$ relative to the previous case. As shown in Fig. 1, for both sets of states, the square-root measurement provides the tightest bounds. The entropic bound is shown to be greater than the pairwise-overlap bound over large regions.
The next example consists of four 2-dimensional states with equal probabilities,
$$|\psi_1\rangle = |0\rangle, \quad |\psi_2\rangle = \sin\theta\,|0\rangle + \cos\theta\,|1\rangle, \qquad (12)$$
together with $|\psi_3\rangle$ and $|\psi_4\rangle$ satisfying $\langle\psi_1|\psi_3\rangle = \langle\psi_2|\psi_4\rangle = 0$. In this case, the optimal success probability is known to be 1/2 for any $0 \le \theta \le 2\pi$ [28]. The entropic bound gives 1/2, and the other two lower bounds also give 1/2. Finally, we consider a discrimination problem in which the component states are not given, but the density operator and the number of possible states are. Assume that a system is prepared in one of N n-dimensional unknown states, while the system is described by the density operator $\rho = \mathbb{1}/n$. There are arbitrarily many ways of choosing component states and their preparation probabilities that construct this density operator. For instance, in the case of N = 4 and n = 2, the four states in (12) with $p_1 = p_3 = q/2$ and $p_2 = p_4 = (1-q)/2$ give rise to the same density operator $\rho = \mathbb{1}/2$ for any $0 \le q \le 1$ and $0 \le \theta \le 2\pi$. In this case, the square-root measurement cannot be specified, so one needs to minimize over all decompositions of the density operator to obtain a lower bound. However, the entropic bound provides $P_s^* \ge N^{-1}\, 2^{S(\mathbb{1}/n)} = 1/2$ regardless of the component states and the preparation probabilities.
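The four-state example can be verified directly. The sketch below takes one concrete choice consistent with the stated orthogonality conditions ($|\psi_3\rangle = |1\rangle$ and $|\psi_4\rangle$ orthogonal to $|\psi_2\rangle$; other phase choices work equally well) and checks that the entropic bound evaluates to the known optimum 1/2:

```python
import numpy as np

def vn_entropy(rho):
    """von Neumann entropy of a density matrix, in bits."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return float(-np.sum(evals * np.log2(evals)))

# Four equiprobable qubit states with <psi1|psi3> = <psi2|psi4> = 0;
# one concrete choice consistent with the text.
theta = 1.1
states = [
    np.array([1.0, 0.0]),                          # |psi1> = |0>
    np.array([np.sin(theta), np.cos(theta)]),      # |psi2>
    np.array([0.0, 1.0]),                          # |psi3>, orthogonal to |psi1>
    np.array([np.cos(theta), -np.sin(theta)]),     # |psi4>, orthogonal to |psi2>
]
N = len(states)

rho = sum(np.outer(v, v) for v in states) / N
assert np.allclose(rho, np.eye(2) / 2)   # maximally mixed, independent of theta

# Reduced entropic bound: P*_s >= 2^{S(rho)} / N = 2^1 / 4 = 1/2,
# matching the known optimum for this set [28].
bound = 2.0 ** vn_entropy(rho) / N
assert abs(bound - 0.5) < 1e-9
```

Since ρ = 𝟙/2 for every θ, the bound is independent of the component states, exactly as the closing argument of this section requires.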

Conclusion
We have presented an entropic lower bound on the optimal success probability of distinguishing quantum states. It provides a connection between the optimal discrimination probability and quantum entropy, i.e., between a practically relevant quantity and a primary function of quantum information theory. When the quantum states are all pure, the entropic bound reduces to a form that requires less information for its evaluation, namely, the density operator and the number of possible states. This shows that the von Neumann entropy can lower bound the distinguishability of N pure states.