Complex Entropy and Its Application in Decision-Making for Medical Diagnosis

In decision-making systems, how to measure uncertain information remains an open issue, especially for information processing modeled on complex planes. In this paper, a new complex entropy is proposed to measure the uncertainty of a complex-valued distribution (CvD). The proposed complex entropy is a generalization of Gini entropy that has a powerful capability to measure uncertainty. In particular, when a CvD reduces to a probability distribution, the complex entropy will degrade into Gini entropy. In addition, the properties of complex entropy, including the nonnegativity, maximum and minimum entropies, and boundedness, are analyzed and discussed. Several numerical examples illuminate the superiority of the newly defined complex entropy. Based on the newly defined complex entropy, a multisource information fusion algorithm for decision-making is developed. Finally, we apply the decision-making algorithm in a medical diagnosis problem to validate its practicability.

One successful alternative uncertain information measure is Gini entropy [24], which is simple to implement and has received substantial attention from researchers. Inspired by Gini entropy [24], Yager and Petry [25] recently devised an intelligent quality-based approach for fusing multisource information [26]. Bouhamed et al. [27] extended it to combine multisource possibilistic information. Later, researchers generalized the Gini entropy-based information quality to belief functions to measure uncertainty; the method of Li et al. [28,29] is an example that has been well applied in various fields. Although Gini entropy [24] can be used to measure uncertainty, it applies only to probability distributions. The complex-valued model has strong expressive properties, especially for modeling uncertainty [30,31]. Therefore, complex-valued models have been widely investigated and applied in various fields, such as medical diagnosis [32], decision-making [33,34], and predicting interference effects [35,36]. Given that the complex-valued representation model is well suited for certain applications, how can Gini entropy be generalized to complex planes to provide a more powerful capability to measure uncertainty?
In this paper, to address the abovementioned issue, a generalized entropy is proposed for measuring the uncertainty of CvDs. When CvDs reduce to probability distributions, the newly defined entropy degrades into Gini entropy. Specifically, vector expressions of CvDs are first proposed to model knowledge in complex planes. After that, a novel complex entropy called Xiao entropy is defined to measure the uncertainties of CvDs. Then, the properties of complex entropy, including nonnegativity, maximum and minimum entropies, and boundedness, are analyzed and discussed. Based on the newly defined complex entropy, a multisource information fusion algorithm for decision-making is devised. Finally, we apply the decision-making algorithm in a medical diagnosis problem to verify its practicability. The contributions of this work are summarized as follows:
(i) A novel complex entropy, called Xiao entropy, which has the properties of nonnegativity, maximum and minimum entropies, and boundedness, is defined for the CvD.
(ii) The multisource information fusion algorithm based on the newly defined entropy can be well applied to support decision-making.
(iii) This study provides a new perspective of complex-valued representation for uncertain information and offers a promising and generalized solution in terms of uncertainty measurements.
The preliminaries are introduced in Section 2. In Section 3, CvD vectors are defined. In Section 4, a complex entropy is defined to measure the uncertainty of CvDs. In Section 5, several numerical examples illustrate the properties of complex entropy. In Section 6, an algorithm for decision-making is designed on the basis of the newly defined entropy; then, the decision-making algorithm is applied to a medical diagnosis. Section 7 concludes this work.

Preliminaries
In this section, some essential concepts of uncertainty measures related to this work are introduced.

Vector Representation of CvD
Modeling uncertainty has attracted a substantial amount of attention in a variety of areas [38]. Many methods have been proposed and applied in various fields, such as failure and risk analysis [39], classification [40,41], information fusion [42], and decision-making [43,44]. Here, a vector representation of CvD is presented for expressing uncertainty in a complex plane. In addition, the norm of CvD is also defined and analyzed.
Definition 3 (CvD vector). Let C_k be a CvD vector on the frame of discernment (FOD) Ψ = {ψ_1, ..., ψ_j, ..., ψ_n}, denoted by

C_k = [c_k1, ..., c_kj, ..., c_kn],

where c_kj is the complex value with regard to the occurrence of ψ_j:

c_kj = a_kj + b_kj i,

where a_kj and b_kj are real numbers and i is the imaginary unit, satisfying

|c_kj| ∈ [0, 1] and Σ_{j=1}^{n} |c_kj| = 1,

where |c_kj| = sqrt(a_kj² + b_kj²) is the modulus of c_kj. Equation (4) is also expressed in polar form as

c_kj = r_kj e^{i θ_kj},

where r_kj = |c_kj| ≥ 0 and θ_kj ∈ [−π, π] denotes the angle (phase) of c_kj.
Definition 4 (norm of CvD). Let C_k be a CvD vector on FOD Ψ. The norm of the CvD vector, ‖C_k‖, is defined by

‖C_k‖ = sqrt(Σ_{j=1}^{n} |c_kj|²) = sqrt(Σ_{j=1}^{n} (a_kj² + b_kj²)).

Considering the properties of the CvD vector in Definition 3, where for each c_kj, a_kj² + b_kj² ∈ [0, 1] and Σ_{j=1}^{n} |c_kj| = 1, we observe the following:

Case 1. The maximal value of ‖C_k‖, denoted max[‖C_k‖] = 1, is attained when c_kj = 1 for one j and c_kj = 0 for all other j.

Case 2. When the c_kj degrade into real numbers, i.e., c_kj = a_kj (b_kj = 0), the minimum value of ‖C_k‖, denoted min[‖C_k‖] = 1/√n, is attained when all c_kj = 1/n.

In summary, ‖C_k‖ has a maximum value of 1 when c_kj = 1 for one ψ_j and all other c_kj = 0, and a minimal value of 1/√n when all c_kj = 1/n.
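The two extreme cases can be checked numerically. The following sketch computes ‖C_k‖ = sqrt(Σ_j |c_kj|²) and reproduces the stated extremes (maximum 1 for a one-hot CvD, minimum 1/√n for uniform moduli); the example vectors are illustrative choices, not data from the paper.

```python
import numpy as np

def cvd_norm(C):
    """Norm of a CvD vector: ||C|| = sqrt(sum_j |c_j|^2). Moduli must sum to 1."""
    mags = np.abs(np.asarray(C, dtype=complex))
    assert np.isclose(mags.sum(), 1.0), "moduli of a CvD must sum to 1"
    return np.sqrt(np.sum(mags ** 2))

n = 4
one_hot = [1.0, 0.0, 0.0, 0.0]   # all mass on one element -> maximal norm
uniform = [1.0 / n] * n          # evenly spread moduli    -> minimal norm
print(cvd_norm(one_hot))   # 1.0, i.e., max[||C||]
print(cvd_norm(uniform))   # 0.5 = 1/sqrt(4), i.e., min[||C||]
```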
Definition 5 (complex entropy). Let C_k be a CvD vector on FOD Ψ. The complex entropy of C_k, denoted E_X(C_k), is defined as

E_X(C_k) = Σ_{j=1}^{n} |c_kj|(1 − |c_kj|) = 1 − Σ_{j=1}^{n} |c_kj|².

When a CvD reduces to a probability distribution, where b_kj = 0 and c_kj = a_kj, then E_X(C_k) can be expressed as

E_X(C_k) = Σ_{j=1}^{n} a_kj(1 − a_kj),

which is equal to equation (1).
Property 1. E X is a generalized model of Gini entropy [24]. Specifically, when a CvD becomes a probability distribution, E X degrades into Gini entropy [24].
Proof. According to equation (13), because b_kj = 0 and c_kj = a_kj, we have |c_kj| = a_kj, such that E_X(C_k) = Σ_{j=1}^{n} a_kj(1 − a_kj), which is exactly Gini entropy.

Remark 1. Notably, the larger E_X(C_k) is, the larger the uncertainty in CvD C_k is, which results in lower certainty.
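The reduction to Gini entropy can be verified directly. The sketch below assumes the entropy form E_X(C) = 1 − Σ_j |c_j|², which is consistent with the numerical values reported later (e.g., 0.0198 at r = 0.01 in Example 2); the test distribution is an illustrative choice.

```python
import numpy as np

def xiao_entropy(C):
    """Complex entropy E_X(C) = 1 - sum_j |c_j|^2 for a CvD with sum_j |c_j| = 1."""
    mags = np.abs(np.asarray(C, dtype=complex))
    assert np.isclose(mags.sum(), 1.0), "moduli of a CvD must sum to 1"
    return 1.0 - np.sum(mags ** 2)

def gini_entropy(P):
    """Gini entropy G(P) = 1 - sum_j p_j^2 for a probability distribution."""
    P = np.asarray(P, dtype=float)
    return 1.0 - np.sum(P ** 2)

P = [0.3, 0.2, 0.5]              # a real-valued CvD, i.e., a probability distribution
print(xiao_entropy(P))           # equals the Gini entropy below
print(gini_entropy(P))           # 1 - (0.09 + 0.04 + 0.25) = 0.62
```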
Definition 6 (the completely certain CvD). CvD C_k is completely certain when E_X(C_k) = 0.

Definition 7 (the completely uncertain CvD).
CvD C_k is completely uncertain when E_X(C_k) = 1.
Theorem 1. E_X has the desired properties of an entropy of the CvD, including nonnegativity, maximum and minimum entropies, and boundedness.
Property 2. Let C_k be an arbitrary CvD on an FOD with n elements; then 0 ≤ E_X(C_k) ≤ 1, where E_X(C_k) = 0 when c_kj = 1 for one j and all other c_kj = 0, and E_X(C_k) attains its maximum for fixed n, 1 − 1/n, when all |c_kj| = 1/n.

Proof. The proofs follow directly from the bounds on ‖C_k‖ in Definition 4, since E_X(C_k) = 1 − ‖C_k‖².

Numerical Examples
In this section, several examples are presented to illustrate the entropy for CvD.
In Example 1, C changes as parameter x varies, where x is set within [0,1], such that C reduces to a probability distribution.
By leveraging the Gini entropy G and the Xiao entropy E_X, the corresponding entropy measures are shown in Figure 1.
Clearly, E_X is the same as the Gini entropy G, which verifies that when a CvD reduces to a probability distribution, E_X degrades into Gini entropy. Additionally, when x = 0 or x = 1, such that C = [0, 1] or C = [1, 0], G(C) and E_X(C) achieve the minimum entropy of 0, because in this case C is a completely certain CvD. By contrast, only when x = 0.5, such that C = [0.5, 0.5], do G(C) and E_X(C) achieve the maximum entropy of 0.5.
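Example 1 can be reproduced with a short script, assuming the entropy form E_X(C) = 1 − Σ_j |c_j|² and the two-element CvD C = [x, 1 − x] described above.

```python
import numpy as np

def xiao_entropy(C):
    """E_X(C) = 1 - sum_j |c_j|^2."""
    mags = np.abs(np.asarray(C, dtype=complex))
    return 1.0 - np.sum(mags ** 2)

for x in (0.0, 0.25, 0.5, 1.0):
    C = [x, 1.0 - x]                      # probability distribution (real CvD)
    print(x, round(xiao_entropy(C), 4))
# entropy is 0 at x = 0 and x = 1 (completely certain) and peaks at 0.5 when x = 0.5
```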
In Example 2, C changes as modulus r varies, where r is set within [0.01,0.99].
Because C consists of complex numbers, Gini entropy is not applicable. The result of the E_X entropy is shown in Figure 2. As r increases from 0.01 to 0.5, E_X increases from 0.0198 to 0.5, while as r increases from 0.5 to 0.99, E_X gradually decreases back to 0.0198. This result shows a trend similar to that of the entropy measures in Figure 1.
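Example 2 can likewise be checked numerically. The sketch assumes E_X(C) = 1 − Σ_j |c_j|² and a two-element CvD with moduli r and 1 − r; the phases chosen here are arbitrary, since only the moduli enter the entropy.

```python
import numpy as np

def xiao_entropy(C):
    """E_X(C) = 1 - sum_j |c_j|^2."""
    mags = np.abs(np.asarray(C, dtype=complex))
    return 1.0 - np.sum(mags ** 2)

for r in (0.01, 0.5, 0.99):
    # complex CvD with moduli r and 1 - r; phases are arbitrary illustrative values
    C = [r * np.exp(1j * 0.3), (1.0 - r) * np.exp(-1j * 1.1)]
    print(r, round(xiao_entropy(C), 4))
# 0.01 -> 0.0198, 0.5 -> 0.5, 0.99 -> 0.0198, matching the values reported for Figure 2
```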
A comparison of the results in Examples 1 and 2 shows that the proposed E X entropy is a more capable uncertainty measure than Gini entropy.
In Example 3, we set six different scales of α, namely, α ∈ [1, 10], [1, …]. Thus, when a CvD becomes a completely certain distribution, i.e., a probability distribution in which c_kj = a_kj = 1 for one j and the other c_kj = 0, it has a minimum entropy of min[E_X(C)] = 0. On the other hand, when α ⟶ +∞, max[E_X(C)] approaches 1, because in this case C is completely uncertain.
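The asymptotic behavior is easy to see for a CvD with uniform moduli on n elements: E_X = 1 − 1/n, which tends to 1 as n grows. A minimal sketch, assuming E_X(C) = 1 − Σ_j |c_j|² and arbitrary phases:

```python
import numpy as np

def xiao_entropy(C):
    """E_X(C) = 1 - sum_j |c_j|^2."""
    mags = np.abs(np.asarray(C, dtype=complex))
    return 1.0 - np.sum(mags ** 2)

for n in (2, 10, 100, 10000):
    # uniform moduli 1/n with arbitrary phases: E_X = 1 - 1/n
    C = np.full(n, 1.0 / n) * np.exp(1j * np.linspace(0, 1, n))
    print(n, round(xiao_entropy(C), 6))
# 0.5, 0.9, 0.99, 0.9999 -> the maximum entropy approaches 1 as n grows
```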

Example 4.
Assume that there is a CvD C on the FOD Ψ = {ψ_1, ..., ψ_j, ..., ψ_x}. In Example 4, C changes as r and ξ vary. Here, we set r within [0, 1] and ξ within [−1, 1], as shown in Figure 4(a). The entropy measure E_X(C) is presented in Figure 4(b), which shows how variations in the modulus and angle of the elements in C impact E_X(C).
E_X(C) changes as r varies, whereas the variation in the angle θ = ξπ has no effect on E_X(C).
This result is reasonable because r_kj² = |c_kj|² = a_kj² + b_kj² is related to the modulus r rather than to θ.
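The phase invariance can be demonstrated directly: sweeping ξ (and hence θ = ξπ) while holding the modulus r fixed leaves the entropy unchanged. A sketch assuming E_X(C) = 1 − Σ_j |c_j|² and an illustrative two-element CvD:

```python
import numpy as np

def xiao_entropy(C):
    """E_X(C) = 1 - sum_j |c_j|^2."""
    mags = np.abs(np.asarray(C, dtype=complex))
    return 1.0 - np.sum(mags ** 2)

r = 0.3
for xi in (-1.0, -0.25, 0.0, 0.7, 1.0):
    theta = xi * np.pi                       # the angle theta = xi * pi
    C = [r * np.exp(1j * theta), 1.0 - r]    # only the first element's phase changes
    print(xi, round(xiao_entropy(C), 6))
# identical value 2*r*(1 - r) = 0.42 for every phase: E_X depends only on the moduli
```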

Example 5. Consider Example 2.
In Example 5, r is set within [0, 1]. We compare the proposed E_X with related works, namely, Pennecchi and Oberto's uncertainty measures 1 − 〈‖c→‖〉_a and 1 − 〈‖c→‖〉_b. By comparing the results of E_X, 1 − 〈‖c→‖〉_a, and 1 − 〈‖c→‖〉_b shown in Figure 5, we can see that 1 − 〈‖c→‖〉_a remains at 0.5 and cannot accurately measure the uncertainty. However, 1 − 〈‖c→‖〉_b provides a better measure of the uncertainty than 1 − 〈‖c→‖〉_a, because as r increases from 0.01 to 0.5, it increases from 0.2929 to 0.4208, while as r increases from 0.5 to 0.99, it gradually decreases back to 0.2929. Nevertheless, the proposed E_X has better discrimination as an uncertainty measure and is superior to the other methods.

Algorithm and Application
How to deal with decision-making problems has attracted much attention [61–65], especially for information expressed with complex values [66,67]. In this section, we first design a multisource information fusion algorithm for decision-making based on the proposed entropy. Then, we apply the decision-making algorithm in medical diagnosis to validate its practicability.

A Multisource Information Fusion Algorithm for Decision-Making.
Problem statement: let Ψ be an FOD with a set of objectives {ψ_1, ..., ψ_j, ..., ψ_n} to be recognized. Suppose there are t CvDs, C_k = [c_k1, ..., c_kj, ..., c_kn] with c_kj = a_kj + b_kj i. The decision-making algorithm identifies the target from {ψ_1, ..., ψ_j, ..., ψ_n} by combining the multiple CvDs. The specific steps are given as follows:
Step 1: For 1 ≤ k ≤ t, the corresponding entropy of CvD C_k, denoted E_X(C_k), is generated by equation (22): E_X(C_k) = 1 − Σ_{j=1}^{n} |c_kj|².
Step 2: For 1 ≤ k ≤ t, the corresponding information volume of CvD C_k, denoted IV(C_k), is measured by equation (23).
Step 3: The information volume IV(C_k) is normalized by equation (24): IV̄(C_k) = IV(C_k) / Σ_{m=1}^{t} IV(C_m).
Step 4: According to the normalized information volumes, the weighted average CvD, denoted C̄, is defined by equation (25): C̄ = Σ_{k=1}^{t} IV̄(C_k)|C_k|, where |C_k| = [|c_k1|, ..., |c_kj|, ..., |c_kn|] and |c_kj| = sqrt(a_kj² + b_kj²).
Step 5: C̄ is fused via the complex Dempster's combination rule [68] t − 1 times, yielding C̄^{t−1} by equation (26).
Step 6: For C̄^{t−1}(ψ_j), the ψ_δ with the maximum absolute value is chosen by equation (27).
Step 7: Let λ be a threshold value for decision-making, which can be set in advance according to the specific application. If C̄^{t−1}(ψ_δ) ≥ λ, then ψ_δ is identified as the target; if C̄^{t−1}(ψ_δ) < λ, the target cannot be determined. The corresponding pseudocode is given in Algorithm 1.
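Steps 1–4 can be sketched compactly. Since the extracted text omits equations (22)–(25), this sketch assumes an information-volume form IV(C_k) = e^{−E_X(C_k)} (lower entropy gives larger volume), which is a common choice in entropy-weighted fusion but may differ from the paper's equation (23); the input CvDs are illustrative.

```python
import numpy as np

def xiao_entropy(C):
    """Step 1: E_X(C) = 1 - sum_j |c_j|^2 (equation (22))."""
    mags = np.abs(np.asarray(C, dtype=complex))
    return 1.0 - np.sum(mags ** 2)

def weighted_average_cvd(cvds):
    """Steps 1-4: entropy -> information volume -> normalized weights -> weighted average."""
    cvds = [np.asarray(C, dtype=complex) for C in cvds]
    E = np.array([xiao_entropy(C) for C in cvds])   # Step 1
    IV = np.exp(-E)      # Step 2: ASSUMED form of IV(C_k); the paper's eq. (23) may differ
    w = IV / IV.sum()                                # Step 3: normalized information volumes
    mags = np.array([np.abs(C) for C in cvds])       # |C_k| = [|c_k1|, ..., |c_kn|]
    return (w[:, None] * mags).sum(axis=0)           # Step 4: weighted average CvD

cvds = [
    [0.6, 0.2, 0.2],                                 # illustrative expert evaluations
    [0.5 * np.exp(1j * 0.4), 0.3, 0.2],
    [0.7, 0.2 * np.exp(-1j * 0.9), 0.1],
]
C_bar = weighted_average_cvd(cvds)
print(C_bar, C_bar.sum())   # a convex combination, so the moduli still sum to 1
```

Because each |C_k| sums to 1 and the weights are convex, the averaged CvD remains a valid distribution of moduli, ready for the Dempster-style fusion of Step 5.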

Application in Medical Diagnosis.
In this section, the proposed decision-making method is applied in medical diagnosis to demonstrate its practicability. The scenario and data of the application are based on [32]. Consider a medical diagnosis problem in which a patient P suffers from one of the diseases in D = {D_1: viral fever, D_2: malaria, D_3: typhoid, D_4: stomach problem}. To clarify which disease the patient may have, five experts diagnose the patient's condition, and the evaluation data are modeled as the CvDs in Table 1. The threshold λ is set to 0.80 for this application. We diagnose patient P by integrating the evaluations from the five experts. Then, the decision-making algorithm is applied to the medical diagnosis by the following steps:
Step 1: The entropy values of the CvDs C_E_k (1 ≤ k ≤ 5) are calculated by equation (22), as shown in Table 2.
Step 2: The information volumes of the CvDs C_E_k (1 ≤ k ≤ 5) are calculated by equation (23), as shown in Table 2.
Step 3: The normalized information volumes are calculated by equation (24), as shown in Table 2.
Step 4: The weighted average CvD C̄ is generated by equation (25), as shown in Table 3.
Step 5: By gradually fusing the weighted average CvD 4 times, the corresponding results are generated by equation (26), as shown in Table 3.
Step 6: The maximal absolute value of C̄^{t−1}(D_j) is highlighted in Table 3.
Step 7: Patient P is diagnosed as most likely suffering from disease D_1.

Algorithm 1:
Input: An FOD Ψ = {ψ_1, ..., ψ_j, ..., ψ_n}; a set of CvDs C = {C_1, ..., C_k, ..., C_t}
(1) for 1 ≤ k ≤ t do
(2)   Calculate the CvD entropy E_X(C_k) by equation (22)
(3)   Measure the CvD information volume IV(C_k) by equation (23)
(4) end
(5) Calculate the normalized information volume IV̄(C_k) by equation (24)
(6) Generate the weighted average CvD C̄ by equation (25)
(7) Obtain the fused C̄^{t−1} via the complex Dempster's combination rule by equation (26)
(8) Choose the maximum absolute value C̄^{t−1}(ψ_δ) by equation (27)

6.3. Discussion. As shown in Table 1, the evaluations of experts E_1, E_3, E_4, and E_5 all support viral fever: disease D_1. However, |C_E_2(D_2)| = 0.6 supports malaria: disease D_2. Hence, C_E_2 conflicts with C_E_1, C_E_3, C_E_4, and C_E_5. Using Table 1 alone, it is difficult to make an accurate decision because a conflict exists among the experts; it is necessary to fuse the data collected from the different experts to better support decision-making. There are five evaluations from five experts. To illuminate the effectiveness of the proposed decision-making algorithm, we gradually fuse the weighted average CvD, and the results are given in Table 3. When the weighted average CvD is fused once, C̄^1(D_1) has the largest value, 0.6435. Because 0.6435 is smaller than the threshold λ = 0.80, the patient's disease cannot yet be determined. When the weighted average CvD is fused twice, C̄^2(D_1) has the largest value, 0.8034. Because 0.8034 is larger than the threshold λ = 0.80, the patient is diagnosed with viral fever: D_1. When the weighted average CvD is fused 3 and 4 times, C̄^3(D_1) and C̄^4(D_1) have increasingly large values of 0.9011 and 0.9525, which better support decision-making. Finally, the patient is diagnosed as most likely suffering from viral fever: D_1. Consequently, the value for disease D_1 increases from 0.6435 to 0.8034 to 0.9011 and then to 0.9525, as shown in Figure 6.
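The rise in support across fusion rounds reflects a general property of Dempster-style self-combination: repeated combination concentrates mass on the best-supported hypothesis. A minimal real-valued illustration on singleton hypotheses (the classical rule, not the complex rule of [68], and with hypothetical support values rather than the paper's data):

```python
import numpy as np

def self_combine(p):
    """One round of Dempster's rule of a distribution with itself on singletons:
    p_j -> p_j^2 / sum_m p_m^2 (the normalization removes the conflict mass)."""
    p = np.asarray(p, dtype=float)
    q = p ** 2
    return q / q.sum()

p = np.array([0.45, 0.25, 0.20, 0.10])   # hypothetical averaged support for D1..D4
for k in range(4):
    p = self_combine(p)
    print(k + 1, np.round(p, 4))
# the leading hypothesis's support grows toward 1 with each fusion round
```

This mirrors the behavior in Table 3, where the support for D_1 increases monotonically with each additional fusion until it crosses the decision threshold λ.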
As a result, the proposed decision-making algorithm is effective in addressing medical diagnosis problems.

Conclusions
In this paper, a complex entropy, called Xiao entropy, is proposed to measure the uncertainty of complex-valued distributions (CvDs). The complex entropy is a generalized model of Gini entropy. Specifically, when the CvD turns into a probability distribution, the proposed entropy degrades into Gini entropy. Furthermore, we study the properties of complex entropy, including nonnegativity, maximum and minimum entropies, and boundedness.
Several numerical examples compare the proposed complex entropy with related works. The results illuminate the superiority of the proposed complex entropy. Based on the complex entropy, a multisource information fusion algorithm for decision-making is devised. Finally, we apply the decision-making algorithm in a medical diagnosis problem to validate its practicability. The main contributions are that this study provides a new perspective of complex-valued representation for uncertain information and that the newly defined complex entropy has a powerful capability to measure uncertainty. Additionally, it offers a promising application in decision theory. In future work, we intend to apply this complex entropy to handle more complex decision-making problems, such as the analysis and processing of images and physiological signals.

Table 2: The results in terms of entropy, information volume, and normalized information volume.

Data Availability

The data used to support the findings of this study are provided in the article.

Conflicts of Interest
The author states that there are no conflicts of interest.