A New Method to Handle Conflict when Combining Evidences Using Entropy Function and Evidence Angle with an Effective Application in Fault Diagnosis

Handling highly conflicting evidence when applying Dempster–Shafer theory remains one of the most important and difficult research directions in the field, and we have previously published two papers on the topic. In this paper, another novel method to handle conflict when combining evidence is proposed, in which three key tools, evidence distance, evidence angle, and an improved entropy function, are used to construct the final weight of each body of evidence. The proposed approach consists of three steps: first, the evidence distance and the evidence angle together determine an initial weight; second, the improved entropy is used to modify the initial weight and obtain the final weight; last, the classical D-S combination rule is applied to obtain the final fusion results. A classical numerical example and a real fault diagnosis application both demonstrate the method's effectiveness and efficiency, and comparisons with other currently popular methods, including two of our previous works, show that the new approach converges fast and removes most of the uncertainty in decision-making when fusing highly conflicting evidence.


Introduction
In practical applications, most information is collected by sensors. Because of the complexity of the target, the data provided by a single sensor may not be comprehensive enough to reflect the facts, so multiple sensors are needed to produce more data for fusion. However, the information derived from multiple sensors can be uncertain and sometimes even conflicting, which confuses decision-makers [1][2][3][4][5], so analyzing and handling this multisource uncertain information comprehensively deserves much attention. Dempster-Shafer theory of evidence (D-S theory) is a powerful information fusion tool for addressing problems of uncertain information in intelligent systems [6][7][8][9]. D-S theory was first presented by Dempster [1] in 1967 and then improved by Shafer [10], Dempster's student. D-S theory is in fact a generalization of traditional probability theory, and it has found more and more applications.
In this paper, another novel weighted evidence combination rule is presented. This time, the evidence angle is added to help construct a more reasonable weight for each BOE. The newly proposed approach is composed of three key steps: first, the evidence distance and the evidence angle together determine an initial weight; second, the improved entropy is used to modify the initial weight and obtain the final weight; last, the classical D-S combination rule is applied to obtain the final fusion results. A classical numerical example and a real fault diagnosis application both demonstrate the effectiveness and efficiency of our approach, and a comparison with other currently popular methods, including two of our previous works, shows that our approach converges fast and removes most of the uncertainty in decisions when handling highly conflicting evidence. The remainder of the paper is organized as follows: Section 2 presents preliminaries on D-S theory, evidence distance, evidence angle, and the improved belief entropy; Section 3 describes the proposed method in detail; Sections 4 and 5 give a numerical example and an application in fault diagnosis to demonstrate its effectiveness and efficiency; finally, a short conclusion is drawn.

Preliminaries
In this section, some preliminaries are briefly introduced.

Basics of Evidence Theory.
Dempster-Shafer theory of evidence (D-S theory) is an efficient mathematical model for coping with uncertain information in intelligent systems [1]. D-S theory was first proposed by Dempster in 1967 and then developed by Shafer in 1976 [10].
Let Ω = {θ_1, θ_2, ..., θ_n} be a nonempty finite set and 2^Ω be the set of all subsets of Ω, denoted 2^Ω = {∅, {θ_1}, {θ_2}, ..., {θ_1, θ_2}, ..., Ω}. In D-S theory [10], a basic probability assignment (BPA) is a mapping m: 2^Ω → [0, 1] that satisfies the following equations:

m(∅) = 0, Σ_{θ⊆Ω} m(θ) = 1.

If m(θ) > 0, θ is called a focal element, and the set of all focal elements is called a body of evidence (BOE). When there is more than one independent BOE, Dempster's combination rule, equation (3), can be used to combine the evidence:

m(θ) = (1/(1 − K)) Σ_{θ_1 ∩ θ_2 = θ} m_1(θ_1) m_2(θ_2), θ ≠ ∅,

where K = Σ_{θ_1 ∩ θ_2 = ∅} m_1(θ_1) m_2(θ_2) stands for the conflict degree, and 1/(1 − K) is the normalization constant. Note that the combination rule above makes sense only when the conflict degree K ≠ 1; otherwise, the rule is not meaningful. Here, we give a specific example of the combination rule and show the corresponding results in Table 1.
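As a runnable illustration of the rule above, consider the following minimal sketch. The two BPAs are hypothetical Zadeh-style masses chosen to exhibit high conflict; they are not the values from Table 1.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Classical Dempster rule for two BPAs, each a dict mapping
    frozenset focal elements to mass values."""
    raw = {}
    K = 0.0   # conflict degree: total mass on empty intersections
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            raw[inter] = raw.get(inter, 0.0) + wa * wb
        else:
            K += wa * wb
    if abs(1.0 - K) < 1e-12:
        raise ValueError("total conflict (K = 1): rule undefined")
    return {f: v / (1.0 - K) for f, v in raw.items()}

# hypothetical highly conflicting BPAs (Zadeh-style illustration)
m1 = {frozenset({"t1"}): 0.9, frozenset({"t2"}): 0.1}
m2 = {frozenset({"t2"}): 0.1, frozenset({"t3"}): 0.9}
fused = dempster_combine(m1, m2)
# all surviving belief lands on the weakly supported t2,
# the classic counter-intuitive behavior under high conflict
```

Note how, with K = 0.99, normalization concentrates all belief on the one focal element both sources barely support, which is exactly the pathology the rest of the paper addresses.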
Suppose that the frame of discernment Ω = {θ_1, θ_2, θ_3} is complete and two BOEs, m_1 and m_2, are given in this frame. When m_1 and m_2 are both reliable, a new BPA can be generated with the conjunctive rule [37], equation (5):

m(θ) = Σ_{θ_1 ∩ θ_2 = θ} m_1(θ_1) m_2(θ_2).

When only one of them is totally reliable and we are not sure which, the disjunctive combination rule, equation (6), should be applied to obtain a new BPA:

m(θ) = Σ_{θ_1 ∪ θ_2 = θ} m_1(θ_1) m_2(θ_2).
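The contrast between the two rules can be sketched as follows; the BPAs below are hypothetical illustrative values, not those of the paper's example.

```python
from itertools import product

def combine(m1, m2, mode):
    """Conjunctive (intersection) or disjunctive (union) combination of
    two BPAs; the conjunctive result is left unnormalized, so any mass
    landing on the empty set is kept under the key frozenset()."""
    out = {}
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        key = (a & b) if mode == "conjunctive" else (a | b)
        out[key] = out.get(key, 0.0) + wa * wb
    return out

# hypothetical BPAs over the frame {t1, t2}
m1 = {frozenset({"t1"}): 0.6, frozenset({"t1", "t2"}): 0.4}
m2 = {frozenset({"t2"}): 0.7, frozenset({"t1"}): 0.3}
conj = combine(m1, m2, "conjunctive")   # exposes conflict as mass on frozenset()
disj = combine(m1, m2, "disjunctive")   # commits belief only to unions
```

The conjunctive rule here is Smets's unnormalized variant (equation (5) without Dempster's 1/(1 − K) factor), so the conflict mass stays visible instead of being redistributed; the disjunctive rule never produces conflict mass at all.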
But the classical Dempster combination rule is not always applicable. When BOEs are in high conflict, anti-intuitive results are generated [35,[38][39][40]. So far, many researchers have studied this problem and proposed different kinds of solutions, which can be divided into two main directions: one is to preprocess the bodies of evidence (BOEs) [31,32], and the other is to modify Dempster's combination rule [33,34]. Smets's unnormalized combination rule [41], Dubois and Prade's disjunctive combination rule [42], and Yager's combination rule [31] belong to the second category; these three alternatives have been examined, and each proposes a general combination framework. As for the first category, Shafer originally used the coefficient k to measure the conflict degree between the evidences [2]; then, in 2000 [33], Murphy presented a simple averaging approach, where the arithmetic average of n evidences is calculated and then Dempster's combination rule is used for fusion. However, this idea is not fully reasonable in practice, because not all BOEs can be regarded as equally important. In 2004, Deng et al. [34] defined a dissimilarity measure to represent the conflict and proposed a weighted combination rule, where the weighted average of the masses is combined n − 1 times. Next, Han modified Deng's method using the Jousselme distance and information entropy [43]. Later on, we also proposed two ideas to cope with this critical problem [35,36].

Evidence Distance.
With D-S theory being widely applied, the study of evidence distance has attracted more and more interest [44,45]. The dissimilarity measure of evidence represents the lack of similarity between two BOEs and has been used in multitarget performance assessment, such as performance evaluation [46], reliability evaluation [47], conflicting evidence combination [34], and target association [48]. Many evidence distance measures have been proposed as appropriate measures of difference, and several definitions of distance in evidence theory exist, such as Jousselme's distance [49], Wen's cosine similarity [50], Smets's transferable belief model (TBM) global distance measure [48], and Sunberg's belief function distance metric [51]. Among these, the most frequently used is Jousselme's distance [49], which is defined on the basis of Cuzzolin's geometric interpretation of evidence theory [52]: the power set 2^Ω of the frame of discernment is regarded as a 2^N-dimensional linear space in which BPAs are particular vectors, and a distance is defined on it. Jousselme's distance is defined as

d(m_i, m_j) = sqrt((1/2) (m_i − m_j)^T D (m_i − m_j)),

where m_i, m_j are two BPAs under the frame of discernment Ω and D is a 2^N × 2^N matrix whose elements are defined as

D(A, B) = |A ∩ B| / |A ∪ B|, A, B ∈ 2^Ω.

Evidence Angle.
In this section, we describe the conflict or consistency degree between evidences from the perspective of geometry. Each BOE can be regarded as a spatial vector, and the angle between any two such vectors can be used to characterize the consistency between BOEs. First, we introduce the pignistic vector angle. For a frame of discernment Ω consisting of n elements, 2^Ω is the set of all subsets of Ω, denoted 2^Ω = {∅, {θ_1}, {θ_2}, ..., {θ_1, θ_2}, ..., Ω}.
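Before moving on, the Jousselme distance defined above can be sketched directly from its quadratic form; the two BOEs below are hypothetical, chosen so the expected distance is easy to verify by hand.

```python
import math
from itertools import combinations

def powerset(omega):
    """All non-empty subsets of the frame, as frozensets."""
    s = sorted(omega)
    return [frozenset(c) for r in range(1, len(s) + 1)
            for c in combinations(s, r)]

def jousselme_distance(m1, m2, omega):
    """d(m1, m2) = sqrt(0.5 * v^T D v) with v = m1 - m2 and
    D(A, B) = |A ∩ B| / |A ∪ B| over the non-empty subsets."""
    subs = powerset(omega)
    v = [m1.get(A, 0.0) - m2.get(A, 0.0) for A in subs]
    q = sum(v[i] * v[j] * len(subs[i] & subs[j]) / len(subs[i] | subs[j])
            for i in range(len(subs)) for j in range(len(subs)))
    return math.sqrt(0.5 * q)

# two hypothetical BOEs concentrated on different singletons
omega = {"t1", "t2", "t3"}
mA = {frozenset({"t1"}): 1.0}
mB = {frozenset({"t2"}): 1.0}
d = jousselme_distance(mA, mB, omega)   # disjoint singletons: distance 1.0
```

The ∅ row and column of D can be dropped because every BPA assigns zero mass to ∅, which also avoids the undefined ratio |∅ ∩ ∅| / |∅ ∪ ∅|.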
In order to correspond to the dimension of space vector, we need to transform the BPA vector into an n-dimensional vector, where the converted n-dimensional vector can be obtained from the pignistic probability function.
Suppose m is a BPA in the frame of discernment Ω; then the pignistic probability transformation BetP_m(θ_s) is defined by the following equation:

BetP_m(θ_s) = Σ_{A ⊆ Ω, θ_s ∈ A} m(A) / (|A| (1 − m(∅))),

where m(∅) ≠ 1 and |A| is the number of elements in A. Then, the cosine of the pignistic vector angle between two BOEs is used to measure their consistency degree, defined in equation (16):

cos(m_i, m_j) = (BetP_i · BetP_j) / (‖BetP_i‖ ‖BetP_j‖).

The larger the value of cos(m_i, m_j) is, the more consistent the two BOEs are.

Example 4.
Assume Ω = {θ_1, θ_2, θ_3}, and suppose the following two BOEs have been collected. We want to compute the consistency degree between these two BOEs.
First, apply equation (15) to transform the two BPAs into n-dimensional vectors. Then, measure the consistency degree cos(m_1, m_2) between BOE 1 and BOE 2 based on equation (16).
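The two steps of Example 4 can be sketched as follows; since the example's original BPA values did not survive extraction, the BOEs below are hypothetical.

```python
import math

def pignistic(m, omega):
    """BetP: spread each focal element's mass uniformly over its members
    (the BPAs here assign no mass to the empty set, so no extra
    normalization is needed)."""
    order = sorted(omega)
    bet = {x: 0.0 for x in order}
    for A, w in m.items():
        for x in A:
            bet[x] += w / len(A)
    return [bet[x] for x in order]

def evidence_angle_cos(m1, m2, omega):
    """Cosine of the pignistic vector angle between two BOEs."""
    p, q = pignistic(m1, omega), pignistic(m2, omega)
    dot = sum(a * b for a, b in zip(p, q))
    norm = math.sqrt(sum(a * a for a in p)) * math.sqrt(sum(b * b for b in q))
    return dot / norm

# hypothetical BOEs on a three-element frame
omega = {"t1", "t2", "t3"}
m1 = {frozenset({"t1"}): 0.8, frozenset({"t1", "t2"}): 0.2}
m2 = {frozenset({"t1"}): 0.6, frozenset({"t3"}): 0.4}
c = evidence_angle_cos(m1, m2, omega)   # partial agreement: 0 < c < 1
```

Identical BOEs give a cosine of 1, and BOEs whose pignistic vectors share no support give 0, matching the interpretation in the text.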
But if a BPA is given, its uncertainty cannot be measured with the other main entropies listed in Table 3.
For this reason, the Deng entropy [66] was presented to measure the uncertainty of a BPA; it is a more powerful tool for managing uncertainty than the Shannon entropy [59]. The Deng entropy can deal with uncertainty represented not only by a BPA but also by a probability distribution; in other words, the Deng entropy is a generalization of the Shannon entropy [6,[67][68][69].
The Deng entropy can be denoted as follows:

E_d(m) = − Σ_i m(F_i) log2( m(F_i) / (2^{|F_i|} − 1) ),

where F_i is a proposition in the mass function m and |F_i| is the cardinality of F_i. In particular, if belief is assigned only to single elements, the Deng entropy reduces to the Shannon entropy [66]:

E_d(m) = − Σ_i m(θ_i) log2 m(θ_i).

However, the Deng entropy has a significant shortcoming: it cannot effectively quantify the difference between different BOEs that are assigned the same mass values.
For example, consider two BOEs, BOE 1 and BOE 2. The uncertainty measures with the Deng entropy are, respectively, E_d(BOE 1) = 2.5559 and E_d(BOE 2) = 2.5559. This result is counter-intuitive: although the two BOEs have the same mass values, BOE 1 covers four possible targets, denoted e_1, e_2, e_3, and e_4, whereas BOE 2 has only three possible targets, so intuitively BOE 2 should have less uncertainty than BOE 1; that is, the entropy value of BOE 1 should be larger than that of BOE 2. The Deng entropy cannot quantify this difference, and we therefore propose an improved entropy function that handles it well. The improved belief entropy is defined as follows:

E_Id(m) = − Σ_i m(θ_i) log2( (m(θ_i) / (2^{|θ_i|} − 1)) · (|θ_i| / |θ|) ),

where |θ_i| denotes the cardinality of the focal element θ_i, |θ| is the total number of elements in the frame of this BOE, and the factor |θ_i|/|θ| represents the uncertain information in a BOE that is ignored by the Deng entropy. For the two BOEs above, calculating the uncertainty with the newly proposed entropy function (one of the terms, for instance, is −0.6 log2((0.6/(2^2 − 1)) · (2/3))) yields 3.1409 and a distinct value for the other BOE. So this improved belief entropy can effectively quantify the difference even when the same mass values are assigned in different BOEs.
An example comparing the Deng entropy and the improved entropy function is given, with the results shown in Table 4.
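Since the example's original BPA values did not survive extraction, the following sketch uses a hypothetical BOE (full mass on a two-element focal set) evaluated against frames of four and three possible targets; the Deng entropy is blind to the frame size, while the improved entropy, as reconstructed from the |θ_i|/|θ| factor in the text, separates the two cases.

```python
import math

def deng_entropy(m):
    """Deng entropy: E_d(m) = -sum m(F) log2(m(F) / (2^|F| - 1))."""
    return -sum(w * math.log2(w / (2 ** len(F) - 1))
                for F, w in m.items() if w > 0)

def improved_entropy(m, frame_size):
    """Improved belief entropy: each term also carries the factor
    |F| / frame_size, injecting the scale information about the whole
    frame that the Deng entropy ignores."""
    return -sum(w * math.log2(w / (2 ** len(F) - 1) * len(F) / frame_size)
                for F, w in m.items() if w > 0)

# same focal element {a, b} with full mass, but frames with four
# and three possible targets respectively (hypothetical BOEs)
boe1 = {frozenset({"a", "b"}): 1.0}
e_deng = deng_entropy(boe1)      # log2(3), identical for both frames
e1 = improved_entropy(boe1, 4)   # larger frame, more uncertainty
e2 = improved_entropy(boe1, 3)   # smaller frame, less uncertainty
```

On singletons the Deng entropy collapses to the Shannon entropy, as the text states, which gives a quick sanity check of the implementation.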

The Proposed Method
The newly proposed method follows our previous works [35,36]. Suppose n independent BOEs m_i (i = 1, ..., n) have been collected. The first step is to preprocess these BOEs through the following equation:

WAM(m) = Σ_{i=1}^{n} w_i m_i,

where w_i is the corresponding weight of BOE m_i and WAM(m), short for weighted average mass, represents the weighted average BPA of the n independent BOEs.

Mathematical Problems in Engineering
Referring to Murphy's work [33], we combine WAM(m) for n − 1 times using the classical Dempster rule and then get the final combined results. However, finding an appropriate weight w_i is not easy. There have been many works on computing the weight of a BOE, such as Deng's work [34]. Based on what has been done before, another novel idea is put forward. First, we calculate an initial weight for each BOE based on both the evidence distance and the evidence angle: the evidence distance represents the dissimilarity between evidences, whereas the evidence angle describes the inconsistency among evidences. Then, our previously proposed improved entropy function is used to characterize the uncertainty of each BOE and further modify the initial weight, in order to get a more accurate and reasonable weight. The details of the newly proposed approach are given below.

Determining the Initial Weight Using the Pignistic Vector Angle and Jousselme Distance.
At the beginning, apply equations (11) and (16) to calculate, respectively, the evidence distance d(m_i, m_j) and the cosine of the evidence angle cos(m_i, m_j) between any two BOEs m_i, m_j (i, j = 1, 2, ..., n).
Following our previous works and other popular papers in evidence theory, the conflict degree can be used to weight the evidence. Here, both the evidence distance and the evidence angle are used to characterize the degree of conflict between the evidences, so as to capture the two main aspects that affect evidence conflict. The evidence distance describes the dissimilarity between the evidences, whereas the evidence angle represents the inconsistency among them; the two measures are complementary in a sense. From the introduction of the evidence distance and the evidence angle in Section 2, the smaller the distance between two BOEs is, the more similar they are, and the larger the cosine of the evidence angle is, the more consistent the two BOEs are. Therefore, we construct the similarity measure sim(m_i, m_j) between m_i and m_j from these two quantities, as given in equation (27). Then, the support degree of a BOE m_i (i = 1, 2, ..., n) is defined [34] based on the similarity measure mentioned above:

sup(m_i) = Σ_{j=1, j≠i}^{n} sim(m_i, m_j).

After normalizing the support degree of each BOE, we get its initial weight iw(m_i), which is determined jointly by the evidence distance and the evidence angle, as shown in the following equation:

iw(m_i) = sup(m_i) / Σ_{j=1}^{n} sup(m_j).

As can be seen, this initial weight satisfies Σ_{i=1}^{n} iw(m_i) = 1. The next step is to modify iw(m_i) by means of our previously proposed improved belief entropy [36].
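The initial-weight step can be sketched as follows. Since the paper's exact equation (27) did not survive extraction, the sketch assumes the natural combination sim(m_i, m_j) = cos(m_i, m_j) · (1 − d(m_i, m_j)); the pairwise distance and cosine matrices are hypothetical inputs.

```python
def initial_weights(dist, cosv):
    """Initial weights from pairwise Jousselme distances (dist) and
    pignistic-angle cosines (cosv), both n x n nested lists.
    sim = cos * (1 - d) is an assumption standing in for equation (27)."""
    n = len(dist)
    sim = [[cosv[i][j] * (1.0 - dist[i][j]) for j in range(n)]
           for i in range(n)]
    # support degree: sum of similarities to all other BOEs (equation (28))
    sup = [sum(sim[i][j] for j in range(n) if j != i) for i in range(n)]
    total = sum(sup)
    return [s / total for s in sup]   # normalized: the iw(m_i) sum to 1

# hypothetical matrices: BOEs 1 and 2 agree, BOE 3 conflicts with both
dist = [[0.0, 0.1, 0.8],
        [0.1, 0.0, 0.7],
        [0.8, 0.7, 0.0]]
cosv = [[1.0, 0.95, 0.2],
        [0.95, 1.0, 0.3],
        [0.2, 0.3, 1.0]]
iw = initial_weights(dist, cosv)   # the conflicting BOE gets least weight
```

Any similarity that decreases with the distance and increases with the cosine would preserve the ranking of the weights; only the exact magnitudes depend on the assumed form.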

Computing the Final Weight on the Basis of the Improved Entropy Function.
Intuitively, a BOE m_i with a higher support degree sup(m_i) or initial weight iw(m_i) should have less uncertainty; likewise, a BOE with more uncertainty must have a lower sup(m_i) or iw(m_i). On this basis, we modify the initial weight iw(m_i) through the following steps. Step 1: use our previously proposed improved entropy function, equation (24), to get the uncertainty degree u(m_i) of each BOE m_i (i = 1, ..., n), and then normalize u(m_i) with equation (30) to obtain the normalized uncertainty measure un(m_i) (i = 1, ..., n); the remaining steps use un(m_i) to modify iw(m_i) through equations (31) and (32) and produce the final weight fw(m_i).
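The whole weighted-average-then-combine pipeline can be sketched end to end. The modification fw_i proportional to iw_i · (1 − un_i) is an assumed stand-in for the paper's equations (31) and (32), which did not survive extraction, and the input BOEs, initial weights, and uncertainties below are hypothetical.

```python
from itertools import product

def dempster(m1, m2):
    """Classical Dempster rule for two BPAs (frozenset -> mass)."""
    raw, K = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        if a & b:
            raw[a & b] = raw.get(a & b, 0.0) + wa * wb
        else:
            K += wa * wb
    return {f: v / (1.0 - K) for f, v in raw.items()}

def fuse(boes, iw, u):
    """Modify the initial weights with the (normalized) uncertainties,
    build the weighted average mass WAM(m), and combine it n - 1 times.
    fw_i ~ iw_i * (1 - un_i) is an assumption, not the paper's exact rule."""
    total_u = sum(u)
    un = [x / total_u for x in u]                  # normalized uncertainty
    fw = [w * (1.0 - n) for w, n in zip(iw, un)]   # penalize uncertain BOEs
    total_w = sum(fw)
    fw = [w / total_w for w in fw]                 # final weights sum to 1
    wam = {}                                       # weighted average mass
    for w, m in zip(fw, boes):
        for F, v in m.items():
            wam[F] = wam.get(F, 0.0) + w * v
    fused = wam
    for _ in range(len(boes) - 1):                 # combine WAM(m) n-1 times
        fused = dempster(fused, wam)
    return fused

# three hypothetical BOEs: the first two support t1, the third conflicts
t1, t2 = frozenset({"t1"}), frozenset({"t2"})
boes = [{t1: 0.9, t2: 0.1}, {t1: 0.8, t2: 0.2}, {t1: 0.1, t2: 0.9}]
iw = [0.45, 0.45, 0.10]     # initial weights (e.g., from distance and angle)
u = [1.0, 1.0, 1.5]         # improved-entropy uncertainties (hypothetical)
result = fuse(boes, iw, u)  # belief concentrates on t1
```

Because the conflicting third BOE receives both a small initial weight and a larger uncertainty, its influence on WAM(m) is doubly suppressed, which is the intended effect of the two-stage weighting.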

Experiment
In this section, a classic numerical example is provided to show how to use the newly proposed method step by step and, meanwhile, to demonstrate its efficiency and effectiveness.
Example 6. In a multisensor-based automatic target recognition system, suppose that the frame of discernment Ω = {θ_1, θ_2, θ_3} is complete and θ_1 is the real target. The system collects the following five BOEs from five different sensors:

(34)
Before computing the similarity measure sim(m_i, m_j) (i, j = 1, 2, ..., n) between m_i and m_j, we need to calculate the evidence distance and the cosine of the evidence angle between the BOEs. Tables 5 and 6 show, respectively, the results of d(m_i, m_j) and cos(m_i, m_j) for different numbers of BOEs. Then, the similarity degrees sim(m_i, m_j) between any two BOEs are obtained by means of equation (27); the corresponding results are shown in Table 7.
Second, based on the calculated similarity degrees and equation (28), the support degree sup(m_i) (i = 1, 2, ..., n) of each BOE is acquired, as listed in Table 8. Then, normalizing the support measures sup(m_i), we get the initial weights iw(m_i) under the different conditions, as listed in Table 9.
Next, measure the uncertainty degree u(m_i) (i = 1, ..., n) of each BOE with the improved entropy function and then normalize u(m_i) to get un(m_i) by virtue of equations (29) and (30). Tables 10 and 11 show the uncertainty measures for the specific numbers of BOEs.
With the uncertainty degree of each BOE in hand, we apply equations (31) and (32) to modify the initial weight iw(m_i), and the final weight fw(m_i) (i = 1, ..., n) can then be calculated. The results for fw(m_i) are shown in Table 12.
At last, compute the weighted average BOE WAM(m) listed in Table 13 and apply the classical Dempster combination rule [1] to fuse WAM(m) for n − 1 times (n is the total number of BOEs) [33]; we then get the final combined results, shown in Table 14.
To demonstrate the efficiency and effectiveness of the newly proposed method, we have made a clear comparison with other currently popular combination rules on this numerical example. The comparison results are all shown in Table 15 and Figures 1-4.
As shown clearly in Table 15 and Figures 1-4, the classical Dempster combination rule [1] cannot handle the fusion of conflicting evidence well and produces counter-intuitive combination results. With more BOEs, Murphy's simple averaging [33], Deng et al.'s weighted averaging [34], and our previous work all obtain relatively reasonable results, but all of them are inferior to our newly proposed method. Most importantly, when only a limited number of BOEs is available, the new method gives more convincing results to decision-makers. Moreover, compared with these currently popular evidence combination rules, ours converges best. This is because the evidence angle and the evidence distance together characterize the relation between any two BOEs better, and the improved entropy function helps modify the initial weight more efficiently and reasonably, so the effect of "good" evidence is strengthened greatly and the effect of "bad" evidence is weakened largely in the final combined results.

Application in Fault Diagnosis
Similar to our previous works [35,36], the newly proposed approach is also applied to the fault diagnosis area, and the example used here is cited from our previous work [35,36]. Suppose there is a machine with three gears G_1, G_2, and G_3, whose corresponding failure modes F_1, F_2, and F_3 are collected as the fault hypothesis set {F_1, F_2, F_3}. Besides, there are three different sensors, s_1, s_2, and s_3, from which the evidence set m_1, m_2, m_3 shown in Table 16 comes.
Here, two kinds of sensor reliability are again considered. One is the static reliability R^s_i = μ_i × ν_i, measured by the evidence sufficiency suf(μ) and the importance index imp(ν); the other is the dynamic reliability R^d_i, measured by the final weight fw(m_i) newly proposed in this work. The final comprehensive reliability R = R^s × R^d is used to correct the highly conflicting evidences, and after combining those evidences, a result with higher accuracy and more belief is obtained.
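A minimal sketch of the comprehensive-reliability step follows; the static reliabilities and final weights are hypothetical, and renormalizing the product so the reliabilities sum to 1 is a simplifying assumption of this sketch rather than the paper's stated procedure.

```python
def comprehensive_reliability(r_static, fw):
    """Comprehensive reliability R_i = R_s_i * R_d_i, taking the final
    weights fw as the dynamic reliabilities and renormalizing so the
    result can be used directly as a weighting of the BOEs."""
    r = [s * d for s, d in zip(r_static, fw)]
    total = sum(r)
    return [x / total for x in r]

# hypothetical static reliabilities (mu_i * nu_i) and final weights
# for the three sensors s1, s2, s3
R = comprehensive_reliability([0.9, 0.7, 0.8], [0.5, 0.2, 0.3])
```

The product form means a sensor must be trustworthy both a priori (static) and on this particular reading (dynamic) to retain influence; either factor alone being small suppresses it.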
At the beginning, let us compute the static reliability R^s of each BOE using the formula R^s_i = μ_i × ν_i; the corresponding results are shown in Table 17. The next step is to compute the dynamic reliability R^d of each BOE (BOE 1, BOE 2, and BOE 3); that is, we need to get the final weight fw(m_i) (i = 1, 2, 3) by means of our newly proposed method in Section 3.
To get fw(m_i), we first need to calculate the evidence distance d(m_i, m_j), the cosine of the evidence angle cos(m_i, m_j), and the similarity measure sim(m_i, m_j) between any two BOEs m_i, m_j (i, j = 1, 2, 3). With sim(m_i, m_j) in hand, the support degree sup(m_i) and the initial weight iw(m_i) of each BOE m_i (i = 1, 2, 3) can be obtained. Tables 18 to 21 show the results of the indexes mentioned above.
Next, based on the improved entropy function, the uncertainty degree u(m_i) and the normalized uncertainty measure un(m_i) of each BOE are computed, as shown in Table 22. Finally, equations (31) and (32) are used to obtain the final weights fw(m_i), which serve as the dynamic reliabilities; the results are listed in Table 24.
At last, the BPAs are modified with the comprehensive sensor reliability R to get WAM(m), as shown in Table 25, and then the classical combination rule is used to combine WAM(m) two times; the final results are listed in Table 26. As shown in Table 26, our newly proposed method assigns the fault mode F_1 95.42% of the total belief, while the fault mode F_2 owns only 4.84%. Notably, the uncertainty of belief m(Ω) has been reduced to 0.0012. In a word, this latest approach can provide quite a precise combined result for decision-makers. The newly proposed method is also compared with others, including two of our previous works, and the related comparison results are shown in Table 27 and Figure 5. As seen from Table 27 and Figure 5, the latest method yields a significantly higher belief degree for fault mode F_1 and less uncertainty of belief than the methods of Fan and Zuo [70] and our previous works [35,36]. This is because the three efficient tools, evidence distance, evidence angle, and improved entropy, compute the dynamic reliability of each sensor more reasonably and comprehensively and reduce the conflicts among BOEs to the maximum extent.
Figure 4: The combination results using different combination rules under the condition of five BOEs.

Conclusion
In this paper, a new method to handle conflict when combining evidence is presented. Compared with our previous works [35,36], the evidence angle is added in this new work in order to describe the consistency degree between the evidences. The newly proposed approach consists of three steps: first, the evidence distance and the evidence angle together determine the initial weight of each BOE; second, the improved entropy is used to modify the initial weight; finally, the classical D-S combination rule is applied to get the final fusion results. Moreover, a numerical example and a fault diagnosis application sufficiently demonstrate the efficiency and effectiveness of the new method, and the related comparison results show that the proposed approach converges fast and removes most of the uncertainty of decision-making when handling highly conflicting evidences, so it can help experts make better and faster decisions.

Data Availability
All the data used in this study are available within the manuscript.

Conflicts of Interest
The authors declare that there are no conflicts of interest regarding the publication of this paper.