
In the present communication, a parametric (

The concept of intuitionistic fuzzy set (IFS) (Atanassov) [

Pythagorean fuzzy set (PFS), proposed by Yager [

Various researchers theoretically developed the concept of Yager’s Pythagorean fuzzy sets [

Using PFSs, Ren et al. [

Bajaj et al. [

In this communication, we have proposed a new

In this section, we recall some fundamental concepts connected with Pythagorean fuzzy sets that are well known in the literature.

An intuitionistic fuzzy set (IFS)

A Pythagorean fuzzy set (PFS)

In case of PFS, the restriction corresponding to the degree of membership

IFS versus PFS.
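The constraint relaxation described above can be illustrated with a short check. This is a minimal sketch, assuming membership and non-membership grades mu and nu in [0, 1]; the function names are ours, not the paper's:

```python
def is_ifs_pair(mu: float, nu: float) -> bool:
    """Intuitionistic fuzzy constraint: mu + nu <= 1."""
    return 0.0 <= mu <= 1.0 and 0.0 <= nu <= 1.0 and mu + nu <= 1.0

def is_pfs_pair(mu: float, nu: float) -> bool:
    """Pythagorean fuzzy constraint: mu**2 + nu**2 <= 1."""
    return 0.0 <= mu <= 1.0 and 0.0 <= nu <= 1.0 and mu**2 + nu**2 <= 1.0

# (0.8, 0.5) is not an admissible IFS pair (0.8 + 0.5 > 1) but is an
# admissible PFS pair (0.64 + 0.25 <= 1); every IFS pair is also a PFS pair.
```

This makes concrete why PFSs give decision makers a wider space of admissible membership grades than IFSs.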

Some important binary operations on PFSs, available in the literature, are presented below.

If

Let

Let

Let

The most important property of this measure is that when

Based on the axiomatic definition of entropy for intuitionistic fuzzy sets, proposed by Hung and Yang (2006) [

In the context of Pythagorean fuzzy information, we propose the following Pythagorean fuzzy entropy, analogous to measure (

The proposed entropy measure

To prove this, we shall show that it satisfies all the axioms PFS1 to PFS4.

Since

Either

These three cases imply that

In Section

Analytically, we prove the concavity of the

It may be noted that

and

because if

Now, since

Therefore, by the result explained above, we conclude that

Similarly, if

Hence,

Let

Divide

For any Pythagorean fuzzy set

The proof follows directly from the definition.

In this section, we carry out an empirical study investigating the maximality and monotonic behavior of the proposed

Values of entropy for different values of


1 | 0.2 | 254.676 | 218.665 | 184.776 | 194.201 | 54.000 | 46.744 | 39.669 | 41.643 | 20.000 | 17.381 | 14.767 | 15.498 | 16.960 | 14.743 | 12.500 | 13.123 | 16.346 | 14.208 | 12.040 | 12.637 | 16.186 | 14.068 | 11.919 | 12.511 |

2 | 0.3 | 147.744 | 127.021 | 107.410 | 112.867 | 21.030 | 18.328 | 15.608 | 16.369 | 5.134 | 4.526 | 3.855 | 4.044 | 4.068 | 3.590 | 3.017 | 3.171 | 3.864 | 3.408 | 2.854 | 2.996 | 3.811 | 3.360 | 2.811 | 2.951 |

3 | 0.5 | 107.670 | 92.610 | 78.329 | 82.303 | 12.000 | 10.497 | 8.952 | 9.385 | 2.000 | 1.785 | 1.495 | 1.577 | 1.464 | 1.309 | 1.030 | 1.096 | 1.366 | 1.218 | 0.943 | 0.998 | 1.341 | 1.193 | 0.920 | 0.973 |

4 | 0.8 | 93.073 | 80.064 | 67.717 | 71.154 | 9.340 | 8.180 | 6.972 | 7.311 | 1.264 | 1.134 | 0.906 | 0.970 | 0.877 | 0.790 | 0.534 | 0.584 | 0.809 | 0.722 | 0.468 | 0.503 | 0.792 | 0.701 | 0.451 | 0.483 |

5 | 1.2 | 86.508 | 74.420 | 62.939 | 66.134 | 8.263 | 7.241 | 6.162 | 6.464 | 1.004 | 0.902 | 0.673 | 0.736 | 0.675 | 0.611 | 0.335 | 0.375 | 0.618 | 0.549 | 0.281 | 0.301 | 0.604 | 0.528 | 0.267 | 0.283 |

6 | 1.7 | 83.052 | 71.448 | 60.420 | 63.489 | 7.727 | 6.773 | 5.753 | 6.038 | 0.884 | 0.796 | 0.552 | 0.614 | 0.584 | 0.529 | 0.236 | 0.261 | 0.532 | 0.467 | 0.192 | 0.192 | 0.519 | 0.444 | 0.179 | 0.176 |

7 | 2.5 | 80.576 | 69.319 | 58.613 | 61.592 | 7.356 | 6.449 | 5.467 | 5.740 | 0.805 | 0.726 | 0.468 | 0.525 | 0.524 | 0.474 | 0.177 | 0.178 | 0.476 | 0.406 | 0.141 | 0.115 | 0.464 | 0.381 | 0.128 | 0.101 |

8 | 5 | 78.100 | 67.190 | 56.806 | 59.692 | 6.996 | 6.133 | 5.188 | 5.446 | 0.731 | 0.659 | 0.393 | 0.433 | 0.469 | 0.413 | 0.135 | 0.103 | 0.424 | 0.327 | 0.102 | 0.050 | 0.413 | 0.296 | 0.087 | 0.037 |

9 | 7 | 77.420 | 66.604 | 56.309 | 59.170 | 6.899 | 6.048 | 5.113 | 5.366 | 0.712 | 0.640 | 0.374 | 0.409 | 0.455 | 0.391 | 0.125 | 0.087 | 0.411 | 0.294 | 0.091 | 0.033 | 0.400 | 0.261 | 0.073 | 0.021 |

10 | 10 | 76.917 | 66.172 | 55.942 | 58.784 | 6.828 | 5.985 | 5.058 | 5.308 | 0.698 | 0.625 | 0.361 | 0.392 | 0.444 | 0.370 | 0.117 | 0.074 | 0.401 | 0.263 | 0.081 | 0.019 | 0.391 | 0.231 | 0.060 | 0.009 |

11 | 20 | 76.339 | 65.674 | 55.520 | 58.340 | 6.746 | 5.912 | 4.995 | 5.241 | 0.682 | 0.605 | 0.346 | 0.372 | 0.433 | 0.340 | 0.107 | 0.059 | 0.390 | 0.224 | 0.067 | 0.005 | 0.380 | 0.203 | 0.035 | 0.001 |

12 | 40 | 76.053 | 65.427 | 55.311 | 58.120 | 6.706 | 5.877 | 4.963 | 5.208 | 0.674 | 0.595 | 0.338 | 0.362 | 0.427 | 0.325 | 0.100 | 0.053 | 0.385 | 0.213 | 0.052 | 0.003 | 0.375 | 0.199 | 0.013 | 0.000 |

13 | 50 | 75.996 | 65.378 | 55.270 | 58.076 | 6.698 | 5.869 | 4.957 | 5.201 | 0.673 | 0.593 | 0.337 | 0.361 | 0.426 | 0.323 | 0.098 | 0.052 | 0.384 | 0.211 | 0.048 | 0.002 | 0.374 | 0.199 | 0.009 | 0.000 |

14 | 70 | 75.931 | 65.322 | 55.222 | 58.026 | 6.689 | 5.861 | 4.950 | 5.194 | 0.671 | 0.591 | 0.335 | 0.358 | 0.425 | 0.320 | 0.096 | 0.050 | 0.383 | 0.210 | 0.044 | 0.002 | 0.372 | 0.199 | 0.005 | 0.000 |

15 | 100 | 75.882 | 65.280 | 55.187 | 57.989 | 6.682 | 5.855 | 4.945 | 5.188 | 0.670 | 0.589 | 0.333 | 0.357 | 0.424 | 0.317 | 0.094 | 0.050 | 0.382 | 0.209 | 0.041 | 0.002 | 0.372 | 0.198 | 0.003 | 0.000 |

16 | 200 | 75.826 | 65.231 | 55.145 | 57.945 | 6.675 | 5.848 | 4.939 | 5.182 | 0.668 | 0.587 | 0.332 | 0.355 | 0.423 | 0.315 | 0.092 | 0.049 | 0.381 | 0.208 | 0.037 | 0.002 | 0.371 | 0.198 | 0.002 | 0.000 |

17 | 500 | 75.792 | 65.202 | 55.120 | 57.919 | 6.670 | 5.844 | 4.935 | 5.178 | 0.667 | 0.586 | 0.331 | 0.354 | 0.422 | 0.314 | 0.091 | 0.048 | 0.380 | 0.208 | 0.036 | 0.002 | 0.370 | 0.198 | 0.001 | 0.000 |

Monotonicity of the

Suppose that there is a set of

When the criteria weights are completely unknown, we calculate them by using the proposed PFS entropy as
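The paper's parametric entropy formula is not reproduced here, so as an illustration only, the following sketch uses the standard entropy-weight scheme w_j = (1 − E_j) / Σ_k (1 − E_k), with hypothetical entropy values:

```python
def entropy_weights(entropies):
    """Convert per-criterion entropies in [0, 1] into weights:
    a criterion with lower entropy carries more information and
    therefore receives a larger weight."""
    divergences = [1.0 - e for e in entropies]
    total = sum(divergences)
    return [d / total for d in divergences]

weights = entropy_weights([0.4, 0.7, 0.5])  # hypothetical entropy values
```

The resulting weights sum to one, and the criterion with the smallest entropy receives the largest weight.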

In case the weights are partially known for a multiple-criteria decision-making problem, we use the minimum entropy principle (Wang and Wang [

The overall entropy of the alternative

Since the alternatives compete in a fair environment, the weight coefficients with respect to the same criterion should also be equal. Further, in order to obtain the ideal weights, we construct the following model:
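As a hedged illustration of the minimum entropy principle (assuming the model reduces, as is typical, to minimizing the overall weighted entropy Σ_j w_j E_j subject to the partial weight bounds and Σ_j w_j = 1), this box-constrained linear program can be solved greedily; the entropy values and bounds below are hypothetical:

```python
def minimum_entropy_weights(entropies, bounds):
    """Greedy solver for: minimize sum_j w_j * E_j subject to
    lo_j <= w_j <= hi_j and sum_j w_j = 1. Start every weight at its
    lower bound, then allocate the remaining mass to the criteria
    with the smallest entropy first (optimal for this LP class)."""
    w = [lo for lo, hi in bounds]
    remaining = 1.0 - sum(w)
    for j in sorted(range(len(entropies)), key=lambda k: entropies[k]):
        take = min(bounds[j][1] - w[j], remaining)
        w[j] += take
        remaining -= take
    return w

# Hypothetical entropies and partially known weight bounds:
w = minimum_entropy_weights([0.52, 0.38, 0.61],
                            [(0.2, 0.5), (0.1, 0.6), (0.2, 0.4)])
```

A general-purpose LP solver (as the paper uses via MATLAB) would return the same solution; the greedy form merely makes the mechanics visible.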

Finally, the procedure for implementing the proposed algorithm is presented in Figure

Flowchart of the proposed algorithm using PFS.

The steps of the proposed methodology are enumerated and detailed as follows.

We construct the decision matrix

Determine the criteria weights by using (

Define the most preferred solution

By using Definition

Determine the relative degrees of closeness

On the basis of the relative degree of closeness obtained in Step
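The steps above can be sketched end to end. This is an assumption-laden illustration: it uses a plain weighted Euclidean distance on the (mu, nu) pairs as a stand-in for the paper's Pythagorean fuzzy distance measure, and hypothetical data throughout:

```python
import math

def distance(row, ref, weights):
    """Weighted Euclidean distance between an alternative and a reference
    solution, each a list of (mu, nu) pairs, one pair per criterion."""
    return math.sqrt(sum(
        w * ((a[0] - r[0]) ** 2 + (a[1] - r[1]) ** 2)
        for w, a, r in zip(weights, row, ref)))

def rank(matrix, weights, ideal, anti_ideal):
    """Relative closeness C_i = d_i^- / (d_i^+ + d_i^-); a larger C_i
    means the alternative sits closer to the ideal solution."""
    closeness = [distance(row, anti_ideal, weights) /
                 (distance(row, ideal, weights) +
                  distance(row, anti_ideal, weights))
                 for row in matrix]
    order = sorted(range(len(matrix)), key=lambda i: closeness[i],
                   reverse=True)
    return order, closeness

# Hypothetical two-criteria example with two alternatives:
ideal = [(0.9, 0.1), (0.9, 0.1)]
anti_ideal = [(0.1, 0.9), (0.1, 0.9)]
matrix = [[(0.8, 0.2), (0.7, 0.3)],   # alternative A1
          [(0.3, 0.7), (0.4, 0.6)]]   # alternative A2
order, closeness = rank(matrix, [0.5, 0.5], ideal, anti_ideal)
```

Here A1, whose grades lie nearer the ideal solution, is ranked first, mirroring the ranking step of the methodology.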

Corresponding to the two cases considered in the proposed algorithm, we present two examples as follows.

Suppose an automobile company produces four different cars, say,

Calculate the criteria weight vector using (

The most preferred solution

and

respectively.

The distances between each of

The values of relative degree of closeness are as follows:

The ranking of the alternatives as per the relative degree of closeness is

The consistency of the ranking procedure for different values of parameters

Suppose there are 1000 students in a college. On the basis of three selected criteria, say,

Let the information about the criteria weight be partially given in the following form

Using (

By solving this linear programming problem using MATLAB, we obtain the criteria weight vector as follows:

The most preferred solution

The distances between each of

The values of relative degree of closeness are

The ranking of the alternatives as per the relative degree of closeness is

The consistency of the ranking procedure for different values of the parameters

We have proposed a new parametric

In the area of pattern recognition, directed and symmetric divergence measures quantify the dissimilarity between pairs of probability distributions and are widely used in statistical inference. It may be noted that divergence measures and similarity measures are dual concepts: a similarity measure may be defined as a decreasing function of a divergence measure, particularly when the range of the divergence measure is
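The duality noted above can be made concrete: when a divergence measure D takes values in [0, 1], any strictly decreasing transform of D yields a similarity measure, the simplest being S = 1 − D. A sketch, not the paper's specific construction:

```python
def similarity_from_divergence(d: float) -> float:
    """Dual similarity measure S = 1 - D for a divergence D in [0, 1]."""
    if not 0.0 <= d <= 1.0:
        raise ValueError("divergence must lie in [0, 1]")
    return 1.0 - d
```

Identical objects (zero divergence) get similarity one, and maximally divergent objects get similarity zero, as the duality requires.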

The proposed parametric

The data used for the implementation of the proposed algorithm in the numerical examples are hypothetical and have no connection with any particular agency's data.

This article does not contain any studies with human participants or animals performed by any of the authors.

The authors declare that they have no conflicts of interest.