Multigranulation rough set is an extension of the classical rough set, and the optimistic and pessimistic multigranulation rough sets are two special cases of it. The β multigranulation rough set is a more generalized multigranulation rough set. In this paper, we first introduce fuzzy rough theory into the β multigranulation rough set to construct a β multigranulation fuzzy rough set, which can be used to deal with continuous data; then some of its properties are discussed. Reduction is an important issue for multigranulation rough sets, and an algorithm of granular space reduction for the β multigranulation fuzzy rough set that preserves the positive region is proposed. To test the algorithm, experiments are conducted on five UCI data sets with different values of β. The results show the effectiveness of the proposed algorithm.
1. Introduction
Qian et al. [1–3] proposed the multigranulation rough set, which is constructed on a family of granular structures and is different from Pawlak’s rough set [4–7]. Qian’s multigranulation rough set can be used to approximate an unknown concept through a family of binary relations; each binary relation can generate a granulation space, which may be a partition [1], a covering [8, 9], or even a neighborhood system [10–12] on the universe of discourse.
Qian’s rough set includes two basic models: the optimistic multigranulation rough set and the pessimistic multigranulation rough set. The word “optimistic” means that at least one of the granulation spaces can be used for approximating, while the word “pessimistic” means that all of the granulation spaces should be used for approximating. In these two models all of the binary relations, or granulation spaces, are used simultaneously, and the optimistic and pessimistic models are two special cases of multigranulation rough set. To get a model more suitable for practical applications, Xu et al. [13] proposed a more generalized multigranulation rough set, called the β multigranulation rough set, which uses a threshold β to control the number of equivalence classes that are contained in the target.
In recent years, the multigranulation approach has attracted many researchers’ attention [14–18]. Xu et al. generalized multigranulation fuzzy rough sets to a tolerance approximation space to construct optimistic and pessimistic multigranulation fuzzy rough set models [14]. Qian et al. further generalized their optimistic multigranulation rough set to incomplete information systems [15]. In [16], Yang et al. introduced fuzzy theory into the multigranulation rough set, employing T-similarity relations (reflexive, symmetric, and T-transitive) to construct multigranulation fuzzy rough sets. Generalizing the multigranulation rough set is thus an important research direction. As is well known, introducing the fuzzy case into rough set models plays an important role in the development of rough set theory, and the fuzzy rough set [19] has attracted increasing attention from machine learning and intelligent data analysis. It is therefore natural to introduce fuzzy rough set theory into the β multigranulation rough set.
Granular space reduction is an important issue for multigranulation rough sets and has recently been studied by many scholars [20–24]. In this paper, we focus on this problem for the β multigranulation rough set. Hu et al. [25] proposed a fuzzy-rough attribute reduction. Motivated by this idea, we introduce fuzzy rough sets into the β multigranulation rough set to construct a β multigranulation fuzzy rough set model and design an algorithm of granular space reduction for it. The algorithm performs granular space reduction in multigranular structures while preserving the positive region of the β multigranulation fuzzy rough set, and it is expected to be useful for large continuous data sets.
The purpose of this paper is to further generalize the β multigranulation rough set to the fuzzy environment. To facilitate our discussion, we first present some basic knowledge of rough sets in Section 2. In Section 3, the β multigranulation fuzzy rough set is constructed and its properties are discussed. In Section 4, an algorithm of granular space reduction for the β multigranulation fuzzy rough set is proposed, and experiments are conducted on five UCI data sets. Section 5 concludes the paper.
2. Preliminaries
2.1. Rough Sets
Formally, a decision system is an information system I=〈U,AT∪D〉, in which U is a nonempty finite set of objects called the universe of discourse, AT is a nonempty finite set of condition attributes, D is the set of decision attributes, and AT∩D=⌀.
For all x∈U, let us denote by a(x) the value that x holds on attribute a (a∈AT). For an information system I, one can then describe the relationship between objects through their attribute values. With respect to a subset of attributes A⊆AT, an indiscernibility relation IND(A) may be defined as
(1) IND(A)={(x,y)∈U²:a(x)=a(y), ∀a∈A}.
The relation IND(A) is reflexive, symmetric, and transitive; hence IND(A) is an equivalence relation.
Definition 1.
Let I=〈U,AT∪D〉 be a decision system and A⊆AT. For all X⊆U, the lower and upper approximations of X in terms of the equivalence relation IND(A) are denoted by A_(X) and A¯(X), respectively:
(2) A_(X)={x∈U:[x]A⊆X}, A¯(X)={x∈U:[x]A∩X≠⌀},
where [x]A is the equivalence class of x based on the indiscernibility relation IND(A), denoted as [x]A={y∈U:(x,y)∈IND(A)}.
(A_(X),A¯(X)) is referred to as Pawlak’s rough set.
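As a concrete illustration, the snippet below (a minimal Python sketch on a hypothetical four-object decision table, not data from this paper) computes the equivalence classes of the indiscernibility relation and the approximations of Definition 1:

```python
def partition(U, attrs, value):
    """Equivalence classes of IND(A): x ~ y iff value(x, a) == value(y, a)
    for every attribute a in attrs."""
    classes = {}
    for x in U:
        key = tuple(value(x, a) for a in attrs)
        classes.setdefault(key, set()).add(x)
    return list(classes.values())

def approximations(U, attrs, value, X):
    """Pawlak lower/upper approximations of a target set X."""
    lower, upper = set(), set()
    for c in partition(U, attrs, value):
        if c <= X:          # [x]_A is contained in X
            lower |= c
        if c & X:           # [x]_A meets X
            upper |= c
    return lower, upper

# Hypothetical decision table: objects 1..4 with condition attributes a, b.
table = {1: {"a": 0, "b": 0}, 2: {"a": 0, "b": 0},
         3: {"a": 1, "b": 0}, 4: {"a": 1, "b": 1}}
low, up = approximations(set(table), ["a", "b"],
                         lambda x, a: table[x][a], X={1, 3})
```

Here objects 1 and 2 are indiscernible, so for X = {1, 3} the lower approximation is {3}, the upper approximation is {1, 2, 3}, and the boundary is {1, 2}.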
2.2. Multigranulation Rough Set
Multigranulation rough set is different from Pawlak’s rough set: the former is constructed on a family of equivalence relations, while the latter is constructed on a single equivalence relation. In Qian et al.’s multigranulation rough set theory, two basic models were defined: the optimistic multigranulation rough set and the pessimistic multigranulation rough set.
Definition 2.
Let S be an information system in which A1,A2,…,Am⊆AT; for all X⊆U, the optimistic multigranulation lower and upper approximations are denoted by ∑i=1mAi_O(X) and ∑i=1mAi¯O(X), respectively:
(3)∑i=1mAi_O(X)={x∈U:[x]A1⊆X∨⋯∨[x]Am⊆X}∑i=1mAi¯O(X)=~∑i=1mAi_O(~X),
where ~X is the complementary set of X.
Definition 3.
Let S be an information system in which A1,A2,…,Am⊆AT; for all X⊆U, the pessimistic multigranulation lower and upper approximations are denoted by ∑i=1mAi_P(X) and ∑i=1mAi¯P(X), respectively:
(4)∑i=1mAi_P(X)={x∈U:[x]A1⊆X∧⋯∧[x]Am⊆X}∑i=1mAi¯P(X)=~∑i=1mAi_P(~X),
where ~X is the complementary set of X.
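The only difference between Definitions 2 and 3 is whether one granulation (∨) or every granulation (∧) must contain the target. A minimal sketch with hypothetical partitions standing in for the granulation spaces:

```python
def block_of(x, part):
    """The equivalence class of x in a partition (a list of disjoint sets)."""
    return next(b for b in part if x in b)

def mg_lower(partitions, X, optimistic=True):
    """Optimistic (at least one granulation suffices) or pessimistic (all
    granulations must suffice) multigranulation lower approximation of X."""
    U = set().union(*partitions[0])
    mode = any if optimistic else all
    return {x for x in U
            if mode(block_of(x, p) <= X for p in partitions)}

P1 = [{1, 2}, {3, 4}]        # partition induced by A1 (hypothetical)
P2 = [{1}, {2, 3}, {4}]      # partition induced by A2 (hypothetical)
X = {1, 2, 3}
opt = mg_lower([P1, P2], X, optimistic=True)
pes = mg_lower([P1, P2], X, optimistic=False)
```

Here the optimistic lower approximation is {1, 2, 3} while the pessimistic one is only {1, 2}, illustrating that the pessimistic model is stricter.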
2.3. β Multigranulation Rough Set
The optimistic and pessimistic models are two special cases of multigranulation rough set. The optimistic case is loose: if only one equivalence class of an object is contained in the target, the object is included in the lower approximation. The pessimistic case is strict: only if all the equivalence classes of an object are contained in the target is the object included in the lower approximation. To bridge these two extremes, Xu et al. [13] proposed a more generalized multigranulation rough set, called the β multigranulation rough set, which uses a threshold β to control the number of equivalence classes that are contained in the target.
Definition 4.
Let S be a multigranulation decision system; for all x∈U and X⊆U, the characteristic function is defined as
(5) CXi(x) = {1, if [x]Ai⊆X; 0, otherwise},
where Ai∈AT.
Definition 5.
Let S be a multigranulation decision system; for all X⊆U, the β multigranulation lower and upper approximations of X are denoted by
(6) ∑i=1mAi_β(X)={x∈U:∑i=1mCXi(x)/m≥β}; ∑i=1mAi¯β(X)={x∈U:∑i=1m(1-C~Xi(x))/m>1-β},
where β∈(0,1]. ~X is the complementary set of X.
(∑i=1mAi_β(X),∑i=1mAi¯β(X)) is referred to as β multigranulation rough set of X.
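A minimal sketch of the β lower approximation of Definition 5 on hypothetical crisp partitions (the characteristic function CXi(x) becomes the Boolean test [x]Ai ⊆ X):

```python
def beta_lower(partitions, X, beta):
    """β multigranulation lower approximation (Definition 5): keep x when the
    fraction of granulations whose class [x]_Ai lies inside X is at least β."""
    def block_of(x, part):
        return next(b for b in part if x in b)
    U = set().union(*partitions[0])
    m = len(partitions)
    return {x for x in U
            if sum(block_of(x, p) <= X for p in partitions) / m >= beta}

# Hypothetical granulations and target, just to exercise the definition.
P1, P2 = [{1, 2}, {3, 4}], [{1}, {2, 3}, {4}]
X = {1, 2, 3}
```

With m = 2 granulations, β = 1/m requires one inclusion (the optimistic case) and β = 1 requires both (the pessimistic case).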
2.4. Fuzzy Rough Set
Fuzzy rough set is a generalization of rough set. It can be used in decision information systems to deal with continuous condition attributes. Usually a fuzzy similarity relation is computed from the condition attributes and is employed to measure the similarity between two objects; this relation is then used to develop the upper and lower approximations of fuzzy sets. Fuzzy rough set generalizes the objects discussed in rough set to fuzzy sets and turns the equivalence relation into a fuzzy equivalence relation.
Definition 6.
Let U≠⌀ be a universe of discourse and ℛA a fuzzy similarity relation of U; for all F∈ℱ(U), the fuzzy lower and upper approximations of F are denoted by
(7)ℛA_(F)(x)=∧y∈US(1-ℛA(x,y),F(y)),ℛA¯(F)(x)=∨y∈UT(ℛA(x,y),F(y)).
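Definition 6 can be computed directly once a t-norm T and a t-conorm S are fixed; the sketch below assumes the common choice T = min and S = max, on a hypothetical two-object relation:

```python
def fuzzy_approximations(R, F):
    """Fuzzy lower/upper approximations of a fuzzy set F under a fuzzy
    similarity relation R, with S = max and T = min (one choice for Def. 6)."""
    n = len(F)
    lower = [min(max(1 - R[x][y], F[y]) for y in range(n)) for x in range(n)]
    upper = [max(min(R[x][y], F[y]) for y in range(n)) for x in range(n)]
    return lower, upper

R = [[1.0, 0.8],
     [0.8, 1.0]]        # hypothetical fuzzy similarity relation
F = [1.0, 0.3]          # membership degrees of the fuzzy target set
low, up = fuzzy_approximations(R, F)
```

The strongly similar pair pulls the lower membership of both objects down to 0.3 while the upper membership of the first object stays at 1.0.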
In fuzzy rough set, a measurement should be introduced to construct the fuzzy similarity relation, such as the max-min method. A fuzzy similarity matrix can then be constructed from the fuzzy similarity relation; after that, a fuzzy equivalence matrix can be obtained from the fuzzy similarity matrix by the transitive closure method:
(8) M(R)=(r11 r12 ⋯ r1n; r21 r22 ⋯ r2n; ⋯; rn1 rn2 ⋯ rnn),
where rij∈[0,1] is the relation value of xi and xj.
R is a fuzzy equivalence relation if R satisfies reflexivity, symmetry, and (max-min) transitivity.
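The transitive closure step can be sketched as repeated max-min self-composition of the (reflexive, symmetric) similarity matrix until it stabilizes, here on a hypothetical 3×3 matrix:

```python
def maxmin_compose(A, B):
    """Max-min composition of two square fuzzy matrices."""
    n = len(A)
    return [[max(min(A[i][k], B[k][j]) for k in range(n))
             for j in range(n)] for i in range(n)]

def transitive_closure(M):
    """Iterate R := R ∘ R until a fixpoint; for a reflexive, symmetric fuzzy
    similarity matrix this yields a fuzzy equivalence (max-min transitive)
    matrix."""
    while True:
        M2 = maxmin_compose(M, M)
        if M2 == M:
            return M
        M = M2

M = [[1.0, 0.8, 0.4],
     [0.8, 1.0, 0.5],
     [0.4, 0.5, 1.0]]
E = transitive_closure(M)
```

The entry (0, 2) rises from 0.4 to 0.5 because objects 0 and 2 are linked through object 1 with strength min(0.8, 0.5) = 0.5.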
2.5. Multigranulation Fuzzy Rough Set
In [15], Qian et al. introduced the theory of fuzzy set into multigranulation rough set to construct the optimistic and pessimistic multigranulation fuzzy rough sets.
Definition 7.
Let S be a fuzzy decision information system, A1~,A2~,…,Am~⊆AT~ are m fuzzy subsets, and D is the decision attribute; for all X⊆U, the optimistic multigranulation fuzzy lower and upper approximations of X are denoted by
(9)∑i=1mAi~_O(X)={x∈U:[x]A1~⊆X∨[x]A2~⊆X∨⋯∨[x]Am~⊆X};(10)∑i=1mAi~¯O(X)=~(∑i=1mAi~_O(~X)),
where [x]Ai~={y∈U:(x,y)∈IND(Ai~)} is the fuzzy equivalence class of x and ~X is the complementary set of X.
(∑i=1mAi~_O(X),∑i=1mAi~¯O(X)) is optimistic multigranulation fuzzy rough set.
Definition 8.
Let S be a fuzzy decision information system, A1~,A2~,…,Am~⊆AT~ are m fuzzy subsets, and D is the decision attribute; for all X⊆U, the pessimistic multigranulation fuzzy lower and upper approximations of X are denoted by
(11)∑i=1mAi~_P(X)={x∈U:[x]A1~⊆X∧[x]A2~⊆X∧⋯∧[x]Am~⊆X};∑i=1mAi~¯P(X)=~(∑i=1mAi~_P(~X)),
(∑i=1mAi~_P(X),∑i=1mAi~¯P(X)) is the pessimistic multigranulation fuzzy rough set of X, where ~X is the complementary set of X.
3. β Multigranulation Fuzzy Rough Sets
In β multigranulation rough sets, by setting different values of β we can get different reductions, from which the most suitable one can be chosen for subsequent analysis. The fuzzy rough set is well suited to large continuous data sets. It is therefore natural to introduce the theory of fuzzy sets into β multigranulation rough sets to construct a β multigranulation fuzzy rough set model.
In this section, we will give some definitions of β multigranulation fuzzy rough sets model and discuss some properties of it.
Definition 9.
Let S be a fuzzy decision information system, A1~,A2~,…,Am~⊆AT~ are m fuzzy subsets, and D is the decision attribute; for all x∈U and X⊆U, the characteristic function is defined as
(12) CXi~(x) = {1, if [x]Ai~⊆X; 0, otherwise},
where [x]Ai~={y∈U:(x,y)∈IND(Ai~)} is the fuzzy equivalence class of x. The β multigranulation fuzzy rough set is then defined as follows.
Definition 10.
Let S be a fuzzy decision information system, A1~,A2~,…,Am~⊆AT~ are m fuzzy subsets, and D is the decision attribute; for all X⊆U, the β multigranulation fuzzy lower and upper approximations of X are denoted by
(13) ∑i=1mAi~_β(X)={x∈U:∑i=1mCXi~(x)/m≥β}; (14) ∑i=1mAi~¯β(X)={x∈U:∑i=1m(1-C~Xi~(x))/m>1-β},
where β∈(0,1] and ~X is the complementary set of X. (∑i=1mAi~_β(X),∑i=1mAi~¯β(X)) is the β multigranulation fuzzy rough set of X.
Following Definition 10, we will employ the following notations:
positive region of X: POSAT~β(X)=∑i=1mAi~_β(X);
negative region of X: NEGAT~β(X)=U-∑i=1mAi~¯β(X);
boundary region of X: BNDAT~β(X)=∑i=1mAi~¯β(X)-∑i=1mAi~_β(X).
Theorem 11.
Let S be a fuzzy decision information system, A1~,A2~,…,Am~⊆AT~ are m fuzzy subsets, and D is the decision attribute; for all X⊆U,
(15)∑i=1mAi~_1/m(X)=∑i=1mAi~_O(X),(16)∑i=1mAi~¯1/m(X)=∑i=1mAi~¯O(X),(17)∑i=1mAi~_1(X)=∑i=1mAi~_P(X),(18)∑i=1mAi~¯1(X)=∑i=1mAi~¯P(X).
Proof.
We only prove (15); others can be proven analogously.
For all x∈∑i=1mAi~_1/m(X), by (13) we have ∑i=1mCXi~(x)/m≥1/m, that is, ∑i=1mCXi~(x)≥1. Hence there must be some Ai~∈AT~ such that CXi~(x)=1, from which we conclude that [x]Ai~⊆X and thus x∈∑i=1mAi~_O(X).
Conversely, for all x∈∑i=1mAi~_O(X), by (9) there exists Ai~∈AT~ such that [x]Ai~⊆X. Therefore CXi~(x)=1 and ∑i=1mCXi~(x)≥1, so ∑i=1mCXi~(x)/m≥1/m and x∈∑i=1mAi~_1/m(X).
Theorem 11 shows that if β=1/m, the β multigranulation fuzzy rough set reduces to the optimistic multigranulation fuzzy rough set, and if β=1, it reduces to the pessimistic multigranulation fuzzy rough set. Obviously, the β multigranulation fuzzy rough set is an extension of both the optimistic and the pessimistic multigranulation fuzzy rough sets.
Theorem 12.
Let S be a fuzzy decision information system, A1~,A2~,…,Am~⊆AT~ are m fuzzy subsets, and D is the decision attribute; for all X⊆U and β∈(0,1], we can get
(19)∑i=1mAi~_P(X)⊆∑i=1mAi~_β(X)⊆∑i=1mAi~_O(X),(20)∑i=1mAi~¯O(X)⊆∑i=1mAi~¯β(X)⊆∑i=1mAi~¯P(X).
Proof.
For all x∈∑i=1mAi~_P(X), by (17) we have x∈∑i=1mAi~_1(X); since β≤1, by (13) we get x∈∑i=1mAi~_β(X), so that ∑i=1mAi~_P(X)⊆∑i=1mAi~_β(X).
For all x∈∑i=1mAi~_β(X), by (13) we have ∑i=1mCXi~(x)/m≥β>0, so there must be some Ai~ with [x]Ai~⊆X, that is, CXi~(x)=1. Then ∑i=1mCXi~(x)/m≥1/m, and by (15) x∈∑i=1mAi~_O(X), so that ∑i=1mAi~_β(X)⊆∑i=1mAi~_O(X).
So ∑i=1mAi~_P(X)⊆∑i=1mAi~_β(X)⊆∑i=1mAi~_O(X).
Formula (20) can be proven analogously.
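Theorem 12 can be checked numerically on a small hypothetical crisp example: as β grows, the β lower approximation shrinks from the optimistic approximation toward the pessimistic one.

```python
def beta_lower(partitions, X, beta):
    """β multigranulation lower approximation (crisp sketch of Eq. (13))."""
    def block_of(x, part):
        return next(b for b in part if x in b)
    U = set().union(*partitions[0])
    m = len(partitions)
    return {x for x in U
            if sum(block_of(x, p) <= X for p in partitions) / m >= beta}

# Hypothetical granulations; with m = 2, β = 1/2 is the optimistic case.
P1, P2 = [{1, 2}, {3, 4}], [{1}, {2, 3}, {4}]
X = {1, 2, 3}
optimistic  = beta_lower([P1, P2], X, 1 / 2)   # β = 1/m
middle      = beta_lower([P1, P2], X, 0.7)
pessimistic = beta_lower([P1, P2], X, 1.0)     # β = 1
```

The three results form the inclusion chain of (19): pessimistic ⊆ middle ⊆ optimistic.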
4. Reduction of β Multigranulation Fuzzy Rough Sets
In a single-granular fuzzy rough set, a reduction is a minimal subset of the attributes which is independent and has the same discernibility power as the full attribute set. The method of preserving the positive region is usually used for attribute reduction. In this paper, we treat each attribute as a granular space, so it is natural to apply this method to the β multigranulation fuzzy rough set for granular space reduction.
Definition 13.
Let S be a fuzzy decision information system, A1~,A2~,…,Am~⊆AT~ are m fuzzy subsets, D is the decision attribute, and U/IND(D)={X1,X2,…,Xn} is the partition induced by the decision attribute D. The approximation quality of U/IND(D) in terms of the β multigranulation fuzzy rough set is defined as
(21)γ(AT~,β,D)=|⋃{∑i=1mAi~_β(Xj):1≤j≤n}||U|,
where |X| is the cardinality of the set X.
4.1. Significance of Granulation
Definition 14.
Let S be a fuzzy decision information system, A1~,A2~,…,Am~⊆AT~ are m fuzzy subsets, and D is the decision attribute. B~⊆AT~ is a reduction if and only if
(1) γ(B~,β,D)=γ(AT~,β,D);
(2) for all B′⊂B~, γ(B′,β,D)≠γ(AT~,β,D).
By Definition 14, we can get a reduction of S that preserves the approximation quality.
Definition 15.
Let S be a fuzzy decision information system, A1~,A2~,…,Am~⊆AT~ are m fuzzy subsets, D is the decision attribute, and B~⊆AT~; for all Ai~∈B~, the significance of the granulation Ai~ in terms of D is defined as
(22)sigin(Ai~,B~,D)=γ(B~,β,D)-γ(B~-Ai~,β,D),
where sigin(Ai~,B~,D) represents the change of the approximation quality when the granulation Ai~ is removed from B~. Similarly, we can define
(23)sigout(Ai~,B~,D)=γ(B~∪Ai~,β,D)-γ(B~,β,D)
for all Ai~∈AT~-B~, where sigout(Ai~,B~,D) represents the change of the approximation quality when the granulation Ai~ is added to B~. These two significance measures can be used in a forward granular structure selection algorithm, and sigin(Ai~,B~,D) determines the significance of each granulation in terms of the approximation quality.
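Both significance measures are simple differences of approximation qualities. The sketch below treats γ as a pluggable function; `toy_gamma` is a hypothetical stand-in (each granulation covering a quarter of the universe), used only to exercise the measures, not the real γ of Eq. (21):

```python
def sig_in(gamma, Ai, B, D, beta):
    """Eq. (22): quality lost when granulation Ai is removed from B."""
    return gamma(B, beta, D) - gamma([g for g in B if g != Ai], beta, D)

def sig_out(gamma, Ai, B, D, beta):
    """Eq. (23): quality gained when granulation Ai is added to B."""
    return gamma(B + [Ai], beta, D) - gamma(B, beta, D)

# Hypothetical quality function: each granulation contributes 1/4 of U.
toy_gamma = lambda B, beta, D: min(1.0, len(B) / 4)
```

With this stand-in, adding or removing one granulation changes the quality by exactly 0.25, so both measures return 0.25.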
4.2. Granular Space Reduction Algorithm
See Algorithm 1.
Algorithm 1: Find a granular space reduction.
Input: a fuzzy decision information system S = 〈U,AT~∪D〉 and a threshold β
Output: a granular space reduction RED
Step 1. ∀Ai~∈AT~, compute the fuzzy similarity matrix.
Step 2. ∀Ai~∈AT~, compute the fuzzy equivalence matrix in terms of the result of Step 1 by the transitive closure method.
Step 3. Compute the approximation quality γ(AT~,β,D) by (21) and set RED = ⌀.
Step 4. While γ(RED,β,D) ≠ γ(AT~,β,D), select the Ai~∈AT~−RED with the maximal sigout(Ai~,RED,D) and set RED = RED∪{Ai~}.
Step 5. Output RED.
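Assuming the fuzzy similarity and equivalence computations have already turned each granulation into (crisp) equivalence classes, the forward selection loop driven by the approximation quality of Eq. (21) can be sketched as follows, on hypothetical partitions:

```python
def gamma(partitions, decision_classes, beta, U):
    """Approximation quality of Eq. (21): the fraction of U covered by the
    β lower approximations of the decision classes."""
    def block_of(x, part):
        return next(b for b in part if x in b)
    m = len(partitions)
    pos = set()
    for X in decision_classes:
        if m:
            pos |= {x for x in U
                    if sum(block_of(x, p) <= X for p in partitions) / m >= beta}
    return len(pos) / len(U)

def granular_reduction(partitions, decision_classes, beta, U):
    """Forward greedy selection: repeatedly add the granulation with the best
    sig_out until the quality of the full family is preserved."""
    target = gamma(partitions, decision_classes, beta, U)
    red, rest = [], list(partitions)
    while rest and gamma(red, decision_classes, beta, U) < target:
        best = max(rest, key=lambda p: gamma(red + [p], decision_classes, beta, U))
        red.append(best)
        rest.remove(best)
    return red

U = {1, 2, 3, 4}
D = [{1, 2}, {3, 4}]            # decision partition U/IND(D)
P1 = [{1, 2}, {3, 4}]           # granulation consistent with D
P2 = [{1, 3}, {2, 4}]           # granulation mixing the decision classes
P3 = [{1}, {2}, {3, 4}]         # finer but also consistent granulation
red = granular_reduction([P1, P2, P3], D, beta=1 / 3, U=U)
```

With β = 1/3 (the optimistic case for m = 3), a single consistent granulation already preserves γ, so only P1 is selected and the redundant P2, P3 are dropped.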
To demonstrate the above approach, we use five data sets obtained from the UCI Repository of Machine Learning databases; the selected data sets are described in Table 1.
Table 1: Data description.

  ID   Data set     Samples   Features
  1    biodeg       1055      41
  2    Ionosphere   351       34
  3    Parkinsons   196       22
  4    sonar        208       60
  5    wdbc         569       30
In this experiment, each feature is used to construct a granular structure, so all features together correspond to multiple granular structures. For each data set, five different values of β are used; the granular selection results for the data sets of Table 1 under these values of β are listed in Table 2. In Table 2, u denotes the number of features, 0.001 is the smallest value of β, 1 is the biggest value of β, and 1/u lies between 0.01 and 1.
Table 2: Granular space selection with different β.

  ID   Data set     β = 1/u   β = 1   β = 0.001   β = 0.005   β = 0.01
  1    biodeg       39        41      24          27          31
  2    Ionosphere   28        34      26          26          28
  3    Parkinsons   15        22      11          11          14
  4    sonar        59        60      47          50          55
  5    wdbc         25        30      14          21          24
Table 2 shows an interesting outcome: the granular selection result of each data set changes with the value of β, and the number of selected granulations increases as β increases. For instance, when β = 0.001 we get the fewest features, and when β = 1 we get the most. Obviously, β = 1/u corresponds to the optimistic multigranulation fuzzy rough set, and β = 1 corresponds to the pessimistic one, which is so strict that an object belongs to the lower approximation only when all of the granulation spaces satisfy the inclusion condition between its equivalence classes and the target. In this experiment no proper reduction is obtained when β = 1, which shows that it is difficult to get a satisfactory reduction under the pessimistic condition.
In order to test the performance of the proposed algorithm and to find a proper β, we employ a neural network and a decision tree as validation classifiers. The results are listed in Tables 3 and 4.
Table 3: Accuracy (%) with neural network under different β.

  ID   Data set     β = 1/u   β = 1    β = 0.001   β = 0.005   β = 0.01
  1    biodeg       88.152    87.204   86.161      85.782      86.161
  2    Ionosphere   83.191    88.604   85.755      85.755      83.191
  3    Parkinsons   75.385    78.462   76.923      76.923      75.385
  4    sonar        72.596    73.077   77.885      70.673      75.481
  5    wdbc         89.279    89.279   62.742      62.742      74.165
Table 4: Accuracy (%) with decision tree under different β.

  ID   Data set     β = 1/u   β = 1    β = 0.001   β = 0.005   β = 0.01
  1    biodeg       94.692    94.123   94.028      94.787      95.64
  2    Ionosphere   96.581    99.43    96.866      96.866      95.581
  3    Parkinsons   92.308    96.41    95.897      95.897      92.308
  4    sonar        98.558    98.558   96.635      96.635      98.558
  5    wdbc         97.891    98.77    98.418      98.594      99.297
We can see in Tables 3 and 4 that when β = 1 the selected features are the same as in the original data sets, but the accuracy is not always the highest; for example, for “biodeg” in Table 4, the accuracy at β = 0.005 is higher than at β = 1. This shows that when β is set to a suitable value, the method can not only remove redundant granular spaces but also retain the most useful ones and thus achieve better accuracy. The selection of granular spaces is crucial to the performance of subsequent learning, so the selection should reflect the structure of the data and its patterns. Comparing Table 3 with Table 4, the accuracy in Table 4 is higher, which shows that the accuracy is also related to the performance of the classifier.
This experiment shows that the proposed algorithm is more flexible for selecting granular space than optimistic and pessimistic multigranulation fuzzy rough sets, which can use fewer features to get higher accuracy.
5. Conclusions
In this paper, a β multigranulation fuzzy rough set model is proposed, together with a corresponding granular space reduction algorithm. Different from other methods, the proposed algorithm is constructed on multiple granular spaces. Our experiments show that the number of selected granulations increases with the value of β, and that the algorithm can not only handle continuous data but also, when β is set properly, produce a reduction that is well suited for classification. The experimental results show the effectiveness of our algorithm.
Conflict of Interests
The authors declare that there is no conflict of interests regarding the publication of this paper.
Acknowledgments
This work was supported by the Graduate Student Innovation Plan of Jiangsu Province (no. CXZZ12-0728), the National Natural Science Foundation of China (nos. 61203024, 61100116, and 71171100), the Natural Science Fundamental Research Project of Jiangsu Colleges and Universities (no. 12KJB120001), and Key Laboratory of Intelligent Perception and Systems for High-Dimensional Information (Nanjing University of Science and Technology), Ministry of Education (no. 30920130122005).
References
[1] Y. Qian, J. Liang, and C. Dang, “Incomplete multigranulation rough set,” IEEE Transactions on Systems, Man, and Cybernetics, Part A, vol. 40, no. 2, pp. 420–431, 2010.
[2] Y. Qian, J. Liang, Y. Yao, and C. Dang, “MGRS: a multi-granulation rough set,” Information Sciences, vol. 180, no. 6, pp. 949–970, 2010.
[3] Y. H. Qian, J. Y. Liang, and W. Wei, “Pessimistic rough decision,” pp. 440–449, 2011.
[4] Z. Pawlak, Rough Sets, Springer, 1991.
[5] Z. Pawlak and A. Skowron, “Rudiments of rough sets,” Information Sciences, vol. 177, no. 1, pp. 3–27, 2007.
[6] Z. Pawlak and A. Skowron, “Rough sets: some extensions,” Information Sciences, vol. 177, no. 1, pp. 28–40, 2007.
[7] Z. Pawlak and A. Skowron, “Rough sets and Boolean reasoning,” Information Sciences, vol. 177, no. 1, pp. 41–73, 2007.
[8] W. Zhu, “Relationship between generalized rough sets based on binary relation and covering,” Information Sciences, vol. 179, no. 3, pp. 210–225, 2009.
[9] W. Zhu and F.-Y. Wang, “On three types of covering-based rough sets,” IEEE Transactions on Knowledge and Data Engineering, vol. 19, no. 8, pp. 1131–1144, 2007.
[10] T. Y. Lin, “Granular computing I: the concept of granulation and its formal model,” International Journal of Granular Computing, Rough Sets and Intelligent Systems, vol. 1, no. 1, pp. 21–42, 2009.
[11] X. Yang and M. Zhang, “Dominance-based fuzzy rough approach to an interval-valued decision system,” Frontiers of Computer Science, vol. 5, no. 2, pp. 195–204, 2011.
[12] X. Yang and T. Y. Lin, “Knowledge operations in neighborhood system,” in Proceedings of the IEEE International Conference on Granular Computing (GrC ’10), pp. 822–825, August 2010.
[13] W. H. Xu, X. T. Zhang, and Q. R. Wang, “A generalized multi-granulation rough set approach,” Lecture Notes in Bioinformatics, vol. 6840, pp. 681–689, 2012.
[14] W. Xu, Q. Wang, and X. Zhang, “Multi-granulation fuzzy rough sets in a fuzzy tolerance approximation space,” vol. 13, no. 4, pp. 246–259, 2011.
[15] Y. Qian, J. Liang, and C. Dang, “Incomplete multigranulation rough set,” IEEE Transactions on Systems, Man, and Cybernetics, Part A, vol. 40, no. 2, pp. 420–431, 2010.
[16] X. Yang, X. Song, H. Dou, and J. Yang, “Multi-granulation rough set: from crisp to fuzzy case,” vol. 1, no. 1, pp. 55–70, 2011.
[17] W. H. Xu, Q. R. Wang, and S. Q. Luo, “Multi-granulation fuzzy rough sets,” vol. 26, pp. 1323–1340, 2014.
[18] W. H. Xu, Q. R. Wang, and X. T. Zhang, “Multi-granulation rough sets based on tolerance relations,” Soft Computing, vol. 17, no. 7, pp. 1241–1252, 2013.
[19] D. Dubois and H. Prade, “Rough fuzzy sets and fuzzy rough sets,” International Journal of General Systems, vol. 17, no. 2-3, pp. 191–209, 1990.
[20] W. Xu, Y. Li, and X. Liao, “Approaches to attribute reductions based on rough set and matrix computation in inconsistent ordered information systems,” Knowledge-Based Systems, vol. 27, pp. 78–91, 2012.
[21] W.-H. Xu, X.-Y. Zhang, J.-M. Zhong, and W.-X. Zhang, “Attribute reduction in ordered information systems based on evidence theory,” Knowledge and Information Systems, vol. 25, no. 1, pp. 169–184, 2010.
[22] J. Liang, F. Wang, C. Dang, and Y. Qian, “An efficient rough feature selection algorithm with a multi-granulation view,” International Journal of Approximate Reasoning, vol. 53, no. 6, pp. 912–926, 2012.
[23] G. Lin, Y. Qian, and J. Li, “NMGRS: neighborhood-based multigranulation rough sets,” International Journal of Approximate Reasoning, vol. 53, no. 7, pp. 1080–1093, 2012.
[24] Y. L. Sang and Y. H. Qian, “A granular space reduction approach to pessimistic multi-granulation rough sets,” vol. 25, no. 3, pp. 361–366, 2012.
[25] Q. Hu, D. Yu, and Z. Xie, “Information-preserving hybrid data reduction based on fuzzy-rough techniques,” Pattern Recognition Letters, vol. 27, no. 5, pp. 414–423, 2006.