
# Multi-source information fusion model in rule-based Gaussian-shaped fuzzy control inference system incorporating Gaussian density function

#### Abstract

An increasing number of applications require the integration of data from various disciplines, which leads to problems in the fusion of multi-source information. In this paper, a special information structure formalized in terms of three indices (the central presentation, the population or scale, and the density function) is proposed. Single and mixed Gaussian models are used for single-source information and its fusion results, and a parameter estimation method is also introduced. Furthermore, fuzzy similarity computing is developed for solving the fuzzy implications under a Mamdani model and a Gaussian-shaped density function. Finally, an improved rule-based Gaussian-shaped fuzzy control inference system is proposed in combination with a nonlinear conjugate gradient method and a Takagi-Sugeno (T-S) model, and its effectiveness is demonstrated in comparison with other fuzzy inference systems.

## 1. Introduction

The human brain obtains information from different sources; it then merges this information to form concepts and finally outputs natural language (NL), which is powerful and versatile enough to describe the real world. NL can be regarded as the fusion of disparate information; it is vague, ambiguous, and uncertain. The quantitative calculation and qualitative analysis of NL is the ultimate goal of artificial intelligence. There are two strands of research linking initial information acquisition with NL: (1) how to simplify the presentation of NL and (2) how to form NL from multi-source information. Usually, humans express emotions about certain objects by using sentences and affective words, but they cannot fully express their intuitive perception of an object simply by separating these terms. Natural Language Processing (NLP) was developed to solve this problem; however, many difficulties remain in this field. Computing with Words (CW) was also introduced to decrease the complexity related to linguistic variables [16–18]. This has allowed for a more exact expression of what a human is thinking about and has provided a feasible direction for NLP under weakened conditions. Zadeh outlined a generalized framework for this phenomenon of uncertainty based on Fuzzy Sets (FS) in 2005 [19]. FS theory has also been used to describe objects at a coarse-grained level. Herrera and Martínez [5] introduced a 2-tuple fuzzy linguistic representation model for CW without any loss of information. Furthermore, Lawry [13, 14] proposed Label Semantics (LS) for vague concept modeling and reasoning techniques so as to formalize uncertainty in presentation theory. Subsequently, Lawry and Tang [12, 34, 35] proposed a new semantic understanding model: the Prototype Theory (PT). These works established the connection between fuzzy presentation technology and high-level semantics.
In engineering fields, linguistic representation models combined with affective words have had some applications, such as fuzzy decision making [21, 31] and KANSEI Engineering (KE). Fuzzy inference methodologies have also been shown to be effective in our previous work on Rough Sets [7] and Fuzzy Support Vector Machines (SVMs) [6].

However, it has been regarded as more feasible to focus on multi-source information fusion rather than on NL itself. Moreover, it is important to discover the mechanisms by which the human brain integrates multi-source information. Due to the modular and vague appearance of multi-source information, uncertainty reasoning methods and their associated mathematical tools are thought to offer more interpretability and a much stronger generalization capability [24]. Yager developed the theoretical foundation for multi-source information fusion techniques based on set measure and possibility theories [25, 26]. Normally, single-source information consists of steady features that are more easily formalized and parameterized. In previous studies, the sum, product, max/min, and Weighted Arithmetic Mean (WAM) were used to combine single-source information, and each output represented an independent source of information that could be treated separately [15].

Relative to mathematical research and understanding the phenomenon of uncertainty, the integration of information using fuzzy inference techniques pervades many scientific disciplines, such as multivariate and type-2 fuzzy sets; bipolar models [10, 11]; and probability and possibility issues [9, 27]. Information fusion is the merging of information from disparate sources with differing conceptual, contextual, and typographical representations. It has been successfully applied in data mining and the consolidation of data from unstructured or semi-structured resources, and it has also led to many achievements in various fields [1, 4, 8]. Fusion methods include product fusion (such as the Bayes posterior probability model), linear fusion (SVM classifiers), and nonlinear fusion (super-kernel integration) [23]. Recent developments and applications of fuzzy information fusion can be found in pattern classification, image analysis, decision-making, man-made structures, and medicine [30, 32]. Furthermore, over the past several years, there have been a number of successful applications of fuzzy integrals in decision-making and pattern recognition that have employed multiple information sources [3, 20].

In this paper, we formalize multi-source information as a multivariable group and describe each information structure as a special kind of triple, $I = \langle P, d, \rho \rangle$, where $P$ denotes a typical point of positive examples relative to the information structure $I$, $d$ is a distance measurement that represents the population of information, and $\rho$ is a Probability Density Function (PDF). The basic idea of this formalized information structure is to assume that the neighborhood radius of each information structure is uncertain, being bounded by the PDF $\rho$. Thus, we calculate the value of $P$ relative to an information structure at a given level. An information fusion technique was developed by formalizing this special information structure; furthermore, information fusion employing fuzzy sets is applied in this paper. A Single Gaussian Model (SGM) is applied to single-source information, and a Gaussian Mixed Model (GMM) is applied to the fusion of this information by incorporating probabilistic and statistical methods [28, 36].

The remainder of this paper proceeds as follows. In Section 2, we propose an information structure that incorporates a definition of the information kernel, boundary, and Gaussian PDF. An improved algorithm for parameter estimation is also introduced. Section 3 introduces fuzzy similarity relations and IF-THEN rules for this special information structure. These are helpful for calculating the possibilities in a rule-based fuzzy inference system (FIS). Section 4 develops a rule-based information fusion model using a conjugate gradient and Takagi–Sugeno (T-S) model under a rule-based Gaussian-shaped fuzzy inference system (RGS-FIS). A time-series analysis using natural disaster datasets is also introduced using RGS-FIS, and we demonstrate the effectiveness of our method in comparison to other methodologies. Finally, in Section 5, we give our conclusions and ideas for future work.

## 2. Information fusion models using a probability density function

### 2.1. Definitions

Definitions for our information structure and kernel computing method were established as follows.

Definition 1. Assume object $\Omega$ is described by the multi-source information set $I = \{I_k \mid k = 1, 2, \cdots, m\}$ and that the measure set $V = \{v_k \mid k = 1, 2, \cdots, m\}$ is a set of information structures corresponding to set $I$. For $\forall v_k \in V$, we define $v_k = \langle P_k, d_k, \rho_k \rangle$, where $P_k$ is a typical point serving as the kernel of $I_k$. Moreover, $d_k$ is a metric of the information structure $v_k$ related to the population or scale of the information and will be used for boundary computing. Lastly, $\rho_k$ is a density function on the threshold of $v_k$.

Definition 2. Let the fusion operator be ⊕ so that Ω can be formalized as:

##### (1)
$$I_1 + I_2 + \cdots + I_m = v_1 \oplus v_2 \oplus \cdots \oplus v_m$$
where $\oplus$ is a minimum operator.

Definition 3. Let $P_k, Q_k$ be points in an $n$-dimensional Euclidean space $R^n$, with $P_k = [P_{k1}, P_{k2}, \cdots, P_{kn}]$ and $Q_k = [Q_{k1}, Q_{k2}, \cdots, Q_{kn}]$. Moreover, let $d = \|\cdot\|$; it has the following properties:

- (1) $d(P_k) = \|P_k\| = \left(\sum_i P_{ki}^2\right)^{1/2}$

- (2) $d(P_k \pm Q_k) = \|P_k \pm Q_k\|$, $\forall P_k, Q_k \in R^n$

- (3) $\forall \alpha, \beta \in R$, $P_k, Q_k \in R^n$, we have $d(\alpha P_k \pm \beta Q_k) = \|\alpha P_k \pm \beta Q_k\|$; in addition,

$$d(\alpha P_k + \beta Q_k) \le |\alpha|\, d(P_k) + |\beta|\, d(Q_k).$$

Definition 4. For sample points $\{P_k^l \mid l = 1, 2, \cdots, L\}$, the statistics-based kernel point is computed as:

##### (2)
$$P_k = \frac{1}{L}\sum_{l} P_k^l = \left[\frac{1}{L}\sum_l P_{k1}^l,\; \frac{1}{L}\sum_l P_{k2}^l,\; \cdots,\; \frac{1}{L}\sum_l P_{kn}^l\right]$$
where $P_{ki}^l$ indicates the value of the $i$-th dimension of the $l$-th sample point in the $k$-th information source.
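To make Definition 4 concrete, here is a minimal Python sketch, assuming the kernel point is the per-dimension sample mean; the function name and plain-list representation are illustrative, not from the paper.

```python
def kernel_point(samples):
    """Per-dimension mean of the sample points of one information source I_k."""
    n = len(samples[0])
    count = len(samples)
    return [sum(p[i] for p in samples) / count for i in range(n)]

# Three 2-D sample points from a single source:
P_k = kernel_point([[1.0, 2.0], [3.0, 4.0], [2.0, 3.0]])   # -> [2.0, 3.0]
```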

The boundary of v k gives the scale of the neighborhood of all elements in this special information structure. This is defined below.

Definition 5. For $\forall P_k \in R^n$, there exists a neighborhood,

##### (3)
$$N_{P_k}^{\varepsilon} = \{X \mid \|P_k - X\| < \varepsilon,\; X \in R^n\}$$

Definition 6. For calculating the boundary of $I$, two sets were defined as:

- The Upper Approximation Boundary (UAB)

##### (4)
$$UP_B = \{P^l \mid P^l \in N_{P_K}^{u}\}$$

- The Lower Approximation Boundary (LAB)

##### (5)
$$LP_B = \{P^l \mid P^l \in N_{P_K}^{t}\}$$

Therefore, the boundary is $P_B = UP_B \setminus LP_B = B(u, t)$. Thus, we have $P_B = P_K + \lambda (P_B - P_K)$, $\lambda \in [0, 1]$, which exhibits fuzziness attributes at the boundary.
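The UAB/LAB construction above can be sketched as follows, assuming a Euclidean metric and two radii u > t; all names here are illustrative rather than from the paper.

```python
# UAB collects sample points within radius u of the kernel P_K,
# LAB those within the tighter radius t; the boundary is their set difference.
import math

def euclid(p, q):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def boundary(samples, P_K, u, t):
    assert t < u, "LAB radius must be tighter than UAB radius"
    UAB = [p for p in samples if euclid(P_K, p) < u]
    LAB = [p for p in samples if euclid(P_K, p) < t]
    return [p for p in UAB if p not in LAB]   # UAB \ LAB

pts = [(0.0, 0.0), (1.0, 0.0), (3.0, 0.0), (10.0, 0.0)]
b = boundary(pts, (0.0, 0.0), u=5.0, t=2.0)   # -> [(3.0, 0.0)]
```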

### 2.2. Probability density function

- Single Gaussian Model for single-source information

The Gaussian distribution is a continuous probability distribution with a bell-shaped PDF in one-dimensional space:

##### (6)
$$f(x; \mu, \sigma^2) = \frac{1}{\sqrt{2\pi}\,\sigma}\, e^{-\frac{1}{2}\left(\frac{x-\mu}{\sigma}\right)^2}$$

The parameter $\mu$ is the mean or expectation, and $\sigma^2$ is the variance. The SGM is applied to induce the density function of the proposed information structure $I$, and we define:

##### (7)
$$\delta(X, \mu, \Phi) = \frac{1}{\sqrt{(2\pi)^n |\Phi|}}\, e^{-\frac{1}{2}(X-\mu)^T \Phi^{-1} (X-\mu)}$$
where $X$ is a vector in $n$-dimensional space, $\Phi$ is the covariance matrix, and $\mu$ is the mean value of the density function. The density function's properties are determined by $(\Phi, \mu)$, so this is a parameter estimation problem [29]. For any point $P_i \in R^n$, its probability density function is $\delta(P_i, \mu, \Phi)$, and if, for any information structure $v_k$, each $P_i$ in $v_k$ is regarded as an independent event, then the PDF of $v_k$ is:
##### (8)
$$\delta_k = \delta(v_k, \mu, \Phi) = \prod_{i=1}^{m} \delta(P_i, \mu, \Phi)$$

The maximum likelihood estimation can be used to estimate the parameters (Φ,  μ) under (8). Taking the logarithm of (8), we have:

##### (9)
$$O(\mu, \Phi) = \ln\left(\prod_{i=1}^{m} \delta(P_i, \mu, \Phi)\right) = \sum_{i=1}^{m} \ln\left(\delta(P_i, \mu, \Phi)\right) = \sum_{i=1}^{m}\left[-\frac{n}{2}\ln(2\pi) - \frac{1}{2}\ln|\Phi| - \frac{1}{2}(P_i - \mu)^T \Phi^{-1}(P_i - \mu)\right]$$
$$= -\frac{nm}{2}\ln(2\pi) - \frac{m}{2}\ln|\Phi| - \frac{1}{2}\sum_{i}\left[(P_i - \mu)^T \Phi^{-1}(P_i - \mu)\right]$$

Taking the partial derivative w.r.t. μ of O (μ, Φ) and setting it to 0, we obtain the following:

##### (10)
$$\frac{\partial}{\partial\mu} O(\mu, \Phi) = -\frac{1}{2}\sum_{i=1}^{m}\left[-2\Phi^{-1}(P_i - \mu)\right] = \Phi^{-1}\sum_{i=1}^{m}(P_i - \mu) = \Phi^{-1}\left[\sum_{i=1}^{m} P_i - m\mu\right] = 0$$

This gives $\hat\mu = \frac{1}{m}\sum_i P_i$. Similarly, for $\Phi$, we can obtain $\hat\Phi = \frac{1}{m-1}\sum_i (P_i - \hat\mu)(P_i - \hat\mu)^T$. Thus, if the density of each point in $v_k$ is $\delta(P, \hat\mu, \hat\Phi)$, then our estimate of the parameter $\mu$ is:

##### (11)
$$\hat\mu = \left(\frac{1}{m}\sum_i e_{1i},\; \frac{1}{m}\sum_i e_{2i},\; \cdots,\; \frac{1}{m}\sum_i e_{ni}\right)$$
where $e_{li}$ is the $l$-th coordinate of $P_i$ in $R^n$.

The covariance $\hat\Phi$ is converted to

##### (12)
$$\hat\Phi = \frac{1}{m-1}\sum_i \begin{bmatrix} e_{1i} - \hat\mu_1 \\ e_{2i} - \hat\mu_2 \\ \vdots \\ e_{ni} - \hat\mu_n \end{bmatrix} \left[e_{1i} - \hat\mu_1,\; e_{2i} - \hat\mu_2,\; \cdots,\; e_{ni} - \hat\mu_n\right]$$
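The estimates in Equations (11) and (12) amount to the sample mean and a sum of outer products of centred points; a NumPy sketch (array shapes and names are our own):

```python
import numpy as np

def sgm_estimate(points):
    """Estimate the SGM parameters: sample mean and sample covariance."""
    m = points.shape[0]
    mu_hat = points.mean(axis=0)
    centred = points - mu_hat
    Phi_hat = centred.T @ centred / (m - 1)   # sum of outer products / (m - 1)
    return mu_hat, Phi_hat

# Three 2-D sample points P_i from one information structure v_k:
P = np.array([[1.0, 2.0], [3.0, 2.0], [2.0, 5.0]])
mu_hat, Phi_hat = sgm_estimate(P)
```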

- Gaussian Mixed Model and parameter estimation

For multi-source information fusion, we need to calculate all of the $I_k$'s density functions as well as the new, fused density function. For $m$ multi-source information structures, let $I_{fusion} = \sum_{i=1}^{l} \alpha_i\, \delta(P, \mu_i, \Phi_i)$ for a normalized weight parameter $\alpha$: i.e., $\sum_i \alpha_i = 1$. To calculate and simplify the covariance matrix $\Phi$, let

##### (13)
$$\Phi = \begin{bmatrix} \sigma^2 & 0 & \cdots & 0 \\ 0 & \sigma^2 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & \sigma^2 \end{bmatrix} = \sigma^2 I$$

From the SGM, we have that

##### (14)
$$\delta(P, \mu, \sigma^2 I) = \frac{1}{\sqrt{(2\pi)^n}}\,\sigma^{-1}\, e^{-\frac{(P-\mu)^T(P-\mu)}{2\sigma^2}}$$

Calculate:

$$\frac{\partial}{\partial\mu}\left[\delta(P, \mu, \sigma^2 I)\right] = \frac{1}{\sqrt{(2\pi)^n}}\,\frac{\partial}{\partial\mu}\left(\sigma^{-1} e^{-\frac{(P-\mu)^T(P-\mu)}{2\sigma^2}}\right) = \frac{1}{\sqrt{(2\pi)^n}}\,\sigma^{-1} e^{-\frac{(P-\mu)^T(P-\mu)}{2\sigma^2}}\,\frac{\partial}{\partial\mu}\left(-\frac{(P-\mu)^T(P-\mu)}{2\sigma^2}\right) = \delta(P, \mu, \sigma^2 I)\left(\frac{P-\mu}{\sigma^2}\right)$$

and

$$\frac{\partial}{\partial\sigma}\left[\delta(P, \mu, \sigma^2 I)\right] = \frac{1}{\sqrt{(2\pi)^n}}\left(-\sigma^{-2}\right) e^{-\frac{(P-\mu)^T(P-\mu)}{2\sigma^2}} + \frac{1}{\sqrt{(2\pi)^n}}\,\sigma^{-1} e^{-\frac{(P-\mu)^T(P-\mu)}{2\sigma^2}}\left[\frac{(P-\mu)^T(P-\mu)}{\sigma^3}\right] = \delta(P, \mu, \sigma^2 I)\left(\frac{(P-\mu)^T(P-\mu)}{\sigma^3} - \frac{1}{\sigma}\right)$$

Then, for $\Phi = cI$, $c \in R$, the GMM is defined as $G(P) = \sum_i \alpha_i\, \delta(P, \mu_i, \sigma_i^2)$, $i = 1, 2, \cdots, l$. The number of parameters to estimate is $3l$. If we let $\theta = [\alpha_1, \alpha_2, \cdots, \alpha_l, \mu_1, \mu_2, \cdots, \mu_l, \sigma_1^2, \sigma_2^2, \cdots, \sigma_l^2]$, the objective is:

##### (15)
$$L(\theta) = \ln\left[\prod_i G(P_i)\right] = \sum_i \ln\left(G(P_i)\right) = \sum_i \ln\left(\sum_{j=1}^{l} \alpha_j\, \delta(P_i, \mu_j, \sigma_j^2)\right)$$
which can be differentiated w.r.t. $\mu_j$ and $\sigma_j$. Thus, we have that:
##### (16)
$$\frac{\partial}{\partial\mu_j} L(\theta) = \sum_i \frac{\alpha_j\, \delta(P_i, \mu_j, \sigma_j^2)}{\sum_{k=1}^{l} \alpha_k\, \delta(P_i, \mu_k, \sigma_k^2)} \cdot \frac{P_i - \mu_j}{\sigma_j^2}$$

Let $\varphi_j(P_i) = \frac{\alpha_j\, \delta(P_i, \mu_j, \sigma_j^2)}{\sum_{k=1}^{l} \alpha_k\, \delta(P_i, \mu_k, \sigma_k^2)}$, so that:

##### (17)
$$\frac{\partial}{\partial\mu_j} L(\theta) = \sum_i \varphi_j(P_i)\, \frac{P_i - \mu_j}{\sigma_j^2}$$

Similarly, we can find:

##### (18)
$$\frac{\partial}{\partial\sigma_j} L(\theta) = \sum_i \frac{\alpha_j\, \delta(P_i, \mu_j, \sigma_j^2)}{\sum_{k=1}^{l} \alpha_k\, \delta(P_i, \mu_k, \sigma_k^2)} \left(\frac{(P_i - \mu_j)^T (P_i - \mu_j)}{\sigma_j^3} - \frac{1}{\sigma_j}\right) = \sum_i \varphi_j(P_i) \left(\frac{(P_i - \mu_j)^T (P_i - \mu_j)}{\sigma_j^3} - \frac{1}{\sigma_j}\right)$$

Setting the above two equations equal to 0, we have

##### (19)
$$\hat\mu_j = \frac{\sum_i \varphi_j(P_i)\, P_i}{\sum_i \varphi_j(P_i)}$$
##### (20)
$$\hat\sigma_j^2 = \frac{\sum_i \varphi_j(P_i)\,(P_i - \hat\mu_j)^T (P_i - \hat\mu_j)}{\sum_i \varphi_j(P_i)}$$

For $\alpha_j$, under the constraint $\sum_j \alpha_j = 1$, we use Lagrange multipliers to redefine the objective as:

##### (21)
$$J = L(\theta) + \lambda\left(1 - \sum_{j=1}^{l} \alpha_j\right) = \sum_i \ln\left(\sum_j \alpha_j\, \delta(P_i, \mu_j, \sigma_j^2)\right) + \lambda\left(1 - \sum_{j=1}^{l} \alpha_j\right)$$

Differentiating this new objective w.r.t. $\alpha_j$, we have that:

##### (22)
$$\frac{\partial J}{\partial\alpha_j} = \sum_i \frac{\delta(P_i, \mu_j, \sigma_j^2)}{\sum_{k=1}^{l} \alpha_k\, \delta(P_i, \mu_k, \sigma_k^2)} - \lambda = \frac{1}{\alpha_j}\sum_i \varphi_j(P_i) - \lambda = 0$$
##### (23)
$$[\hat\alpha_1, \hat\alpha_2, \cdots, \hat\alpha_l] = \left[\frac{1}{\lambda}\sum_i \varphi_1(P_i),\; \frac{1}{\lambda}\sum_i \varphi_2(P_i),\; \cdots,\; \frac{1}{\lambda}\sum_i \varphi_l(P_i)\right]$$
##### (24)
$$\hat\alpha_1 + \hat\alpha_2 + \cdots + \hat\alpha_l = \frac{1}{\lambda}\sum_i \left(\varphi_1(P_i) + \varphi_2(P_i) + \cdots + \varphi_l(P_i)\right) = 1$$
Furthermore, since $\sum_j \varphi_j(P_i) = 1$ for each of the $m$ sample points, we know $\lambda = m$, so:
##### (25)
$$[\hat\alpha_1, \hat\alpha_2, \cdots, \hat\alpha_l] = \left[\frac{1}{m}\sum_i \varphi_1(P_i),\; \frac{1}{m}\sum_i \varphi_2(P_i),\; \cdots,\; \frac{1}{m}\sum_i \varphi_l(P_i)\right]$$
where $\varphi$ is also a function of the parameters, and we can resolve this using the following iteration:

Step 1: Let

$$\theta = [\alpha_1, \alpha_2, \cdots, \alpha_l, \mu_1, \mu_2, \cdots, \mu_l, \sigma_1^2, \sigma_2^2, \cdots, \sigma_l^2]$$

be given an initial value; to aid convergence, $\mu_1, \mu_2, \cdots, \mu_l$ may be initialized by a clustering method.

Step 2: Calculate $\varphi_j(P_i)$.

Step 3: Calculate $\tilde\mu_j = \frac{\sum_i \varphi_j(P_i)\, P_i}{\sum_i \varphi_j(P_i)}$.

Step 4: Calculate

$$\tilde\sigma_j^2 = \frac{\sum_i \varphi_j(P_i)\,(P_i - \tilde\mu_j)^T (P_i - \tilde\mu_j)}{\sum_i \varphi_j(P_i)}.$$

Step 5: Calculate $\tilde\alpha_j = \frac{1}{m}\sum_i \varphi_j(P_i)$.

Step 6: Let

$$\hat\theta = [\hat\alpha_1, \hat\alpha_2, \cdots, \hat\alpha_l, \hat\mu_1, \hat\mu_2, \cdots, \hat\mu_l, \hat\sigma_1^2, \hat\sigma_2^2, \cdots, \hat\sigma_l^2].$$

If $\|\theta - \hat\theta\| < \delta$ for a given threshold $\delta$, then stop the process; otherwise, set $\theta = \hat\theta$ and proceed to Step 2.
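Steps 1-6 form an EM-style iteration. A one-dimensional sketch, with illustrative seeding and a fixed iteration count standing in for the threshold test of Step 6:

```python
import math

def gauss(x, mu, var):
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def em_gmm(xs, mus, iters=50):
    l, m = len(mus), len(xs)
    alphas, vars_ = [1.0 / l] * l, [1.0] * l
    for _ in range(iters):
        # Step 2: responsibilities phi_j(P_i)
        phi = [[alphas[j] * gauss(x, mus[j], vars_[j]) for j in range(l)] for x in xs]
        phi = [[p / sum(row) for p in row] for row in phi]
        for j in range(l):
            w = sum(phi[i][j] for i in range(m))
            # Steps 3-5: update mu_j, sigma_j^2, alpha_j
            mus[j] = sum(phi[i][j] * xs[i] for i in range(m)) / w
            vars_[j] = sum(phi[i][j] * (xs[i] - mus[j]) ** 2 for i in range(m)) / w + 1e-9
            alphas[j] = w / m
    return alphas, mus, vars_

# Two well-separated clusters around 0 and 5:
xs = [0.0, 0.1, -0.1, 5.0, 5.1, 4.9]
alphas, mus, vars_ = em_gmm(xs, mus=[0.5, 4.5])
```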

In actuality, the density function of information fusion under this special structure is a product of the SGMs being fused. For all information structures $v_k$ and their SGM densities $\delta(v_k)$, $\delta(I_{fusion}) = \prod_k \delta(v_k)$; therefore, we have that:

##### (26)
$$\prod_k \delta(v_k) = \prod_k \frac{1}{\sqrt{(2\pi)^n}\,\sigma_k}\, e^{-\frac{1}{2\sigma_k^2}(P-\mu_k)^T(P-\mu_k)} = \frac{1}{\sqrt{(2\pi)^{nk}}\,\prod_k \sigma_k}\, e^{\alpha P^2 + \beta P + \gamma} = C\,\delta$$

where $C$ is an undetermined constant.

In particular, in a one-dimensional space with σ = 1, we have that:

##### (27)
$$\delta(I_{fusion}) = \frac{1}{\sqrt{(2\pi)^k}\,\prod_k \sigma_k}\, e^{\alpha x^2 + \beta x + \gamma} = \frac{1}{\sqrt{(2\pi)^k}}\, e^{\alpha x^2 + \beta x + \gamma} = C\,\frac{1}{\sqrt{2\pi}}\, e^{-(x-\mu)^2}$$

This is a linear transformation of the basic Gaussian function. Thus, for any two information structures $v_i, v_j$, the fusion result is $v_{ij} = \langle \alpha P_{ij}, \beta d_{ij}, \gamma \delta_{ij} \rangle$, where $\alpha$, $\beta$, and $\gamma$ are undetermined coefficients.
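The claim behind Equations (26)-(27), that a product of Gaussian densities is again Gaussian-shaped up to a constant C, can be checked numerically; for two unit-variance factors the fused centre is the midpoint of the means (a standard identity; the means below are illustrative):

```python
import math

def gauss(x, mu, sigma=1.0):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def product(x):
    # Product of two unit-variance SGM densities centred at 1.0 and 3.0.
    return gauss(x, 1.0) * gauss(x, 3.0)

# Completing the square in the exponent gives a Gaussian shape whose
# centre is the midpoint (1.0 + 3.0) / 2 = 2.0 of the two means.
xs = [i / 10.0 for i in range(-20, 60)]
peak = max(xs, key=product)   # -> 2.0
```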

## 3. Fuzzy implications of information structure under IF-THEN rules

### 3.1. Fuzzy implications of information structures under IF-THEN rules

In fuzzy sets, the rule "IF $x$ is $\bar{A}$, THEN $y$ is $\bar{B}$" indicates a fuzzy implication between $\bar{A}$ and $\bar{B}$, denoted by $\bar{A} \rightarrow \bar{B}$. If we let $x, y \in [0, 1]$ be the memberships of $\bar{A}$ and $\bar{B}$, respectively, the Mamdani model computes the membership as $F(x, y) = \mathrm{Min}\{x, y\}$, $\forall x, y \in [0, 1]$.

We construct a fuzzy membership based on a new fuzzy implication and inference system. We also derive a similarity relationship and apply it to the Gaussian density function-based fuzzy rule inference system. For $\delta_k$ in a rule-based IF-THEN inference system, suppose that the rule set is:

IF $I_1$ is $v_1$, THEN $I_o$ is $v_o$, with weight $\omega_1$; IF $I_2$ is $v_2$, THEN $I_o$ is $v_o$, with weight $\omega_2$.

We can integrate these rules as:

IF $I_1$ is $v_1$ and $I_2$ is $v_2$, THEN $I_o$ is $v_o$,

##### (28)
$$\omega_{ij} = \delta(I_{fusion}) = \delta(v_{ij})$$

The Mamdani model for $\delta_k$ in two-dimensional space $(x, y)$ will be:

##### (29)
$$\mathrm{Min}\left(\frac{1}{2\pi\sigma_{11}\sigma_{12}}\, e^{-\frac{1}{2}\left[\frac{(x-\mu_{11})^2 + (y-\mu_{12})^2}{\sigma_{11}^2 + \sigma_{12}^2}\right]},\; \frac{1}{2\pi\sigma_{21}\sigma_{22}}\, e^{-\frac{1}{2}\left[\frac{(x-\mu_{21})^2 + (y-\mu_{22})^2}{\sigma_{21}^2 + \sigma_{22}^2}\right]}\right)$$

In particular, if $\sigma_{11} = \sigma_{12} = \sigma_{21} = \sigma_{22} = 1$ and $\mu_{11} = \mu_{12} = \mu_{21} = \mu_{22} = 0$, we have that:

##### (30)
$$M(x, y) = \frac{1}{2\pi}\, e^{-\left[\frac{x^2 + y^2}{4}\right]}$$

Thus, for any other implication operators, the function of rules will have the form:

##### (31)
$$M(x, y) = \frac{1}{2\pi}\, e^{-\frac{1}{2}(Ax^2 + By^2 + Cx + Dy + E)}.$$
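Equations (29)-(30) can be exercised directly: the Mamdani combination takes the pointwise minimum of the two Gaussian-shaped densities, and with all σ = 1, μ = 0 it reduces to M(x, y). A sketch with illustrative rule parameters:

```python
import math

def delta2d(x, y, mu, sig):
    """Two-dimensional Gaussian-shaped density in the form of Equation (29)."""
    (m1, m2), (s1, s2) = mu, sig
    expo = -0.5 * ((x - m1) ** 2 + (y - m2) ** 2) / (s1 ** 2 + s2 ** 2)
    return math.exp(expo) / (2 * math.pi * s1 * s2)

def mamdani(x, y, rule1, rule2):
    """Fired strength at (x, y): the pointwise minimum of the two rule densities."""
    return min(delta2d(x, y, *rule1), delta2d(x, y, *rule2))

unit = ((0.0, 0.0), (1.0, 1.0))      # all sigma = 1, mu = 0
M = mamdani(0.0, 0.0, unit, unit)    # peak value 1 / (2*pi), matching Equation (30)
```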

## 4. Applications

### 4.1. Mamdani model-based fuzzy control inference system using nonlinear conjugate gradient

In the previous section, information was formalized as $v_k = \langle P_k, d_k, \rho_k \rangle$, where $P_k$ is a central point in $R^n$, $d_k \in R$, and $\rho_k$ is a Gaussian density function. $P_k$ and $d_k$ operate as fuzzy numbers using the fuzzy logical operations in Section 2. In our fuzzy rule-based inference system, if different (multi-source) information implies the same conclusion, then this information is integrated. Supposing that the multi-source information structure $v_k$ will conclude with a particular assertion at the $\rho_k$ level, we have that:

##### (32)
$$\mathrm{IF}\ I_1\ \mathrm{is}\ v_1\ \mathrm{and}\ I_2\ \mathrm{is}\ v_2\ \mathrm{and}\ \cdots\ \mathrm{and}\ I_m\ \mathrm{is}\ v_m\ \mathrm{THEN}\ I_o\ \mathrm{is}\ v_o,\ \varpi$$

This can be simplified to

$$\mathrm{IF}\ v_1\ \mathrm{and}\ v_2\ \mathrm{and}\ \cdots\ \mathrm{and}\ v_m\ \mathrm{THEN}\ \varpi(v_1, v_2, \cdots, v_m)$$

and

$$\mathrm{IF}\ P_1\ \mathrm{and}\ P_2\ \mathrm{and}\ \cdots\ \mathrm{and}\ P_m\ \mathrm{THEN}\ P_1 \oplus P_2 \oplus \cdots \oplus P_m$$
$$\mathrm{IF}\ d_1\ \mathrm{and}\ d_2\ \mathrm{and}\ \cdots\ \mathrm{and}\ d_m\ \mathrm{THEN}\ d_1 \oplus d_2 \oplus \cdots \oplus d_m$$
$$\mathrm{IF}\ \rho_1\ \mathrm{and}\ \rho_2\ \mathrm{and}\ \cdots\ \mathrm{and}\ \rho_m\ \mathrm{THEN}\ \varpi(\rho_1, \rho_2, \cdots, \rho_m).$$

Now, we only discuss the density function $\delta_k$ (here $\rho_k$) in our FIS. Letting $\Theta = [\delta_1, \delta_2, \cdots, \delta_n]$, we have that:

##### (33)
$$\mathrm{IF}\ \Theta\ \mathrm{THEN}\ \varpi(\Theta).$$

From the previous section, we know that $\varpi(\Theta)$ is a Gaussian density function, so the rule is re-labeled as IF $X$ THEN $f(X)$. For this rule set, we have

$$R_i:\ \mathrm{IF}\ X\ \mathrm{THEN}\ f_i(X)$$

However, as f (X) is a nonlinear function, it is difficult to find its minimum point under the Mamdani model, so we need to linearize f (X) and use the nonlinear conjugate gradient algorithm to optimize the parameters of f (X).

If we suppose that $f(x) = [Ax - b]^T [Ax - b]$, then the gradient is $\nabla_x f(x) = 2A^T(Ax - b)$, and the objective is to find $x$ subject to $\nabla_x f(x) = 0$. The nonlinear conjugate gradient requires $f$ to be twice differentiable, but as $f$ is a Gaussian function, it is infinitely differentiable. Starting in the steepest-descent direction $\Delta x_0 = -\nabla_x f(x_0)$ with step size $\alpha$, we have that:

##### (34)
$$\alpha_0 = \arg\min_{\alpha} f(x_0 + \alpha \Delta x_0)$$
##### (35)
$$x_1 = x_0 + \alpha_0 \Delta x_0$$

This is the first iteration in the direction of $\Delta x_0$; by setting the initial conjugate direction $s_0 = \Delta x_0$, the following steps calculate $\Delta x_n$:

Step 1: Calculate $\Delta x_n = -\nabla_x f(x_n)$.

Step 2: Calculate $\beta_n = \frac{\Delta x_n^T(\Delta x_n - \Delta x_{n-1})}{\Delta x_{n-1}^T \Delta x_{n-1}}$ (Polak–Ribière).

Step 3: Update the conjugate direction $s_n = \Delta x_n + \beta_n s_{n-1}$.

Step 4: Calculate $\alpha_n = \arg\min_{\alpha} f(x_n + \alpha s_n)$.

Step 5: Update $x_{n+1} = x_n + \alpha_n s_n$.
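Steps 1-5 can be sketched for the quadratic f(x) = [Ax - b]^T [Ax - b] used to normalize the Gaussian objective; since f is quadratic, the argmin of Step 4 has a closed form, which we use in place of a generic line search (the matrix A and vector b below are illustrative):

```python
import numpy as np

def ncg_quadratic(A, b, x, iters=50, tol=1e-12):
    """Minimize f(x) = [Ax - b]^T [Ax - b] via Steps 1-5 with the Polak-Ribiere beta."""
    def grad(v):
        return 2 * A.T @ (A @ v - b)
    dx = -grad(x)                                  # Step 1: steepest-descent direction
    s = dx                                         # initial conjugate direction s_0
    for _ in range(iters):
        As = A @ s
        alpha = (b - A @ x) @ As / (As @ As)       # Step 4: exact argmin for a quadratic
        x = x + alpha * s                          # Step 5
        dx_new = -grad(x)
        if dx_new @ dx_new < tol:                  # gradient small enough: stop
            break
        beta = dx_new @ (dx_new - dx) / (dx @ dx)  # Step 2: Polak-Ribiere
        s = dx_new + beta * s                      # Step 3: update conjugate direction
        dx = dx_new
    return x

A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([5.0, 5.0])
x_star = ncg_quadratic(A, b, np.zeros(2))          # solves Ax = b
```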

The algorithm is based on the quadratic function that we use to normalize the Gaussian function $f(x)$ in order to speed up the iterations. Considering a simplified Mamdani model, from formula (31) we know that:

##### (36)
$$M(x, y) = \frac{1}{2\pi}\, e^{-\frac{1}{2}(Ax^2 + By^2 + Cx + Dy + E)}$$

Using the nonlinear conjugate gradient, we obtain the results given in Fig. 1 and Table 1 by comparison with other special functions. From Table 1, we know that the Gaussian density function is approximated in just a few steps by the nonlinear conjugate gradient algorithm, which is why we selected the Gaussian distribution as the density function of this special structure. Other forms of density function, by comparison, appear to require more steps under the nonlinear conjugate gradient algorithm.

### 4.2. Takagi–Sugeno model in RGS-FIS

Takagi and Sugeno [18] proposed a fuzzy IF-THEN rule system describing the local input–output relations of a nonlinear system, known as the T-S model [21], to scale the population of rules in a multi-dimensional fuzzy inference system. The normal rules of the T-S model under the special information structure proposed for our information fusion method are:

$$R_{T\text{-}S}:\ \mathrm{IF}\ \mathrm{INPUT}\text{-}1\ \mathrm{is}\ I_1,\ \mathrm{INPUT}\text{-}2\ \mathrm{is}\ I_2,\ \cdots,\ \mathrm{INPUT}\text{-}n\ \mathrm{is}\ I_n\ \mathrm{THEN}\ I_f = f(I_1, I_2, \cdots, I_n).$$

The T-S model outputs a linear, non-constant function that will reduce the population of rules.

From rule set $R_{T\text{-}S}$, we can simplify $I_f = \sum_{i=1}^{n} a_i \delta_i + b_i d_i$, in which $a_i$ and $b_i$ are undetermined constants. Let the standard deviation in Equation (6) be $\sigma = 1$ and $\mu = 0$; thus, $I_f = \frac{1}{\sqrt{2\pi}}\sum_{i=1}^{n} a_i e^{-\frac{x_i^2}{2}} + b_i x_i$. The first part of $I_f$ is a GMM that can be estimated as in Section 2.2, and the second part of $I_f$ is a linear function (see Fig. 2).
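The simplified consequent I_f can be evaluated directly; a sketch with σ = 1, μ = 0, where the coefficient lists a and b are illustrative stand-ins for the undetermined constants:

```python
import math

def ts_output(xs, a, b):
    """Evaluate I_f = (1/sqrt(2*pi)) * sum_i (a_i * exp(-x_i^2 / 2) + b_i * x_i)."""
    s = sum(a[i] * math.exp(-xs[i] ** 2 / 2) + b[i] * xs[i] for i in range(len(xs)))
    return s / math.sqrt(2 * math.pi)
```

The Gaussian terms form the GMM-like part of the consequent; the b_i x_i terms form the linear part.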

Furthermore, for the nonlinear conjugate gradient proposed in Section 4.1, we required 100 steps and 301 gradient evaluations to find the minimum point (the Mamdani model). As a result, we can simplify this in RGS-FIS under the T-S model to output three linear membership functions. Suppose that the inputs are Gaussian-shaped rules and the outputs are linear functions. Let the membership functions of INPUT 1 and INPUT 2 be Gaussian functions, with the OUTPUT composed of three linear functions [33]. We then have the RGS-FIS system under the T-S model (see Fig. 3).

## 5. Concluding remarks and future works

This paper proposed a novel information structure applicable to a Gaussian-shaped FIS. We developed the RGS-FIS approach using the nonlinear conjugate gradient algorithm and a T-S model. However, there are two problems with RGS-FIS: one is that the new fusion operator's parameters depend on a complex estimation process, and the other is that all data variables are assumed to be independent (r = 0). The model selection for similarity computing under rule-based fuzzy implication operations should also be improved.

Future work will focus on the pre-processing of datasets as well as the estimation of model parameters. Pre-processing will tune the parameters of the model to display a simpler mathematical presentation and assure a robust inference process. Furthermore, the fusion operator needs to be improved so that it does not solely depend on fuzzy implications. Although similarity computing is the key factor for calculating the possibility of IF-THEN rules, it is not clear whether a feasible algorithm can be developed for this. Hence, the possibility of the IF-THEN rules also needs to be calculated and improved.

## Acknowledgments

The authors would like to thank the editors, the anonymous reviewers, and Dr. Thayer El-Dajjani for their most constructive comments and suggestions to improve the quality of this paper. This work is supported by the Zhejiang Provincial Natural Science Fund under Grant No. LY13H180012.

## References

1. Taleb-Ahmed A, Gautier L (2002) On information fusion to improve segmentation of MRI sequences. Information Fusion 3:103–117.
2. Luo AC, Chen SW, Fang CY (2015) Gaussian successive fuzzy integral for sequential multi-decision making. International Journal of Fuzzy Systems 17(2):321–336.
3. Kutsenko DA, Sinyuk VG (2015) Inference methods for systems with many fuzzy inputs. Journal of Computer and Systems Sciences International 54(3):375–383.
4. Zervas E, Mpimpoudis A, Anagnostopoulos C, Sekkas O, Hadjiefthymiades S (2011) Multisensor data fusion for fire detection. Information Fusion 12:150–159.
5. Herrera F, Martínez L (2000) A 2-tuple fuzzy linguistic representation model for computing with words. IEEE Trans Fuzzy Systems 8(6):746–752.
6. Shi F, Xu J (2012) Emotional cellular-based multi-class fuzzy support vector machines on product's KANSEI extraction. Appl Math Inf Sci 6(1):41–49.
7. Shi F, Sun S, Xu J (2012) Employing rough sets and association rule mining in KANSEI knowledge extraction. Inform Sci 196:118–128.
8. Pavlin G, de Oude P, Maris M, Nunnink J, Hood T (2010) A multi-agent systems approach to distributed Bayesian information fusion. Information Fusion 11:267–282.
9. Couso I, Sánchez L (2011) Upper and lower probabilities induced by a fuzzy random variable. Fuzzy Sets and Systems 165:1–23.
10. Aisbett J, Rickard JT, Morgenthaler D (2011) Multivariate modeling and type-2 fuzzy sets. Fuzzy Sets and Systems 163:78–95.
11. Lawry J, Tang Y (2012) On truth-gaps, bipolar belief and the assertability of vague propositions. Artificial Intelligence 191–192:20–41.
12. Lawry J, Tang Y (2009) Uncertainty modelling for vague concepts: A prototype theory approach. Artificial Intelligence 173:1539–1558.
13. Lawry J (1994) Inexact reasoning, pp. 66–80. PhD Thesis, University of Manchester, UK.
14. Lawry J (2005) Modelling and Reasoning with Vague Concepts. Studies in Computational Intelligence, pp. 3–20. Springer-Verlag, Berlin.
15. Kuncheva L (2003) 'Fuzzy' vs 'Non-fuzzy' in combining classifiers designed by boosting. IEEE Trans on Fuzzy Systems 11(6):729–741.
16. Zadeh LA (1975) The concept of a linguistic variable and its application to approximate reasoning. Inform Sci 8:199–249.
17. Zadeh LA (1975) The concept of a linguistic variable and its application to approximate reasoning-I. Inform Sci 8:249–299.
18. Zadeh LA (1975) The concept of a linguistic variable and its application to approximate reasoning-II. Inform Sci 8:301–357.
19. Zadeh LA (2005) Toward a generalized theory of uncertainty (GTU)-an outline. Inform Sci 172:1–40.
20. Sugeno M (1974) Theory of fuzzy integrals and its applications. PhD Thesis, Tokyo Institute of Technology.
21. Sozhamadevi N, Sathiyamoorthy S (2015) A probabilistic fuzzy inference system for modeling and control of nonlinear process. Arabian Journal for Science and Engineering 40(6):1777–1791.
22. McCauley Bush P, Wang H (1997) Fuzzy linear regression models for assessing risks of cumulative trauma disorders. Fuzzy Sets and Systems 92:317–340.
23. Yager RR, Liu L (eds) (2008) Classic Works of the Dempster-Shafer Theory of Belief Functions. Springer-Verlag, Heidelberg, Germany.
24. Yager RR (2012) Conditional approach to possibility-probability fusion. IEEE Trans Fuzzy Systems 20(1):46–56.
25. Yager RR (2012) Entailment principle for measure-based uncertainty. IEEE Trans Fuzzy Systems 20(3):526–535.
26. Yager RR (2011) Set measure directed multi-source information fusion. IEEE Trans Fuzzy Systems 19(6):1031–1039.
27. Destercke S, Dubois D, Chojnacki E (2009) Possibilistic information fusion using maximal coherent subsets. IEEE Trans Fuzzy Systems 17(1):79–92.
28. Nefti-Meziani S, Oussalah M, Soufian M (2015) On the use of inclusion structure in fuzzy clustering algorithm in case of Gaussian membership functions. Journal of Intelligent and Fuzzy Systems 28(4):1477–1493.
29. Denoeux T (2011) Maximum likelihood estimation from fuzzy data using the EM algorithm. Fuzzy Sets and Systems 183:72–91.
30. Pham TD (2011) Fuzzy posterior-probabilistic fusion. Pattern Recognition 44:1023–1030.
31. Ahram TZ, McCauley-Bush P, Karwowski W (2010) Estimating intrinsic dimensionality using the multi-criteria decision weighted model and the average standard estimator. Inform Sci 180:2845–2855.
32. Dou W, Ruan S, Chen Y, Bloyet D, Constans J-M (2007) A framework of fuzzy information fusion for the segmentation of brain tumor tissues on MR images. Image and Vision Computing 25:164–171.
33. Tung WL, Quek C (2009) A Mamdani-Takagi-Sugeno based linguistic neural-fuzzy inference system for improved interpretability-accuracy representation. In: Proc IEEE-FUZZY, Korea, pp. 367–372.
34. Tang Y, Lawry J (2009) Linguistic modelling and information coarsening based on prototype theory and label semantics. Int J Approx Reasoning 50:1177–1198.
35. Tang Y (2010) A prototype based rule inference system incorporating linear functions. Fuzzy Sets and Systems 161:2831–2853.
36. Zhang YQ, Ji HB (2014) Gaussian mixture reduction based on fuzzy ART for extended target tracking. Signal Processing 97:232–241.

## Figures and Tables

##### Fig. 1

Mamdani model-based fuzzy inference system using nonlinear conjugate gradient for (1) and (2) as compared with a non-Mamdani model (3)-(4).

##### Fig. 2

T-S model’s two parts under IF-THEN fusion operators and steps to minimum point using the nonlinear conjugate gradient.

##### Fig. 3

RGS-FIS with two inputs and three linear output functions.