Particle Swarm Optimization Iterative Identification Algorithm and Gradient Iterative Identification Algorithm for Wiener Systems with Colored Noise

Abstract. This paper considers the parameter identification of Wiener systems with colored noise. The difficulty in the identification is that the model is nonlinear and the intermediate variable cannot be measured. Particle swarm optimization is an artificial intelligence evolutionary method and is effective in solving nonlinear optimization problems. In this paper, we obtain the identification model of the Wiener system and then transform the parameter identification problem into an optimization problem. We then derive a particle swarm optimization iterative (PSOI) identification algorithm to identify the unknown parameters of the Wiener system. Furthermore, a gradient iterative identification algorithm is proposed for comparison with the particle swarm optimization iterative algorithm. Numerical simulations are carried out to evaluate the performance of the PSOI algorithm and the gradient iterative algorithm. The simulation results indicate that the proposed algorithms are effective and that the PSOI algorithm achieves better performance than the gradient iterative algorithm.


Introduction
Almost all practical systems are nonlinear [1-3]. Many identification methods have been developed for linear systems [4, 5], bilinear systems [6-8], and nonlinear systems [9]. The Wiener models are a typical class of nonlinear systems and are widely used in industrial production processes [10, 11]. A Wiener nonlinear system consists of a dynamic linear subsystem followed by a static nonlinear subsystem, and the interconnection of the two subsystems gives the model a complex structure [12, 13]. One of the difficulties in identifying the parameters of a Wiener nonlinear model is that the intermediate variable (the output of the linear subsystem) cannot be measured, and the identification of Wiener systems has therefore attracted great attention [14].
The iterative identification method is generally used to identify systems with unknown items in the model information vector [15-17]. The basic idea of iterative identification is to estimate the unknown items in the information vector by using the parameter estimates from the previous iteration [18, 19]. Iterative identification is an important branch of system identification and can be realized by means of gradient search, the least squares principle, or Newton optimization [20-22].
The particle swarm optimization algorithm is an evolutionary computing technique based on the simulation of bird flocking [23, 24]. Its basic idea is to find the optimal solution through collaboration and information sharing among the individuals in a group [25]. The algorithm has attracted the attention of academia owing to its easy implementation, high precision, and fast convergence [26]. Compared with conventional optimization methods, it has excellent optimization performance [27]. The particle swarm optimization algorithm has been widely used in function optimization, system identification, and fuzzy control [28-30]. Recently, Chen and Wang proposed a stochastic gradient algorithm and a particle swarm optimization algorithm to estimate all the unknown parameters of the Hammerstein system [31]. In this paper, we use the particle swarm optimization algorithm and the gradient iterative algorithm to identify the unknown parameters of Wiener systems with colored noise.
The rest of the paper is organized as follows. Section 2 gives the system description for the Wiener model. Section 3 gives the particle swarm optimization algorithm for Wiener nonlinear systems. Section 4 derives a gradient iterative algorithm for the discussed system. Section 5 provides an example for illustrating the results in this paper. Finally, some conclusions are given in Section 6.

System Description
Consider the Wiener system shown in Figure 1 with the following expressions:

$$y(t) = f(x(t)) + w(t),$$

where $A(z)$, $B(z)$, and $D(z)$ are polynomials in the unit backward shift operator $z^{-1}$ [$z^{-1}y(t) = y(t-1)$] with

$$A(z) := 1 + a_1 z^{-1} + a_2 z^{-2} + \cdots + a_{n_a} z^{-n_a},$$
$$B(z) := b_1 z^{-1} + b_2 z^{-2} + \cdots + b_{n_b} z^{-n_b},$$
$$D(z) := 1 + d_1 z^{-1} + d_2 z^{-2} + \cdots + d_{n_d} z^{-n_d}.$$

Assume that the degrees $n_a$, $n_b$, and $n_d$ are known and that $y(t) = 0$, $u(t) = 0$, and $v(t) = 0$ for $t \le 0$. Define the linear subsystem output $x(t)$ as

$$x(t) := \frac{B(z)}{A(z)}\,u(t), \quad (3)$$

and the noise model output $w(t)$ as

$$w(t) := D(z)\,v(t). \quad (4)$$

The static nonlinear block is a nonlinear function

$$f(x(t)) = \gamma_1 f_1(x(t)) + \gamma_2 f_2(x(t)) + \cdots + \gamma_{n_\gamma} f_{n_\gamma}(x(t)),$$

where the basis $g := (f_1, f_2, \ldots, f_{n_\gamma})$ consists of known nonlinear functions of $x(t)$, the unknown parameters $\gamma_i$ are the coefficients of the nonlinear functions, and the degree $n_\gamma$ is assumed known. Without loss of generality, take $f_1(x(t)) = x(t)$, let the first coefficient $\gamma_1$ be unity, and rewrite $f(x(t))$ as

$$f(x(t)) = x(t) + \gamma_2 f_2(x(t)) + \cdots + \gamma_{n_\gamma} f_{n_\gamma}(x(t)).$$

In the above equations, $u(t)$ and $y(t)$ are the system input and output, respectively, and $v(t)$ is a Gaussian distributed white noise with zero mean and variance $\sigma^2$. From (3), we have

$$x(t) = [1 - A(z)]x(t) + B(z)u(t) = \varphi_1^T(t)\theta_1, \quad (7)$$

where

$$\varphi_1(t) := [-x(t-1), \ldots, -x(t-n_a), u(t-1), \ldots, u(t-n_b)]^T, \quad \theta_1 := [a_1, \ldots, a_{n_a}, b_1, \ldots, b_{n_b}]^T.$$

From (4), we can obtain

$$w(t) = [D(z) - 1]v(t) + v(t) = \varphi_2^T(t)\theta_2 + v(t),$$

where

$$\varphi_2(t) := [v(t-1), v(t-2), \ldots, v(t-n_d)]^T, \quad \theta_2 := [d_1, d_2, \ldots, d_{n_d}]^T.$$

Thus, the Wiener nonlinear system model can be written as

$$y(t) = \varphi^T(t)\theta + v(t), \quad (11)$$

where

$$\varphi(t) := [\varphi_1^T(t), f_2(x(t)), \ldots, f_{n_\gamma}(x(t)), \varphi_2^T(t)]^T, \quad \theta := [\theta_1^T, \gamma_2, \ldots, \gamma_{n_\gamma}, \theta_2^T]^T.$$
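As a concrete illustration, the block structure above can be simulated directly. The following sketch generates $x(t)$ and $y(t)$ under the zero-initial-condition convention; the coefficient values and the nonlinearity used in the usage example are arbitrary assumptions, not the paper's example system:

```python
import numpy as np

def simulate_wiener(u, v, a, b, d, f):
    """Simulate A(z)x(t) = B(z)u(t), w(t) = D(z)v(t), y(t) = f(x(t)) + w(t),
    with x(t) = u(t) = v(t) = 0 for t <= 0 (zero initial conditions)."""
    n = len(u)
    x, y = np.zeros(n), np.zeros(n)
    for t in range(n):
        # linear subsystem: x(t) = -a1*x(t-1) - ... + b1*u(t-1) + ...
        x[t] = (-sum(a[i] * x[t - 1 - i] for i in range(len(a)) if t - 1 - i >= 0)
                + sum(b[i] * u[t - 1 - i] for i in range(len(b)) if t - 1 - i >= 0))
        # colored noise: w(t) = v(t) + d1*v(t-1) + ...
        w = v[t] + sum(d[i] * v[t - 1 - i] for i in range(len(d)) if t - 1 - i >= 0)
        # static nonlinearity plus noise
        y[t] = f(x[t]) + w
    return x, y
```

For example, `f = lambda s: s + 0.5 * s**2` realizes a nonlinear block with $\gamma_1 = 1$ and one extra basis function $f_2(x) = x^2$.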

The Particle Swarm Optimization Algorithm
With the development of optimization theory, some new intelligent algorithms have been proposed to overcome the limitations of traditional system identification, such as the genetic algorithm [32], the ant colony algorithm [33], and the particle swarm algorithm [34, 35]; these algorithms enrich system identification technology. The particle swarm optimization algorithm is a nature-inspired evolutionary algorithm and has been successful in solving a wide range of real-valued optimization problems [36]. In the following, the particle swarm optimization algorithm is used to identify the unknown parameters of Wiener nonlinear systems. Suppose that the search space is n-dimensional and that the particle swarm consists of M particles.
Define the information vector $\varphi(t)$ as in (11) and let $p$ represent the data length. Define the stacked output vector $Y(p)$ and the stacked information matrix $\Phi(p)$ as

$$Y(p) := [y(1), y(2), \ldots, y(p)]^T, \quad \Phi(p) := [\varphi(1), \varphi(2), \ldots, \varphi(p)]^T.$$

Define the independent position $\theta_i$ and the independent velocity $Q_i$ of each particle, $i = 1, 2, \ldots, M$. Let $\hat\theta(k)$ denote the estimate of $\theta$ at iteration $k$ ($k = 1, 2, 3, \ldots$) and let $\theta_{ih}(k)$ denote the best position visited by particle $i$ up to iteration $k$. The estimate $\hat\Phi_k(p)$ is obtained by replacing the unmeasurable $x(t)$ and $v(t)$ in $\Phi(p)$ with their iterative estimates $\hat x_k(t)$ and $\hat v_k(t)$. According to the basic principle of the particle swarm algorithm, the best position $\theta_{ih}(k)$ of each particle minimizes the cost function

$$J(\theta) := \|Y(p) - \hat\Phi_k(p)\theta\|^2$$

over the positions visited by particle $i$. Let $\theta_g(k)$ denote the global best position of all the particles, i.e., the position attaining the smallest cost $J(\theta_{ih}(k))$ over $i = 1, 2, \ldots, M$. According to (7), the estimate $\hat x_k(t)$ can be computed from the current parameter estimates. According to the principle of particle swarm optimization, each particle moves to a new position with a new velocity at iteration $k + 1$:

$$Q_i(k+1) = \beta Q_i(k) + \xi_1\zeta_1[\theta_{ih}(k) - \theta_i(k)] + \xi_2\zeta_2[\theta_g(k) - \theta_i(k)],$$
$$\theta_i(k+1) = \theta_i(k) + Q_i(k+1).$$

Replacing $\varphi(t)$ and $\theta$ with $\hat\varphi_k(t)$ and $\theta_g(k)$ in (11), we can obtain the estimate $\hat v_k(t) = y(t) - \hat\varphi_k^T(t)\theta_g(k)$. Combining these recursions gives the particle swarm optimization iterative (PSOI) identification algorithm. The steps of the PSOI algorithm are listed as follows:

(1) Let $k = 0$ and set the initial values $\theta_i(0)$, $Q_i(0)$, $\theta_{ih}(0)$, and $\theta_g(0)$, $i = 1, 2, \ldots, M$. Set the factors $\beta$, $\xi_1$, $\xi_2$ and give a small positive number $\varepsilon$. Set $\hat x_0(t) = 1/p_0$, $p_0 = 10^6$, and $\hat v_0(t) = 0$.

(2) Collect the input and output data $u(t)$ and $y(t)$ and form $Y(p)$.

(3) Compute $\hat x_k(t)$ and form $\hat\varphi_k(t)$ and $\hat\Phi_k(p)$.

(4) Update the velocity $Q_i(k+1)$ and the position $\theta_i(k+1)$ of each particle by the recursions above.

(5) Evaluate the cost of each new position and update the best position $\theta_{ih}(k+1)$ of each particle.
(6) Determine the best position of all the particles $\theta_g(k+1)$ by (34).
(7) Compute the noise estimate $\hat v_{k+1}(t) = y(t) - \hat\varphi_k^T(t)\theta_g(k+1)$ for the next iteration.

(8) Compare $\theta_g(k+1)$ with $\theta_g(k)$: if $\|\theta_g(k+1) - \theta_g(k)\| \le \varepsilon$, then terminate the procedure and obtain the estimate $\theta_g(k+1)$; otherwise, increase $k$ by 1 and go to Step 2.
The flowchart of PSOI algorithm is shown in Figure 2.
Remark 1. The major factors that influence the performance of the particle swarm optimization are $\xi_1$, $\xi_2$, and $\beta$. $\xi_1$ and $\xi_2$ are positive constants between 0 and 2: $\xi_1$ is the step size that draws a particle toward its own best position, and $\xi_2$ is the step size that draws a particle toward the global best position. $\beta$ is called the inertia factor and is an important tuning parameter of the PSOI algorithm: a larger $\beta$ facilitates global exploration, while a smaller one facilitates local search; it is generally chosen as a constant between 0.1 and 0.9. $\zeta_1$ and $\zeta_2$ are two independent random numbers uniformly distributed in the range [0, 1].
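To make the update rules concrete, here is a minimal, self-contained sketch of the particle swarm search using the notation of the text ($\beta$, $\xi_1$, $\xi_2$, $\zeta_1$, $\zeta_2$). The swarm size, iteration count, search bounds, and factor values are illustrative assumptions, and the generic `cost` argument stands in for the criterion $J(\theta)$:

```python
import numpy as np

def pso(cost, dim, M=30, iters=200, beta=0.6, xi1=1.5, xi2=1.5,
        bounds=(-2.0, 2.0), seed=0):
    """Minimal particle swarm sketch following the velocity/position
    recursions in the text; returns the global best position theta_g."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    theta = rng.uniform(lo, hi, (M, dim))      # particle positions theta_i
    Q = np.zeros((M, dim))                     # particle velocities Q_i
    theta_ih = theta.copy()                    # personal best positions
    J_ih = np.array([cost(p) for p in theta])  # personal best costs
    g = theta_ih[np.argmin(J_ih)].copy()       # global best theta_g
    for _ in range(iters):
        # zeta1, zeta2 ~ U[0, 1], drawn independently per particle
        z1, z2 = rng.random((M, 1)), rng.random((M, 1))
        Q = beta * Q + xi1 * z1 * (theta_ih - theta) + xi2 * z2 * (g - theta)
        theta = theta + Q
        J = np.array([cost(p) for p in theta])
        improved = J < J_ih                    # update personal bests
        theta_ih[improved], J_ih[improved] = theta[improved], J[improved]
        g = theta_ih[np.argmin(J_ih)].copy()   # update global best
    return g
```

In the identification setting, `cost` would evaluate $\|Y(p) - \hat\Phi_k(p)\theta\|^2$ for a candidate parameter vector $\theta$.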

Gradient Iterative Algorithm
The gradient search is a basic and classical search method [37, 38]. It is widely used in the parameter identification of nonlinear systems [39-41]. In the following, a gradient iterative identification algorithm for the Wiener nonlinear model is derived based on the gradient search principle.

Consider the latest $p$ groups of data from $i = t - p + 1$ to $i = t$ and define the stacked output vector $Y(t)$, the stacked information matrix $\Phi(t)$, and the stacked noise vector $V(t)$ as follows:

$$Y(t) := [y(t), y(t-1), \ldots, y(t-p+1)]^T,$$
$$\Phi(t) := [\varphi(t), \varphi(t-1), \ldots, \varphi(t-p+1)]^T,$$
$$V(t) := [v(t), v(t-1), \ldots, v(t-p+1)]^T.$$

From (11), we have $Y(t) = \Phi(t)\theta + V(t)$. Define the criterion function

$$J(\theta) := \|Y(t) - \Phi(t)\theta\|^2. \quad (39)$$

Let $k = 1, 2, 3, \ldots$ be an iterative variable and $\hat\theta_k(t)$ be the $k$th iterative estimate of the parameter vector $\theta$ at time $t$. For the optimization problem (39), the gradient iterative algorithm is obtained by using the negative gradient search:

$$\hat\theta_k(t) = \hat\theta_{k-1}(t) - \mu_k(t)\,\mathrm{grad}\big[J(\hat\theta_{k-1}(t))\big] = \hat\theta_{k-1}(t) + \mu_k(t)\Phi^T(t)\big[Y(t) - \Phi(t)\hat\theta_{k-1}(t)\big], \quad (40)$$

where $\mu_k(t)$ is the iterative step size (e.g., $\mu_k(t) \le 2/\lambda_{\max}[\Phi^T(t)\Phi(t)]$ to guarantee convergence). However, the gradient iterative estimate $\hat\theta_k(t)$ in (40) is impossible to compute, because the stacked information matrix $\Phi(t)$ contains the unknown intermediate variables $x(t)$ and $v(t)$. The solution is to replace the unknown variables $x(t)$ and $v(t)$ with $\hat x_{k-1}(t)$ and $\hat v_{k-1}(t)$, respectively. Let $\hat\varphi_{1,k}(t)$, $\hat\varphi_{2,k}(t)$, and $\hat\varphi_k(t)$ denote the estimates of $\varphi_1(t)$, $\varphi_2(t)$, and $\varphi(t)$ at iteration $k$, respectively, and let $\hat\theta_{1,k}(t)$ denote the estimate of $\theta_1$ at iteration $k$. Thus, $\hat x_k(t)$ can be calculated by

$$\hat x_k(t) = \hat\varphi_{1,k}^T(t)\hat\theta_{1,k}(t),$$

and the estimate $\hat v_k(t)$ can be obtained by

$$\hat v_k(t) = y(t) - \hat\varphi_k^T(t)\hat\theta_k(t).$$

Replacing $\Phi(t)$ with $\hat\Phi_k(t)$ in (40) and rewriting $\hat\theta_k(t)$ give the gradient iterative (GI) estimation algorithm for Wiener models. The steps of the GI algorithm are listed as follows:

(1) Let $k = 1$, $\hat\theta_0(t) = \mathbf{1}/p_0$, $\hat x_0(t) = 1/p_0$, and $p_0 = 10^6$, and give a small positive number $\varepsilon$.
(2) Collect the input and output data $u(t)$ and $y(t)$ and form $Y(t)$ by (49) and $\hat\varphi_{1,k}(t)$ by (51).
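Under the simplifying assumption that the information matrix is fixed (in the actual GI algorithm it is refreshed at every iteration with $\hat x_{k-1}(t)$ and $\hat v_{k-1}(t)$), the negative gradient search (40) can be sketched as follows; the iteration count is an arbitrary choice:

```python
import numpy as np

def gradient_iterative(Phi, Y, iters=100):
    """Negative-gradient iteration for J(theta) = ||Y - Phi @ theta||^2.
    The step size mu = 1/lambda_max(Phi^T Phi) is a conservative choice
    within the convergence bound mu <= 2/lambda_max."""
    mu = 1.0 / np.linalg.eigvalsh(Phi.T @ Phi).max()
    theta = np.full(Phi.shape[1], 1e-6)   # theta_0 = 1/p0 with p0 = 10^6
    for _ in range(iters):
        # theta_k = theta_{k-1} + mu * Phi^T (Y - Phi theta_{k-1})
        theta = theta + mu * Phi.T @ (Y - Phi @ theta)
    return theta
```

With the unknowns in $\Phi(t)$ replaced by their iteration-$(k-1)$ estimates, one such update would be performed per iteration of the full algorithm.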

Examples
Consider the following Wiener nonlinear system:

In the simulation, the input u(t) is taken as an uncorrelated stochastic signal sequence with zero mean and unit variance, and v(t) as a Gaussian white noise sequence with zero mean and variance σ² = 0.10². Applying the GI algorithm and the PSOI algorithm to estimate the parameters of this system, the parameter estimates and their errors are shown in Tables 1 and 2 and Figures 3 and 4. From the simulation results in Tables 1 and 2 and Figures 3 and 4, we can draw the following conclusions:

(i) As k increases, the parameter estimation errors given by the GI algorithm and the PSOI algorithm gradually become smaller (see Tables 1 and 2).
(ii) The PSOI algorithm has a faster convergence rate than the GI algorithm (see Figures 3 and 4).
(iii) The PSOI algorithm has a higher estimation accuracy than the GI algorithm, which can be seen from Tables 1 and 2.
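Comparisons of this kind are typically made in terms of the relative parameter estimation error; a small helper for computing it (the percentage form is an assumption about the reporting convention, not taken from the paper's tables):

```python
import numpy as np

def estimation_error(theta_hat, theta_true):
    """Relative parameter estimation error delta = ||theta_hat - theta|| / ||theta||,
    expressed as a percentage."""
    return 100.0 * np.linalg.norm(theta_hat - theta_true) / np.linalg.norm(theta_true)
```

Plotting this error against the iteration number k for both algorithms reproduces the kind of comparison shown in Figures 3 and 4.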

Conclusions
In this paper, we derived the particle swarm optimization iterative algorithm and the gradient iterative algorithm for Wiener nonlinear systems with colored noise. Compared with the gradient iterative algorithm, the particle swarm optimization iterative algorithm achieves higher estimation accuracy and a faster convergence rate. The approaches proposed in this paper can be combined with other mathematical tools [42-47] to study the performance of other parameter estimation algorithms and can be applied to multivariable systems with different structures and disturbance noises [48-52].

Figure 3: The GI estimation errors versus k.
Compare $\hat\theta_k(t)$ with $\hat\theta_{k-1}(t)$: if $\|\hat\theta_k(t) - \hat\theta_{k-1}(t)\| \le \varepsilon$, then terminate the procedure and obtain $\hat\theta_k(t)$; otherwise, increase $k$ by 1 and go to Step 2.