Stability Analysis Based on Caputo-Type Fractional-Order Quantum Neural Networks

In this paper, a quantum neural network with a multilayer activation function is proposed, constructed by superposing multiple Sigmoid functions and adjusting the quantum intervals with a learning algorithm. On this basis, the quasiuniform stability of fractional-order quantum neural networks with mixed delays is studied. For two different ranges of the order, sufficient conditions for the quasiuniform stability of the networks are derived using linear matrix inequality techniques, and their sufficiency is proved. Finally, the feasibility of the conclusions is verified by experiments.


Introduction
Fractional calculus extends integer-order calculus to arbitrary orders. It has strong advantages and broad application prospects in physics, chemistry, biology, economics, control, signal processing, and image processing; it has attracted extensive attention from scholars worldwide and has become a current research hotspot. In recent years, with the continuous development of fractional differential equations, many researchers have turned their attention to fractional-order theory, and combining fractional calculus with neural networks gives full play to the advantages of fractional order. For example, the literature [1][2][3][4] combined fractional order with neural networks to good effect. Among them, Boroomand and Menhaj [4] presented a fractional-order Hopfield neural network model and studied its stability through a quasienergy function. References [5][6][7] studied different fractional-order neural networks and explored the influence of various factors on them. The synchronization problem of neural networks is summarized in [8][9][10][11][12]. Dominik et al. [13] considered discrete fractional-order artificial neural networks. Chaos and chaotic synchronization of fractional-order neural networks were studied in [14]. The literature [15,16] explained and analyzed the dynamics of fractional-order neural networks. Fractional-order neural networks have been applied in different fields [17][18][19][20][21]. In recent years, the stability of fractional-order neural network systems has become a research hotspot [22][23][24][25][26][27][28][29][30][31]. In reference [22], the stability and passivity of a memristor-based fractional-order competitive neural network (MBFOCNN) are analyzed using Caputo's fractional derivative, and the effectiveness of the proposed results is verified using analysis techniques and other computational tools.
In reference [23], the robust dissipativity problem of a Hopfield-type complex-valued neural network (HTCVNN) model with time-varying delays and linear fractional uncertainty is studied, and numerical models are designed to verify the results. In references [24,25], the global asymptotic stability of fractional-order quaternion-valued bidirectional associative memory neural networks (FQVBAMNNs) and fractional-order quaternion-valued memristive neural networks (FOQVMNNs) is studied, and the effectiveness of the results is proved by related methods. In references [26,27], the stability of fractional-order continuous-time quaternion-valued leaky integrator echo state neural networks with multiple time-varying delays is studied, and the feasibility of the method is verified by numerical examples. In reference [28], the uniform stability of a fractional-order leaky integrator echo state neural network (FOESN) with multiple delays is studied; the simulation results show the effectiveness of the method. The literature [32,33] studied delay-related problems of Caputo fractional-order neural networks. However, there are few studies on the behavior of fractional-order quantum neural networks with mixed delays. In this paper, a quantum neural network model with a multilayer activation function is presented, and the quasiuniform stability of fractional-order quantum neural networks with mixed delays is studied. The result is proved analytically and illustrated by a simulated numerical example.
This article is organized as follows. In the second section, we present the structure of the multilayer activation function of the quantum neural network and, on this basis, propose a fractional-order quantum neural network model with mixed delays. In the third section, using the corresponding definitions and lemmas, we prove that the fractional-order quantum neural network system with mixed time delays is quasiuniformly stable. In the fourth section, a concrete example is given to verify the validity and applicability of the given results.

Model Composition and Preparation
2.1. Quantum Neural Network. The quantum neural network is a feed-forward neural network [34,35]. Compared with a traditional feed-forward network, the neurons in the hidden layer of a quantum neural network borrow the idea of quantum state superposition from quantum theory and linearly superpose several Sigmoid functions; the result is called a multilayer activation function. A traditional activation function can represent only two states and orders of magnitude, whereas a quantized hidden-layer neuron can represent more states and orders of magnitude.
Each superimposed Sigmoid function has its own quantum interval. By adjusting the quantum intervals, data of different classes can be mapped to different orders of magnitude or steps, giving the classification more degrees of freedom. The quantum intervals of a quantum neural network are obtained by training. With an appropriate learning algorithm, a quantum neural network can capture and quantify the uncertainty in the sampled data.

Figure 1 shows a traditional three-layer feedforward neural network. Assume that the input layer I has n nodes, the output layer O has k nodes, and the hidden layer H has m nodes. Nodes in adjacent layers are fully interconnected, and nodes within the same layer are not connected. The output function of a node in the hidden layer is

$$H_j = f\bigl(W_j^T X - \theta_j\bigr),$$

and the output function of a node in the output layer is

$$O_l = f\bigl(V_l^T H - h_l\bigr).$$

Here f is the Sigmoid function; W is the connection weight vector between the input-layer neurons and the hidden-layer neurons; V is the connection weight vector between the hidden-layer neurons and the output-layer neurons; θ is the threshold of the hidden layer; and h is the threshold of the output-layer unit.
A quantum neuron with a multilayer excitation function computes

$$O = \frac{1}{n}\sum_{s=1}^{n} f\bigl(U\,(W^T X - \theta_s)\bigr),$$

where f(x) = 1/(1 + exp(−x)); W is the network weight vector; X is the network input vector; U is the slope; W^T X is the input excitation of the quantum neuron; and θ_s (s = 1, 2, ⋯, n) are the quantum intervals.
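As an illustration, the multilayer activation above can be sketched in a few lines of Python. The weight vector, input, slope, and quantum intervals below are hypothetical values chosen only for the example; the sketch superposes one Sigmoid per quantum interval and averages the levels.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def quantum_neuron_output(w, x, thetas, slope):
    """Superpose one sigmoid per quantum interval theta_s, all applied
    to the same input excitation w^T x, and average the levels."""
    excitation = np.dot(w, x)
    levels = [sigmoid(slope * (excitation - theta)) for theta in thetas]
    return sum(levels) / len(levels)

# Hypothetical example: three quantum intervals give a staircase-like
# response that can represent more "states" than a single sigmoid.
w = np.array([0.8, -0.2])
thetas = [-1.0, 0.0, 1.0]
out = quantum_neuron_output(w, np.array([2.0, 1.0]), thetas, slope=5.0)
```

With more intervals the response curve acquires more plateaus, which is the extra classification freedom the text describes.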
The learning of a quantum neural network proceeds in two steps: (1) adjusting the weights so that the input data are mapped to different class spaces, and (2) adjusting the quantum intervals of the hidden-layer quantum neurons to reflect the uncertainty of the data. The BP algorithm is used to adjust the weights. Once the network weights are obtained, the quantum intervals can be adjusted by an appropriate algorithm [36]. The idea of the algorithm is to minimize the variation of the hidden-layer neuron outputs over sample data of the same class.
Figure 1: Three-layer feedforward neural network structure (input layer, hidden layer, output layer).

Journal of Function Spaces
Assume that for class C_m the output variation of the ith hidden-layer neuron is

$$e_{i,m}^2 = \sum_{x_k \in C_m} \Bigl(\frac{1}{|C_m|}\sum_{x_r \in C_m} O_{i,r} - O_{i,k}\Bigr)^2,$$

where O_{i,k} denotes the output of the ith hidden-layer neuron when the network input vector is x_k, and |C_m| denotes the cardinality of class C_m. It can be seen that e_{i,m}^2 is a function of the quantum intervals θ_s. Differentiating both sides of Equation (4) with respect to θ_s (s = 1, ⋯, n) and minimizing e_{i,m}^2 yields the update formula for θ_{i,s} (the sth quantum level of the ith hidden-layer neuron).
In formula (6), Z is the learning rate; k_0 is the number of output-layer nodes, i.e., the total number of classes; k is the number of quantum interval layers; and x_k ∈ C_m ranges over all samples belonging to class C_m.
Here, O_{i,k,s} = sig(U(W^T x_k − θ_s)) denotes the output of the sth quantum level of the ith hidden-layer neuron when the input vector is x_k (s = 1, 2, ⋯, n).
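A minimal sketch of the quantum-interval learning step: instead of the closed-form update (6), the code below minimizes the within-class output variation e²_{i,m} directly by numerical gradient descent with a backtracking step size. The class data, weights, slope, and learning rate are hypothetical values for illustration only.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def neuron_outputs(w, X, thetas, slope):
    """Output of one quantum neuron for each sample (rows of X)."""
    exc = X @ w
    return np.mean([sigmoid(slope * (exc - t)) for t in thetas], axis=0)

def class_variation(w, X_class, thetas, slope):
    """e^2_{i,m}: squared deviation of the outputs from their class mean."""
    o = neuron_outputs(w, X_class, thetas, slope)
    return np.sum((o - o.mean()) ** 2)

def adjust_intervals(w, X_class, thetas, slope, lr=0.1, steps=100, eps=1e-5):
    """Numerical-gradient stand-in for update formula (6); a step is only
    accepted if it reduces e^2_{i,m}, so the variation never increases."""
    thetas = np.array(thetas, dtype=float)
    for _ in range(steps):
        grad = np.zeros_like(thetas)
        for s in range(len(thetas)):
            tp, tm = thetas.copy(), thetas.copy()
            tp[s] += eps
            tm[s] -= eps
            grad[s] = (class_variation(w, X_class, tp, slope)
                       - class_variation(w, X_class, tm, slope)) / (2 * eps)
        candidate = thetas - lr * grad
        if class_variation(w, X_class, candidate, slope) < class_variation(w, X_class, thetas, slope):
            thetas = candidate
        else:
            lr *= 0.5  # backtrack on overshoot
    return thetas

rng = np.random.default_rng(0)
w = np.array([1.0, 0.5])
X_class = rng.normal(loc=1.0, scale=0.3, size=(20, 2))  # samples of one class C_m
theta0 = [-1.0, 0.0, 1.0]
theta1 = adjust_intervals(w, X_class, theta0, slope=3.0)
```

After adjustment, samples of the same class produce more uniform hidden-layer outputs, which is exactly the objective the text assigns to the quantum-interval learning step.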

Caputo-Type Fractional Derivative Definition.
Definition. Let f(s) be a continuous function on R. For any β > 0, with n − 1 < β < n ∈ Z^+, the β-order Caputo-type derivative of f(s) is defined as

$$ {}_{0}^{C}D_t^{\beta} f(t) = \frac{1}{\Gamma(n-\beta)} \int_0^t \frac{f^{(n)}(s)}{(t-s)^{\beta - n + 1}}\, ds. $$

The following assumptions are used throughout:

(1) For a vector x = (x_i) and a matrix A = (a_{ij}), we define the vector norm as ‖x‖ = Σ_i |x_i| and the matrix norm as ‖A‖ = max_{1≤i≤n} Σ_j |a_{ij}|.

(2) The excitation functions F(x), G(x), and H(x) of the fractional-order quantum neural network with mixed delays all satisfy the Lipschitz condition; that is, for any u and v there exist positive constants L_F, L_G, and L_H such that ‖F(u) − F(v)‖ ≤ L_F‖u − v‖, ‖G(u) − G(v)‖ ≤ L_G‖u − v‖, and ‖H(u) − H(v)‖ ≤ L_H‖u − v‖.

The fractional-order quantum neural network model with mixed time delays is

$$ {}^{C}D^{\beta} x_i(t) = -c_i x_i(t) + \sum_{j=1}^{n} a_{ij} f_j(x_j(t)) + \sum_{j=1}^{n} b_{ij} g_j(x_j(t - \tau_j)) + \sum_{j=1}^{n} m_{ij} \int_{t-\sigma_i}^{t} h_j(x_j(s))\, ds + I_i(t), $$

which is converted to vector form:

$$ {}^{C}D^{\beta} x(t) = -C x(t) + A F(x(t)) + B G(x(t - \tau)) + M \int_{t-\sigma}^{t} H(x(s))\, ds + I(t). $$

Here 0 < β < 1, i = 1, 2, ⋯, n, n is the number of neurons in the fractional-order quantum neural network with mixed delays, and x(t) = (x_1(t), x_2(t), ⋯, x_n(t))^T ∈ R^n is the state vector of the neurons at time t. F(x(t)) = (f_1(x_1(t)), ⋯, f_n(x_n(t)))^T, G(x(t)) = (g_1(x_1(t)), ⋯, g_n(x_n(t)))^T, and H(x(t)) = (h_1(x_1(t)), h_2(x_2(t)), ⋯, h_n(x_n(t)))^T are the activation functions of the fractional-order quantum neural network; C = diag(c_i > 0), A = (a_{ij}), B = (b_{ij}), and M = (m_{ij}) are constant matrices; c_i > 0 represents the rate at which the ith neuron returns to its isolated resting state when it is disconnected and free of external voltage differences; a_{ij}, b_{ij}, and m_{ij} represent the connection weights between the jth neuron and the ith neuron; τ_j and σ_i represent the transmission delays along the axon; and I(t) = (I_1(t), I_2(t), ⋯, I_n(t))^T represents the external inputs and biases of the neurons.
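The Caputo derivative defined above can be evaluated numerically. The following sketch uses the standard L1 discretization for 0 < β < 1 (a common textbook scheme, not one taken from this paper) and checks it against the known closed form for f(t) = t, namely D^β t = t^{1−β}/Γ(2−β).

```python
import math

def caputo_l1(f, t, beta, n=2000):
    """L1 approximation of the Caputo derivative of order 0 < beta < 1
    at time t, using n uniform steps on [0, t]."""
    h = t / n
    coef = h ** (-beta) / math.gamma(2.0 - beta)
    total = 0.0
    for k in range(n):
        # weight b_k times the difference f(t_{n-k}) - f(t_{n-k-1})
        b_k = (k + 1) ** (1.0 - beta) - k ** (1.0 - beta)
        total += b_k * (f((n - k) * h) - f((n - k - 1) * h))
    return coef * total

# Check against the closed form D^beta [t] = t^(1 - beta) / Gamma(2 - beta).
beta, t = 0.6, 1.5
approx = caputo_l1(lambda s: s, t, beta)
exact = t ** (1.0 - beta) / math.gamma(2.0 - beta)
```

For the linear function f(t) = t the L1 scheme is exact up to floating-point error, which makes it a convenient correctness check before applying the scheme to the network model.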

Set the initial conditions of the system by assuming ψ_i(s) ∈ C([−γ, 0], R), i ∈ N^+, with the norm on C defined as ‖ψ‖ = sup_{−γ ≤ s ≤ 0} ‖ψ(s)‖.

Main Result
Lemma 1. If x(s) ∈ C^n[0, +∞) and n − 1 < β < n ∈ Z^+, then

$$ I^{\beta}\,{}^{C}D^{\beta} x(t) = x(t) - \sum_{k=0}^{n-1} \frac{x^{(k)}(0)}{k!}\, t^{k}. $$

Lemma 2 (Hölder inequality). Suppose the real numbers p, q > 1 satisfy (1/p) + (1/q) = 1. If |f(·)|^p and |h(·)|^q are measurable on the space, then fh is also measurable and satisfies

$$ \int |f(s) h(s)|\, ds \le \Bigl(\int |f(s)|^{p}\, ds\Bigr)^{1/p} \Bigl(\int |h(s)|^{q}\, ds\Bigr)^{1/q}. $$

In particular, when p = q = 2, this is the familiar Cauchy–Schwarz inequality.

Lemma 3. Let k ∈ N and let x_1, x_2, ⋯, x_k be nonnegative real numbers. Then for any η > 1,

$$ \Bigl(\sum_{i=1}^{k} x_i\Bigr)^{\eta} \le k^{\eta - 1} \sum_{i=1}^{k} x_i^{\eta}. $$

Lemma 4 (Gronwall inequality). If x(t), f(t), g(t) ≥ 0 are continuous functions on [0, T), T < ∞, satisfying

$$ x(t) \le f(t) + \int_0^t g(s) x(s)\, ds, $$

then

$$ x(t) \le f(t) + \int_0^t f(s) g(s) \exp\Bigl(\int_s^t g(u)\, du\Bigr) ds. $$

In the special case that f(t) is nondecreasing, this gives

$$ x(t) \le f(t) \exp\Bigl(\int_0^t g(s)\, ds\Bigr). $$

Definition 5. Let the initial time of the fractional-order quantum neural network system (11) with mixed delays be t_0. If for any ξ > 0 there exist two constants δ and T with 0 < δ < ξ and T > 0 such that for any t ∈ J = [t_0, t_0 + T], ‖e(t_0)‖ < δ implies ‖e(t)‖ < ξ, then system (11) is called quasiuniformly stable.
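As a quick sanity check, the two inequalities invoked in the proofs below (Lemma 2 with p = q = 2, and Lemma 3 with k = 5 and η = 2) can be verified numerically on random data; the vectors here are arbitrary stand-ins for the functions in the integral forms.

```python
import math
import random

random.seed(1)

# Lemma 2 with p = q = 2 (Cauchy–Schwarz), checked on random vectors
# as a discrete stand-in for the integral form.
f = [random.uniform(-1, 1) for _ in range(50)]
h = [random.uniform(-1, 1) for _ in range(50)]
lhs = abs(sum(a * b for a, b in zip(f, h)))
rhs = math.sqrt(sum(a * a for a in f)) * math.sqrt(sum(b * b for b in h))

# Lemma 3: (x_1 + ... + x_k)^eta <= k^(eta-1) * (x_1^eta + ... + x_k^eta),
# here with k = 5 and eta = 2 as used in the proof of Theorem 6.
xs = [random.uniform(0, 3) for _ in range(5)]
k, eta = len(xs), 2
lemma3_lhs = sum(xs) ** eta
lemma3_rhs = k ** (eta - 1) * sum(x ** eta for x in xs)
```

Both inequalities hold for every sample, of course; the point of the check is only to make the constants k^{η−1} and the p = q = 2 case concrete.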
Proof. Set the initial time of the error system (12) to t_0 = 0 with initial condition e_0 = φ(0). The expression for the solution of the error system follows from Lemma 1. From Assumptions 1 and 2 and the basic properties of the norm, a bound on the solution is obtained. Applying the Cauchy–Schwarz inequality of Lemma 2, substituting into Equation (24), and then taking k = 5 and η = 2 in Lemma 3 sharpens the bound. Finally, applying the Gronwall inequality with W(t) = L + Nt^{2β}(1 − e^{−2t}) yields the required estimate. ☐

It can be seen that when ‖φ‖ < δ, it follows from Theorem 6 that ‖e(t)‖ < ξ. By Definition 5, the fractional-order quantum neural system (11) with mixed time delays is quasiuniformly stable.

Theorem 7. Suppose the order of the fractional-order quantum neural network system (11) with mixed delays satisfies β ∈ (0, 0.5), Assumptions 1 and 2 hold, and the corresponding norm condition is satisfied; then system (11) is quasiuniformly stable.

Taking k = 5 and η = q in Lemma 3 and proceeding as in the proof of Theorem 6, we obtain the corresponding estimate. It can be seen that when ‖φ‖ < δ, it follows from Theorem 6 that ‖e(t)‖ < ξ. By Definition 5, the fractional-order quantum neural system (11) with mixed time delays is quasiuniformly stable.

Illustration
In this section, we give a specific example to verify the validity and applicability of the given results.
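One possible way to simulate a small instance of model (11) is a Grünwald–Letnikov discretization with a history buffer for the discrete and distributed delays. All parameter matrices, delays, inputs, and the order β below are hypothetical values chosen only for illustration, and the scheme is a standard discretization rather than the paper's own method; the sketch merely checks that the trajectory remains finite and bounded on the simulated interval.

```python
import numpy as np

def simulate(beta=0.9, h=0.01, T=5.0, tau=0.1, sigma=0.2):
    """Grünwald–Letnikov scheme for a hypothetical 2-neuron instance of
    D^beta x = -C x + A F(x) + B G(x(t - tau))
               + M * integral_{t - sigma}^{t} H(x(s)) ds + I."""
    C = np.diag([1.0, 1.2])
    A = np.array([[0.2, -0.1], [0.1, 0.3]])
    B = np.array([[-0.15, 0.05], [0.1, -0.2]])
    M = np.array([[0.05, 0.0], [0.0, 0.05]])
    I = np.array([0.1, -0.1])
    F = G = H = np.tanh  # Lipschitz activations, as Assumption 2 requires

    n = int(T / h)
    d_tau, d_sig = int(tau / h), int(sigma / h)
    x = np.zeros((n + 1, 2))
    x[0] = np.array([0.3, -0.2])  # constant history psi(s) = x(0) on [-gamma, 0]

    # GL coefficients c_j = (-1)^j * binom(beta, j), computed recursively
    c = np.ones(n + 1)
    for j in range(1, n + 1):
        c[j] = (1.0 - (1.0 + beta) / j) * c[j - 1]

    for k in range(1, n + 1):
        xd = x[max(k - 1 - d_tau, 0)]          # discretely delayed state
        lo = max(k - 1 - d_sig, 0)
        integral = h * H(x[lo:k]).sum(axis=0)  # distributed-delay term
        rhs = -C @ x[k - 1] + A @ F(x[k - 1]) + B @ G(xd) + M @ integral + I
        memory = sum(c[j] * x[k - j] for j in range(1, k + 1))
        x[k] = h ** beta * rhs - memory
    return x

x = simulate()
```

The memory sum is what distinguishes the fractional dynamics from an ordinary Euler step: every past state contributes through the binomial weights c_j.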

Conclusions
This paper quantizes the neural network by linearly superposing multilayer activation functions and adjusting the quantum intervals with a learning algorithm, and proposes a quantum neural network model with multilayer activation functions. On this basis, the quasiuniform stability of fractional-order quantum neural networks with mixed time delays is studied. For β in different ranges, sufficient conditions for the quasiuniform stability of the fractional-order quantum neural network system with mixed time delays are discussed, and the theoretical results are proved using the corresponding theorems. Finally, numerical simulation verifies the feasibility of the conclusions obtained in this paper.