Dynamic Analysis and FPGA Implementation of New Chaotic Neural Network and Optimization of Traveling Salesman Problem



Introduction
Undoubtedly, the human brain is the most complex and remarkable information-processing organ. It was formed through long-term natural evolution and contains approximately 100 billion neurons. These neurons transmit information to each other to perform cognitive functions and to control human behavior and thought. The brain is part of the central nervous system (CNS), is composed of a large number of neuronal cells, and is connected by about 10^15 synapses, thus forming a complex neural network that transmits information in an orderly and hierarchical manner. McCulloch and Pitts abstracted human brain neurons and built a simple model to form a neural network, namely, the artificial neural network [1]. Artificial neural networks can be divided into three categories: shallow perceptrons, simple artificial neural networks, and deep neural networks [2][3][4][5][6][7][8][9][10][11][12][13]. The neuron is the basic processor of the neural network. Each neuron has an output, which generally relates to its state and may affect several other neurons. Each neuron receives input through connections called synapses. The input is the activation of the input neuron multiplied by the neuron's synaptic weight. The neuron's activation is calculated by applying a threshold function to the weighted sum, and the threshold function is modeled by a nonlinear function. When designing a neural network, the most important requirement is to ensure that the dynamic system converges to the corresponding state. On the other hand, the richer the dynamics, the wider the range of applications. For example, when a neural network model is used as an approximate method to solve combinatorial optimization problems, transient chaos provides higher search performance for globally optimal or near-optimal solutions.
By considering the spatiotemporal summation of the external inputs and the feedback inputs of chaotic neurons, a chaotic neural network can be constructed from chaotic neurons. The study in [14] proposed a new four-dimensional chaotic memory cell neural network, studied its dynamic behaviors, and designed chaotic synchronization based on sliding mode control; the proposed chaotic memory CNN system can be used for secure communication. The study in [15] investigated the construction of a blind restoration model for super-resolution images based on a chaotic neural network. In that paper, a simplified chaotic neural network model is first constructed; the gray value of the image is used as the input of the network, and the generated Toeplitz matrix is used to calculate the connection weights and bias inputs of the chaotic neural network. Hence, the problem that the traditional neural-network-based blind restoration model for super-resolution images falls into local minima is solved. The study in [16] considered the circuit implementation and application of chaotic neural networks with reconfigurable memory. The chaotic neural network has been widely applied in associative memory because of its rich chaotic characteristics. In that paper, not only was the circuit implementation performed, but autoassociative memory, heteroassociative memory, superimposed pattern separation, many-to-many associative memory, and their application to three-view drawing were also realized through simulation experiments. The study in [17] examined the local synchronization control of chaotic neural networks with saturated actuators and sampled data. The author of [18] studied the global power-rate synchronization of chaotic neural networks with proportional delays based on impulsive control. The study in [19] analyzed observer-based sliding mode synchronization control of time-delayed chaotic neural networks. The study in [20] proposed a chaotic neural network for encryption.
The study in [21] considered the dynamic behaviors of chaotic circuits in neural networks. The study in [22] examined the chaotic multistability problem of neural networks based on memristors.
Inspired by previous work, we simulate and study a chaotic neural network consisting of a linear matrix, a sine function, and three chaotic neurons, one of which is driven by the sine function. In this paper, we first propose a new chaotic neural network model, then perform a nonlinear dynamic analysis of it, including the bifurcation behavior, the Lyapunov exponent spectrum, the Poincaré surface of section, and the basins of attraction, and finally present an FPGA implementation of the chaotic neural network. Few studies of this type of system exist in the chaotic neural network literature; therefore, research on the dynamics of this type of system is both important and meaningful.

Chaotic Neural Network Model
In this paper, based on the Hopfield neural network model, we extend the external and internal membrane conductance of the neurons to a linear layer of the neural network model. A new chaotic neural network model is proposed as shown in the following equation, where x_i is the voltage on the capacitor C_i, S_ij is the conductance of the membrane resistance of the outside and inside of the neurons, I_i is the nonlinear external input current, and the matrix W = (W_ij) holds the synaptic weights of the connection strengths between neurons. The activation function of neuron V_j is defined accordingly, and when C_i = 1 and n = 3, the new chaotic neural network model is given by the following equation, where A = 20; equation (3) can be rewritten as equation (4). The connection of the neural network with three neurons is shown in Figure 1.
The chaotic neural network proposed in this paper can be regarded as a nonlinear associative memory or content-addressable memory, whose function is to retrieve a stored pattern in response to an incomplete or noisy version of that pattern. The essence of content-addressable memory is to map a fundamental memory x_i to a stable fixed point of a dynamic system, where the stable fixed points of the network phase space are the fundamental memories of the network. We can describe a specific pattern as a starting point in the phase space. If the starting point is close to the fixed point that represents the memory to be retrieved, the system should converge to the memory itself over time. Therefore, the chaotic neural network is a dynamic system whose phase space contains a set of stable fixed points that represent the fundamental memories of the system.
In this paper, we use the fourth-order Runge-Kutta method to solve system (4), set the initial value to (0.1, 0.1, 0.1), and obtain the phase diagram of system (4). The phase diagram shows that the system can produce a four-scroll chaotic attractor. For details, see Figures 2(a)-2(c).
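The integration scheme above can be sketched in a few lines. Since the paper's system (4) is not reproduced in this text, the weight matrix `W`, the linear layer `S`, and the form of the sine drive below are illustrative assumptions, not the paper's actual parameters:

```python
import numpy as np

# Hypothetical 3-neuron chaotic-neural-network right-hand side.
# W, S, A, and the sine term are placeholders, not the paper's system (4).
W = np.array([[0.0, 1.2, -1.0],
              [-1.1, 0.0, 1.0],
              [1.0, -1.2, 0.0]])
S = -0.5 * np.eye(3)          # membrane-conductance (linear) layer
A = 20.0                      # amplitude parameter of the sine drive

def f(x):
    v = np.tanh(x)                       # activation V_j = tanh(x_j)
    dx = S @ x + W @ v
    dx[0] += 0.1 * np.sin(A * x[0])      # sine drive on neuron 1 (assumed form)
    return dx

def rk4_step(x, h):
    # One classical fourth-order Runge-Kutta step.
    k1 = f(x)
    k2 = f(x + 0.5 * h * k1)
    k3 = f(x + 0.5 * h * k2)
    k4 = f(x + h * k3)
    return x + (h / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

def trajectory(x0, h=0.01, n=5000):
    xs = np.empty((n, 3))
    x = np.array(x0, dtype=float)
    for i in range(n):
        x = rk4_step(x, h)
        xs[i] = x
    return xs

# Initial value (0.1, 0.1, 0.1), as in the paper.
traj = trajectory([0.1, 0.1, 0.1])
```

Plotting `traj[:, 0]` against `traj[:, 2]` (e.g. with matplotlib) would give a phase portrait analogous to Figure 2.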
However, we find that the proposed system (4) has infinitely many equilibrium points, and the Lyapunov exponents of system (4) are LE1 = 0.560261, LE2 = -0.001804, and LE3 = -4.056202, so system (4) is a multiscroll hidden-attractor system. The Lyapunov exponents are shown in Figure 3.
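A common way to obtain a spectrum such as (LE1, LE2, LE3) is the Benettin/QR method: evolve an orthonormal frame under the linearized flow and average the logarithmic stretching factors after each re-orthonormalization. The sketch below applies the method to an assumed 3-neuron vector field (the paper's system (4) is not reproduced here), with a finite-difference Jacobian:

```python
import numpy as np

# Illustrative stand-in vector field, not the paper's system (4).
def f(x):
    v = np.tanh(x)
    return -0.5 * x + np.array([1.2 * v[1] - v[2] + 0.1 * np.sin(20 * x[0]),
                                -1.1 * v[0] + v[2],
                                v[0] - 1.2 * v[1]])

def jac(x, eps=1e-6):
    # Central-difference Jacobian of f at x.
    J = np.empty((3, 3))
    for j in range(3):
        d = np.zeros(3); d[j] = eps
        J[:, j] = (f(x + d) - f(x - d)) / (2 * eps)
    return J

def lyapunov_spectrum(x0, h=0.01, n=8000):
    # Benettin/QR method: propagate an orthonormal frame Q with the
    # linearized flow and accumulate log |diag(R)| at each QR step.
    x = np.array(x0, dtype=float)
    Q = np.eye(3)
    sums = np.zeros(3)
    for _ in range(n):
        x = x + h * f(x)                 # Euler step (sketch accuracy only)
        Q = Q + h * (jac(x) @ Q)         # tangent-space step
        Q, R = np.linalg.qr(Q)
        sums += np.log(np.abs(np.diag(R)))
    return sums / (n * h)

les = lyapunov_spectrum([0.1, 0.1, 0.1])
```

A positive largest entry of `les` would indicate chaos; for the paper's reported values the spectrum (+, ≈0, -) is the signature of a chaotic flow.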

Analysis of Bifurcation, Lyapunov Exponent, and Poincaré Section
The control parameter A of system (4) varies from 0 to 22, the initial value of system (4) is (0.1, 0.1, 0.1), and the step size of A is 0.04. The bifurcation diagram of system (4) with respect to A is shown in Figure 4(a). System (4) enters chaos through a sequence of period-doubling bifurcations.
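The parameter sweep described above can be sketched as follows; the vector field, the placement of the control parameter A, and the coarse sweep resolution are illustrative assumptions, with the bifurcation data taken as the local maxima of x1 after discarding the transient:

```python
import numpy as np

# Illustrative 3-neuron flow with control parameter A; not the paper's system (4).
def make_f(A):
    def f(x):
        v = np.tanh(x)
        return -0.5 * x + np.array([1.2 * v[1] - v[2] + 0.1 * np.sin(A * x[0]),
                                    -1.1 * v[0] + v[2],
                                    v[0] - 1.2 * v[1]])
    return f

def rk4(f, x, h):
    k1 = f(x); k2 = f(x + 0.5 * h * k1)
    k3 = f(x + 0.5 * h * k2); k4 = f(x + h * k3)
    return x + h / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

def bifurcation(A_values, h=0.02, n_trans=2000, n_keep=3000):
    points = []                              # (A, local maximum of x1) pairs
    for A in A_values:
        f = make_f(A)
        x = np.array([0.1, 0.1, 0.1])
        for _ in range(n_trans):             # discard the transient
            x = rk4(f, x, h)
        prev2, prev1 = x[0], x[0]
        for _ in range(n_keep):
            x = rk4(f, x, h)
            if prev1 > prev2 and prev1 > x[0]:   # local maximum of x1
                points.append((A, prev1))
            prev2, prev1 = prev1, x[0]
    return points

pts = bifurcation(np.arange(0.0, 22.0, 2.0))   # coarse sweep for illustration
```

Scattering the `(A, x1_max)` pairs reproduces the familiar bifurcation-diagram picture: a single branch for periodic motion, branch splitting at period doublings, and a smear of points in chaotic windows.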
We can observe dark lines in the bifurcation diagram. It is generally believed that all solid lines disappear after the bifurcation point; however, the corresponding solutions can still be obtained from the algebraic equations, so there is no reason for them to terminate at the bifurcation point. Why, then, can we not see them?
This is because they become unstable periodic orbits after the bifurcation. Why, then, can an unstable periodic orbit not be seen in the bifurcation diagram? Essentially, this is a question of how to track unstable periodic orbits. Because this is a hidden-attractor system, every point on an unstable periodic orbit is unstable: even a small error will drive the trajectory farther and farther from the equilibrium point. The hidden-attractor system itself is sensitive to the initial state. In the bifurcation diagram, we can also observe hidden bifurcation phenomena.
The positive Lyapunov exponent is the essential source of the local instability of a chaotic attractor. One of the most basic characteristics of chaos is its high sensitivity to initial conditions: two orbits produced by two very close but different initial values separate exponentially over time. The root cause of this phenomenon is the positive Lyapunov exponent of the chaotic system. Therefore, the Lyapunov exponent essentially describes the local instability of a chaotic motion. However, if only this local instability were present, the entire attractor would diverge; as a matter of fact, the chaotic attractor only exists in a certain range of the phase space. Therefore, we believe that, in addition to local instability, there should be multistability factors in the hidden chaotic attractor. The hidden chaotic attractor is the result of the interaction of two trends, namely, local instability and multistability, which together form the fractal structure of the whole chaotic attractor. This fully reflects the fact that the hidden chaotic attractor is a dialectical unity of local instability and multistability. In the next section of this paper, we will use the Lyapunov exponent to describe the basins of attraction of system (4) [23,24]. The analysis of Figure 4(b) shows how the positive Lyapunov exponent changes, which suggests that system (4) alternates between the quasi-periodic state and the chaotic state and fully depicts the changes in the local instability and multistability of system (4). This is basically the same as the state change shown in the bifurcation diagram. The continuous trajectory in the phase space appears as a set of discrete mapping points on the Poincaré section. The transition process of the initial stage is ignored, and only the steady-state image of the Poincaré section is considered.
When there is only one fixed point or a few discrete points on the Poincaré section, the motion is periodic. When the Poincaré section presents a closed curve, the motion is quasi-periodic. When the Poincaré section shows dense points with a hierarchical structure, the motion is chaotic.
The Poincaré map of the x-z plane of system (4) is shown in Figure 5. The Poincaré diagrams of the system on different planes show many dense points, which indicates that the system has chaotic bifurcation characteristics and stretching-and-folding behavior.
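Numerically, a Poincaré section like the one in Figure 5 is obtained by recording the trajectory's crossings of a fixed plane. The sketch below samples upward crossings of x2 = 0 with linear interpolation between integration steps, again using an assumed vector field rather than the paper's system (4):

```python
import numpy as np

# Illustrative stand-in vector field, not the paper's system (4).
def f(x):
    v = np.tanh(x)
    return -0.5 * x + np.array([1.2 * v[1] - v[2] + 0.1 * np.sin(20 * x[0]),
                                -1.1 * v[0] + v[2],
                                v[0] - 1.2 * v[1]])

def rk4(x, h):
    k1 = f(x); k2 = f(x + 0.5 * h * k1)
    k3 = f(x + 0.5 * h * k2); k4 = f(x + h * k3)
    return x + h / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

def poincare_xz(x0, h=0.01, n=50000, n_trans=5000):
    # Collect (x1, x3) at upward crossings of the plane x2 = 0.
    x = np.array(x0, dtype=float)
    for _ in range(n_trans):                 # skip the transient
        x = rk4(x, h)
    pts = []
    prev = x
    for _ in range(n):
        x = rk4(prev, h)
        if prev[1] < 0.0 <= x[1]:            # upward crossing of x2 = 0
            s = -prev[1] / (x[1] - prev[1])  # linear interpolation fraction
            pts.append(((1 - s) * prev[0] + s * x[0],   # x on the section
                        (1 - s) * prev[2] + s * x[2]))  # z on the section
        prev = x
    return pts

section = poincare_xz([0.1, 0.1, 0.1])
```

Periodic motion yields a few isolated points, quasi-periodic motion a closed curve, and chaos a cloud of dense points with layered structure, matching the classification given above.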

Analysis of Basins of Attraction
For chaotic neural networks, we can analyze the stability of the system by considering the Lyapunov exponential function (energy function) of the system. Starting from an initial state, the network moves in the direction in which the Lyapunov exponential function decreases until it reaches a local minimum. The local minimum points of the Lyapunov exponential function represent the stable points of the phase space; in this sense, these points are also called attractors. Each attractor is surrounded by a substantial basin of attraction, and these basins of attraction represent stable network states. When a state enters the lowest region of a basin of attraction, the solution of the network can be obtained. The size of a basin of attraction is described by the radius of attraction, which can be defined as the maximum distance between all states contained in the basin of attraction, that is, the maximum distance at which an attractor can attract a state. The number of attractors represents the memory capacity or storage capacity of the associative memory network, while the storage capacity is the maximum number of noninterfering memories in the network within a certain tolerance of the associative error probability. The storage capacity is related to the allowable error of the associative memory, the network structure, the learning method, and the network design parameters. In short, the more attractors the network has, the greater the storage capacity. The basin of attraction of an attractor acts as an index of the fault tolerance of the network: the larger the basin of attraction, the better the fault tolerance of the network and the stronger the association ability of the network. In a dynamic system with multiple attractors, the corresponding basins may have fractal boundaries and an even more complex structure. Therefore, the coexisting attractors of system (4) may have such a complicated basin boundary structure.
The red area represents the basin of attraction of the attractor at infinity, that is, the set of points whose trajectories diverge. The yellow area represents the basins of attraction of the chaotic attractor, which shows the coexistence of multiple attractors and reflects the fault tolerance of the network: the larger the yellow area, the better the fault tolerance of the network. The blue area is the transition area. The section of the basins of attraction is a series of symmetrical filaments, which are unevenly distributed but have a self-similar appearance. From Figure 6, it can be found that system (4) has four coexisting attractors. The criteria for the existence of basins of attraction in a dynamic system [25] are as follows: (1) there is a smooth invariant subspace containing chaotic attractors; (2) there is another asymptotic final state outside the invariant subspace (not necessarily a chaotic state); (3) the transverse Lyapunov exponent of the invariant subspace is negative; (4) the transverse stability of the unstable periodic orbits of the attractor is related to positive finite-time fluctuations. The study in [25] has proved that the coupled Lorenz system satisfies conditions 1 and 2. There are two invariant (three-dimensional) manifolds in the six-dimensional phase space; as the trajectory starting from each subspace will always remain there, it will evolve toward the respective well-known Lorenz attractor. For the synchronous attractor of the coupled Lorenz system, the literature has proposed using the sieve-like (riddled) basin to describe it. In this paper, we use the finite-time transverse Lyapunov exponent for comparison with the transverse exponent of a specific orbit and obtain results similar to those in [25].
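A basin-of-attraction plot such as Figure 6 is typically produced by classifying a grid of initial conditions. The minimal sketch below colors each point of an x1-x2 grid (x3 fixed) according to whether the trajectory escapes toward infinity or remains bounded, using an assumed vector field and coarse thresholds:

```python
import numpy as np

# Illustrative stand-in vector field, not the paper's system (4).
def f(x):
    v = np.tanh(x)
    return -0.5 * x + np.array([1.2 * v[1] - v[2] + 0.1 * np.sin(20 * x[0]),
                                -1.1 * v[0] + v[2],
                                v[0] - 1.2 * v[1]])

def classify(x0, h=0.05, n=300, escape=50.0):
    # 0 = diverged (basin of infinity), 1 = stayed bounded.
    x = np.array(x0, dtype=float)
    for _ in range(n):
        x = x + h * f(x)              # Euler is enough for a coarse scan
        if np.linalg.norm(x) > escape:
            return 0
    return 1

def basin_grid(lo=-4.0, hi=4.0, m=24, x3=0.1):
    # Scan an m-by-m grid of initial conditions on the x1-x2 plane.
    grid = np.empty((m, m), dtype=int)
    for i, a in enumerate(np.linspace(lo, hi, m)):
        for j, b in enumerate(np.linspace(lo, hi, m)):
            grid[i, j] = classify([a, b, x3])
    return grid

basin = basin_grid()
```

A full-resolution plot would additionally separate the bounded class by which coexisting attractor is reached (e.g. by comparing final states or finite-time Lyapunov exponents), giving the multi-colored basin picture described above.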
We find that the basins of attraction of the dynamic system have the following properties: (1) the system orbit tends to a fixed point; (2) the system is periodic or quasi-periodic; (3) the system exhibits chaotic or hyperchaotic behavior; (4) the time series of the system tends to infinity in finite time. The improvement of an associative memory network must overcome a fundamental problem: in addition to the attractors associated with memory samples, there are also "redundant" stable states (pseudostates). The existence of pseudostates degrades the fault tolerance of the associative memory network. If the basins of attraction of the pseudostates can be reduced or eliminated, the fault tolerance of the associative memory network can be improved and the memory capacity can be increased.

FPGA Implementation
The hardware experiment on system (4) is conducted using the fixed-point-number method, based on FPGA technology. We use a Xilinx Zynq-7000 series XC7Z020 FPGA chip and an AN9767 dual-port parallel 14-bit digital-to-analog conversion module with a maximum conversion rate of 125 MHz, and we adopt Vivado 17.4 and the System Generator to realize joint Matlab-FPGA debugging. Besides, we use an oscilloscope to visualize the analog output. After analysis, synthesis, and compilation in Vivado, and after confirming that the timing simulation results are correct, we generate the bit file with Vivado, download it to the FPGA development board, convert the output of the FPGA into an analog signal using the AN9767 digital-to-analog converter, and then connect the converter to the oscilloscope to observe the phase diagrams of the system (4) attractor. The phase diagrams displayed by the oscilloscope are shown in Figure 7.
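The fixed-point data path mentioned above can be illustrated with a small software model. The word length, the Q-format, and the DAC mapping below are assumptions for illustration, not the exact configuration synthesized in Vivado:

```python
# Fixed-point quantization sketch for an FPGA data path: state variables are
# held as signed fixed-point integers before driving the 14-bit AN9767 DAC.
# The 32-bit word and 24 fractional bits are illustrative choices.

def to_fixed(x, frac_bits=24, word_bits=32):
    # Quantize a real value to a signed fixed-point integer with saturation.
    scale = 1 << frac_bits
    lo, hi = -(1 << (word_bits - 1)), (1 << (word_bits - 1)) - 1
    q = int(round(x * scale))
    return max(lo, min(hi, q))

def from_fixed(q, frac_bits=24):
    # Recover the real value represented by a fixed-point integer.
    return q / (1 << frac_bits)

def to_dac14(x, full_scale=10.0):
    # Map a bounded analog value in [-full_scale, +full_scale]
    # to the 14-bit unsigned DAC code range [0, 16383].
    code = int(round((x + full_scale) / (2 * full_scale) * 16383))
    return max(0, min(16383, code))

# Quantization error of one round trip (bounded by half an LSB).
err = abs(from_fixed(to_fixed(3.14159)) - 3.14159)
```

In the actual design these conversions correspond to the wordlength settings of the System Generator blocks and the wiring of the FPGA output bus to the AN9767.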

CNN-Based Optimization Calculation of TSP Problem
The traveling salesman problem (TSP) is a classic topic in combinatorial optimization. In a typical TSP scenario, a salesman has to travel from one city to another to sell his goods and then return to his original city. How should he choose the shortest route through all the cities? In graph-theoretic terms, this problem is, in essence, to find a Hamiltonian cycle with the lowest weight in a weighted undirected graph. Given that the feasible solutions to this problem are the permutations of all the vertices, the number of candidate routes explodes as the number of vertices increases; therefore, it is an NP-complete problem.
The extensive application of this problem in transportation, circuit board and circuit design, and logistics distribution has led to its extensive study by scholars both at home and abroad. Various exact algorithms were used in early research to solve it, for example, the branch-and-bound, linear programming, and dynamic programming techniques. However, as the problem snowballs in scale, these methods fail to work. Therefore, in later research, scholars turned to approximate or heuristic algorithms, mainly including the Genetic Algorithm [26,27], Simulated Annealing [28,29], the Ant Colony Algorithm [30,31], the Tabu Search Algorithm [29,32], Greedy Algorithms [33,34], and neural networks [35]. The chaotic neural network (CNN) considered here is a feedback neural network structured similarly to a control system, in which there is feedback from an output terminal to the corresponding input terminal. Upon a given input excitation, the state of the loop changes continuously, and the values at both the input and output terminals also change until they become stable. Each output represents a state; therefore, the CNN considered here is a dynamic system with multiple inputs and outputs. In a dynamic system, the equilibrium state can be understood as the state in which the value of a form of energy of the system has continuously decreased to its minimum. The system can be made to converge to different states by setting different energy functions. First, the problem is mapped onto the CNN. The problem to be mapped can be represented by a permutation matrix. For a case of n cities, the travel route is represented by an n × n matrix composed of n^2 neurons.
Each row and each column of the permutation matrix has exactly one element equal to 1 and all other elements equal to 0. Such a matrix uniquely identifies a travel route. To make the lowest energy point of the network correspond to the shortest travel route, it is necessary to construct the energy function carefully. According to system (1), an energy function is constructed to solve the traveling salesman problem. Compared with the Hopfield neural network, the chaotic neural network eliminates some shortcomings, such as a lower calculation speed, troublesome parameter setting, a larger possibility of landing in an invalid solution, and greater difficulty in identifying the optimal solution. The expression of the energy function is given in formula (5), and the change in the energy function is shown in Figure 8. From the analysis of Figure 8, when the path optimization results are obtained, the final state of the energy function is close to 0.
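As a concrete (hedged) illustration of such an energy function, the sketch below implements the classical Hopfield-style TSP energy, with row, column, and total constraints plus a tour-length term; the coefficients and the exact form of the paper's formula (5) may differ:

```python
import numpy as np

# Hopfield-style TSP energy sketch: V is an n-by-n matrix where V[x, i] = 1
# means city x is visited at step i.  The penalty weights A, B, C, D are
# illustrative, not the coefficients used in the paper's formula (5).
def tsp_energy(V, dist, A=1.0, B=1.0, C=1.0, D=1.0):
    n = V.shape[0]
    # Row constraint: each city appears at exactly one step.
    row = np.sum(np.sum(V, axis=1) ** 2 - np.sum(V ** 2, axis=1))
    # Column constraint: each step hosts exactly one city.
    col = np.sum(np.sum(V, axis=0) ** 2 - np.sum(V ** 2, axis=0))
    # Global constraint: exactly n active neurons in total.
    tot = (V.sum() - n) ** 2
    # Tour-length term: distance between cities at adjacent steps
    # (each edge is counted twice, once per direction).
    nxt = np.roll(V, -1, axis=1) + np.roll(V, 1, axis=1)
    length = np.sum(dist * (V @ nxt.T))
    return A / 2 * row + B / 2 * col + C / 2 * tot + D / 2 * length

# For a valid permutation matrix the constraint terms vanish and the energy
# equals the tour length, so shorter tours have lower energy.
dist = np.array([[0, 1, 2], [1, 0, 1], [2, 1, 0]], dtype=float)
V = np.eye(3)                       # tour 0 -> 1 -> 2 -> 0
E = tsp_energy(V, dist)
```

Minimizing this energy with the chaotic network dynamics (the transiently chaotic search described above) drives the state toward a valid permutation matrix encoding a short tour, at which point the constraint terms are close to 0.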
We set 28 cities in this paper. Based on four optimization runs, we obtain the following results: the optimal energy function value is 1.5193, the initial route length is 13.1544, and the shortest route length is 5.3188. The simulation results show that the proposed CNN can solve the traveling salesman problem very well. The results are shown in Figure 9.

Conclusion
In this paper, we put forward a new chaotic neural network (CNN) model. The CNN has rich chaotic dynamic behaviors and can generate multiscroll hidden chaotic attractors. Then, we study its dynamic behaviors, including the bifurcation behavior, Lyapunov exponents, Poincaré section, and basins of attraction, and characterize the related properties of the basins of attraction. Furthermore, we realize the CNN on an FPGA. The experiments show that the theoretical analysis and the FPGA realization lead to consistent conclusions. Finally, we construct an energy function to perform optimization calculations based on the CNN, providing a new approach to solving the TSP. Since chaotic systems and chaotic neural networks have been widely used in image encryption [36][37][38][39][40] and secure communication [41][42][43][44], these two applications will be the focus of our future research.

Data Availability
All data used to support the findings of this study are available from the corresponding author upon request.

Conflicts of Interest
The authors declare that they have no conflicts of interest.