A Study on the 3D Hopfield Neural Network Model via Nonlocal Atangana–Baleanu Operators



Introduction
Neural networks (NNs) are a part of machine learning and lie at the centre of deep learning techniques. Their architecture and dynamics are inspired by the human brain, and they mimic the way real neurons signal to one another. In several branches of artificial intelligence (AI), deep learning, and machine learning, NNs imitate the function of the human brain, helping computer algorithms to locate patterns and solve general problems. As a result of their widespread use in a variety of sectors, NNs have attracted a great deal of attention [1][2][3]. Neural networks use training data to learn and to improve their performance over time. Once these learning algorithms have been tuned for accuracy, they become powerful tools in computer science and AI, allowing us to classify and cluster data rapidly. Compared with manual recognition by human experts, tasks in speech or image identification can take minutes rather than hours. Various types of NNs exist, each of which is used for a specific purpose.
Hopfield first introduced the Hopfield neural network (HNN) in 1984 [4]. Since then, a deeper understanding of the Hopfield neural network's dynamical behaviour has been crucial in engineering and information-processing applications such as pattern identification [5], signal processing, and associative memory [6]. Moreover, several studies in the literature have addressed the dynamical characteristics of a range of complex-valued neural network models. As noted above, the HNN is an artificial model derived from brain dynamics, and it is an important model in neurocomputing [7]. Such a neural model is capable of storing information in a fashion similar to the human brain. Njitacke et al. [8] discussed space magnetization, hysteretic dynamics, and offset boosting in a third-order memristive system. In [9], the authors analyzed the complex structure of a 3D autonomous system without linear terms having a line of equilibria. A study on the control of multistability with selection of a chaotic attractor, along with an application to image encryption, is given in [10]. In [11], a dynamical analysis of a simple autonomous jerk system with multiple attractors is proposed.
Nowadays, fractional-order operators are highly useful for solving a variety of real-world problems [12][13][14]. The main feature of fractional derivatives is their nonlocality, which helps to capture memory effects in a system. These operators generalize the integer-order operators. To date, fractional operators have been used in various scientific and engineering fields through different kinds of mathematical modeling. Recently, fractional derivatives have been applied in disease dynamics [15,16], mechanics [17], psychology [18], engineering [19], advanced modeling via fractal-fractional operators [20], etc. Because of the various advantages of fractional operators for memory effects, modeling dynamical systems with fractional calculus has attracted considerable interest [21,22]. In [23], the explicit dependence of stability on a variable time delay was presented, and delay-dependent stability switches of linear fractional-type systems were examined. This theory has recently been incorporated into NNs, resulting in fractional-order neural networks (FONNs). Such a modification can strengthen the ability of neurons to process information. Owing to the persistent efforts of researchers, various applications of FONNs have been discovered worldwide, including network approximation [24], state estimation [25], system identification [26], robotic manipulators [27], and formation control [28]. Fractional-order elements offer two clear benefits for neurons. On the one hand, fractional calculus, compared with ordinary calculus, gives a far better depiction of memory and hereditary features [29]. On the other hand, fractional-order parameters can improve system performance by adding one degree of freedom [30]. Incorporating this memory peculiarity into NNs therefore yields a substantial improvement, and FONNs have produced some striking results [31,32].
In this article, we perform some novel mathematical simulations on the dynamical model of 3D HNNs investigated in ref. [33], where x1, x2, x3 are state variables and β1, β2, β3 stand for the variable gradients of the activation function. Firstly, we give some preliminaries related to fractional calculus in Section 2. Then, to solve the given dynamical model (1), we generalise it under the Atangana-Baleanu (AB) fractional derivative with the Mittag-Leffler kernel in Section 3. To investigate the numerical solution of the fractional-order model, we apply the Predictor-Corrector (PC) method. In Section 4, a number of graphs are plotted to check the correctness of the derived solution. Lastly, we give the supporting conclusions.
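The 3D HNN described above follows the standard Hopfield structure, dx_i/dt = -x_i + sum_j W[i,j] tanh(β_j x_j). A minimal numerical sketch of this right-hand side is given below; note that the synaptic weight matrix W is an illustrative placeholder, since the exact coefficients of the model in ref. [33] are not reproduced in this text.

```python
import numpy as np

# Hypothetical synaptic weight matrix: placeholder values only, NOT the
# coefficients actually used in ref. [33].
W = np.array([[ 2.0, -1.2,  0.0],
              [ 1.9,  1.7,  1.15],
              [-4.7,  0.0,  1.0]])

def hnn_rhs(x, beta, W=W):
    """Right-hand side of a generic 3D Hopfield neural network:
    dx_i/dt = -x_i + sum_j W[i, j] * tanh(beta[j] * x[j])."""
    return -x + W @ np.tanh(beta * x)
```

With the activation gradients beta = (β1, β2, β3) fixed, this function can be handed to any ODE (or, later, fractional-order) integrator.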

Preliminaries
Several important notions are recalled here.
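In particular, the Atangana-Baleanu fractional derivative in the Caputo sense, which is used throughout this work, can be recalled as follows (with order c, a normalization function B(c) satisfying B(0) = B(1) = 1, and E_c the one-parameter Mittag-Leffler function):

```latex
{}^{ABC}_{\;\;\;0}D^{c}_{t}\,f(t)
  = \frac{B(c)}{1-c}\int_{0}^{t} f'(s)\,
    E_{c}\!\left(-\frac{c\,(t-s)^{c}}{1-c}\right)\mathrm{d}s,
  \qquad 0 < c < 1,
```

with the associated Atangana-Baleanu fractional integral

```latex
{}^{AB}I^{c}_{t}\,f(t)
  = \frac{1-c}{B(c)}\,f(t)
  + \frac{c}{B(c)\,\Gamma(c)}\int_{0}^{t} (t-s)^{c-1} f(s)\,\mathrm{d}s .
```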

The Structure of the Model
Here we generalise the aforementioned integer-order model (1) into the fractional-order sense by using a nonsingular fractional derivative, namely the Atangana-Baleanu fractional derivative. The nonlocal character of the AB derivative encodes memory in the system, which is the main motivation behind this generalization. The fractional form of the given system (1) in the AB-operator sense is obtained by replacing each time derivative with the operator AB_D_t^c, the AB fractional derivative of order c.

Derivation of the Numerical Solution
In the current literature, many computational methods are available to solve different types of fractional-order systems. Some very recent work on numerical methods in the sense of fractional derivatives can be seen in refs. [36,37]. Here we implement the Predictor-Corrector method for solving the given dynamical model (7). The complete methodology of the proposed method is described in ref. [38]. Firstly, we consider the initial value problem (IVP) (8). Following ref. [38], the IVP is rewritten as an equivalent Volterra integral equation. According to the derivation of the method proposed in [38], for the fractional order c ∈ [0, 1] and 0 ≤ t ≤ T, taking h = T/N and t_n = nh for n = 0, 1, ..., N ∈ Z+, the corrector term for the IVP (8) is derived, with the final corrector weight a_{r+1,r+1} = 1 + (1 − c)Γ(c + 2)/(c h^c); the predictor term follows from the corresponding product-rectangle quadrature.
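A compact illustrative implementation of such a Predictor-Corrector scheme is sketched below. It is not the exact code of ref. [38]: the quadrature weights follow the classical fractional Adams scheme, the normalization B(c) = 1 − c + c/Γ(c) is an assumed (though common) choice, and only the final corrector weight a_{n+1,n+1} = 1 + (1 − c)Γ(c + 2)/(c h^c), quoted in the text above, is AB-specific.

```python
import math
import numpy as np

def ab_pc_solve(f, x0, c, T, N):
    """Illustrative Predictor-Corrector sketch for the AB-type IVP
    D^c x(t) = f(t, x), x(0) = x0, on [0, T] with N uniform steps.
    Assumptions: classical fractional Adams quadrature weights and the
    normalization B(c) = 1 - c + c/Gamma(c); see ref. [38] for the
    scheme actually used in the paper."""
    h = T / N
    B = 1 - c + c / math.gamma(c)          # assumed AB normalization
    g2 = math.gamma(c + 2)
    t = h * np.arange(N + 1)
    x0 = np.atleast_1d(np.asarray(x0, dtype=float))
    x = np.zeros((N + 1, x0.size))
    x[0] = x0
    F = np.zeros_like(x)                   # stored values f(t_j, x_j)
    F[0] = f(t[0], x[0])
    for n in range(N):
        j = np.arange(n + 1)
        # predictor: product-rectangle rule on the AB Volterra kernel
        b = h ** c * ((n + 1 - j) ** c - (n - j) ** c) / (B * math.gamma(c))
        xp = x0 + (1 - c) / B * F[n] + b @ F[: n + 1]
        # corrector: product-trapezoid (fractional Adams) weights
        a = np.empty(n + 2)
        a[0] = n ** (c + 1) - (n - c) * (n + 1) ** c
        jj = np.arange(1, n + 1)
        a[1 : n + 1] = ((n - jj + 2) ** (c + 1) + (n - jj) ** (c + 1)
                        - 2 * (n - jj + 1) ** (c + 1))
        a[n + 1] = 1 + (1 - c) * g2 / (c * h ** c)   # AB-modified last weight
        fp = f(t[n + 1], xp)
        x[n + 1] = x0 + (c * h ** c / (B * g2)) * (a[: n + 1] @ F[: n + 1]
                                                   + a[n + 1] * fp)
        F[n + 1] = f(t[n + 1], x[n + 1])
    return t, x
```

For c = 1 the weights collapse to the classical one-step Adams (Euler predictor, trapezoid corrector) method, which gives a quick sanity check against known integer-order solutions.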

Complexity
We can see that our proposed model (7) is simply a generalized form of the considered IVP (8). Hence the corrector and predictor formulae above apply componentwise to each state variable of the proposed model (7).
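For reference, the corrector weights of the classical fractional Adams scheme take the form shown below; the first two cases are the standard weights, and the last case is the AB-modified final weight quoted earlier in the text (the exact formulae of ref. [38] should be consulted for the scheme actually employed):

```latex
a_{j,n+1} =
\begin{cases}
  n^{c+1} - (n - c)(n+1)^{c}, & j = 0,\\[4pt]
  (n-j+2)^{c+1} + (n-j)^{c+1} - 2\,(n-j+1)^{c+1}, & 1 \le j \le n,\\[4pt]
  1 + \dfrac{(1-c)\,\Gamma(c+2)}{c\,h^{c}}, & j = n+1 .
\end{cases}
```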

Stability of the Scheme
Theorem 1.

Graphical Observations
Now, to examine the role of the proposed Atangana-Baleanu fractional derivative, we plot a number of graphs with the help of the above-mentioned numerical method. The values of the parameters β1 and β3 are fixed at β1 = 0.9 and β3 = 1.4. For the activation gradient of the second neuron β2 = 1.15, we plot the coexistence of four distinct stable states in the group of Figures 1-4. In the frame of Figure 1, the initial values are taken as x1(0) = −2, x2(0) = x3(0) = 0. Here we can see that the proposed results differ slightly from those previously given in ref. [33]. When we change the value of c, the nature of the observed multiple attractors also changes. One of the main differences between the proposed fractional-order analysis and the earlier results of [33] is that no perfectly periodic attractors exist at any fractional-order value, whereas the chaotic attractors are obtained in a much better form. All simulations are performed using Mathematica software.
Along the same lines, when β2 = 1.18 we consider the coexistence of six different stable states in the corresponding group of figures. For Figure 6, we take x1(0) = 1, x2(0) = x3(0) = 0. In the case of Figure 7, these are x1(0) = 0.5, x2(0) = x3(0) = 0, and for Figure 8 they are fixed as x1(0) = −0.5, x2(0) = x3(0) = 0. Then, for Figure 9, the values are x1(0) = 1, x2(0) = x3(0) = 0, and a further set of initial values is used for Figure 10. Here again we notice that the proposed results differ from those previously given in ref. [33]. Once more, the main difference between the proposed fractional-order analysis and the earlier results of [33] is that no periodic attractors exist at any fractional-order value, whereas the chaotic attractors are obtained in a much better form.

Conclusions
In this paper, we simulated a dynamical model of 3D HNNs in terms of the Atangana-Baleanu fractional derivative. The numerical solution of the suggested dynamical model has been derived via the Predictor-Corrector method. A number of cases of initial values are considered for a better understanding of the role of initial conditions. Using two different values of the activation gradient of the second neuron, the behaviour of the proposed model is investigated at four different fractional orders. From the given graphical simulations, we conclude that for fractional-order values there is no clear existence of any periodic attractors, but the chaotic attractors are achieved in a much better form. In the future, the proposed dynamical model can be solved further by using other fractional operators.

Data Availability
Data sharing is not applicable to this article as no datasets were generated or analyzed during the current study.

Conflicts of Interest
The authors declare that they have no competing interests.

Authors' Contributions
The authors declare that the study was realized in collaboration with equal responsibility. All authors read and approved the final manuscript.