Neutrosophic Simulated Annealing Algorithm and Its Application in Operation Optimization in Dangerous Goods Warehouse

In the classical simulated annealing algorithm (SAA), the iteration of feasible solutions is mainly driven by a certain random probability. In the process of iteration, there is a lack of comparison between individuals and the whole population of feasible solutions, and the indicators that measure the change of state of individuals are too absolute to achieve overall control of the algorithm. To deal with the uncertain information that individuals encounter in iteration, this study introduces the idea of neutrosophic decision-making and establishes a kind of neutrosophic fuzzy set (NFS) to describe the time-varying iterative state of individuals according to the change of individual state, the change of population state, and the number of iterations. The main feature of this study is a neutrosophic simulated annealing algorithm that combines the idea of NFS with simulated annealing; its main contribution is a novel entropy measurement method based on the NFS, combined with the simulated annealing algorithm to control the optimization process. Finally, the effectiveness of the novel algorithm is verified by an example of warehouse optimization.


Introduction
The simulated annealing algorithm (SAA), which was proposed by Steinbrunn et al. [1], minimizes a function by simulating the physical annealing process. Usually, SAA includes three processes: heating, isothermal holding, and cooling. SAA has a strong global search ability and does not rely on knowledge of the structure of the objective function. Meanwhile, SAA is a classical algorithm that seems blind but has a clear search direction.
In the application process, the quality of SAA depends on a series of parameters, such as temperature, cooling speed, and temperature management. However, in the process of iteration in SAA, there is a lack of contrast between the individuals of the same generation. The iteration of each individual is based on probability in isolation. The consequence of this defect is that, when a large-scale optimization problem is given, relatively more individuals are needed for the probabilities to take effect, and the required number is difficult to pin down.
To solve the aforementioned problems, this study orients to fuzzy mathematics, rather than probability, integrates the uncertain information in the iterative process by constructing the neutrosophic fuzzy set (NFS), and then proposes a novel algorithm called the neutrosophic simulated annealing algorithm (NSAA). It is noteworthy that an NFS is structured by using three functions, i.e., a truth membership function, an indeterminacy membership function, and a falsity membership function. Besides, the classical fuzzy set is a special case of NFS in which the indeterminacy membership function and the falsity membership function are not given. For convenience of narrating the novel ideas, a brief review of relevant literature is given in the following subsection.

Relevant Studies.
In this subsection, an overview of SAA, parameterization, NFS, and entropy function is presented.
In recent years, scholars have applied SAA to many types of research. Ezugwu et al. [2] proposed a simulated annealing-based symbiotic organisms search optimization algorithm for the traveling salesman problem. Wei et al. [3] proposed a SAA for vehicle routing problems with two-dimensional loading constraints. At the same time, Zhang et al. [4] used SAA to optimize hybrid systems for renewable energy, including batteries and hydrogen storage. In the same year, Zhang et al. [5] proposed simultaneous optimization of nonsharp distillation sequences and heat integration networks by SAA. Afterwards, Leite et al. [6] proposed a novel variant of the SAA, named the fast annealing algorithm, for solving time planning problems. Morales-Castañeda et al. [7] then proposed an improved SAA based on ancient metallurgy techniques, which improved the search capacities of the algorithm while maintaining the computational structure of the SA method.
In terms of parameterization, Jin and Jin [8] introduced a quantum particle swarm optimization algorithm using a series of parameters and used this novel algorithm to optimize software reliability growth performance. Afterwards, Zahoor Raja et al. [9] applied intelligent computing to parameter excitation, vertically driven pendulum, and dusty plasma models of Mathieu's systems. Oh et al. [10] utilized an intelligent algorithm to study welding parameter control for lap-joints. Li et al. [11] studied the problem of tracking control of networked control systems of the intelligent vehicle with external disturbance and network-induced disturbance. Huang et al. [12] studied data-driven ontology generation and evolution towards intelligent service in manufacturing systems. The allocation and routing model of rescue vehicles in disaster areas was proposed by Goli and Malmir [13], in which the demand parameters were considered fuzzy parameters, and a harmony search algorithm with random simulation was developed. Meanwhile, Tirkolaee et al. [14] designed a sustainable mask Closed-Loop Supply Chain Network (CLSCN) during the COVID-19 outbreak and adopted the Taguchi design method to adjust and control parameters.
As for neutrosophic thought, it is noteworthy that Smarandache [15] introduced the neutrosophic fuzzy set. Smarandache [15] pointed out that "it is a branch of philosophy which studies the origin, nature, and scope of neutralities, as well as their interactions with different ideational spectra". From the philosophical viewpoint, NFS is a kind of comprehensive set, which includes both the fuzzy set and the intuitionistic fuzzy set, as it considers the indeterminacy function in addition to truth membership and falsity membership. As a generalization of NFSs, Broumi and Deli [16] investigated the correlation measure for the neutrosophic refined sets. Karaaslan [17] introduced the concept of the possibility neutrosophic soft set and its operations and proposed some properties on them. Deli [18] studied interval-valued neutrosophic soft sets and decision making. Thereafter, Ali et al. [19] introduced a novel neutrosophic recommendation system based on the hybridization of the neutrosophic set for medical diagnosis. Afterwards, Abdel-Basset et al. [20] used a hybrid neutrosophic multicriteria group decision-making method for project selection. Akram et al. [21] proposed a decision-making approach based on the maximizing deviation method and TOPSIS (Technique for Order Preference by Similarity to Ideal Solution) to solve MADM problems. Long et al. [22] proposed a fuzzy clustering algorithm through a neutrosophic association matrix. The experimental results on several benchmark datasets using different clustering criteria showed the advantage of the proposed clustering over the existing algorithms. In the same year, Refaat and El-Henawy [23] proposed an innovative method to evaluate quality management system audit results using a single-valued neutrosophic number. Meanwhile, Basha and Tharwat [24] developed a neutrosophic rule-based prediction system for toxicity effects assessment of biotransformed hepatic drugs.
The results of the proposed models indicated that the system could be utilized for the prediction of drug toxicity in the early stages of drug development. Lately, Aydın et al. [25] utilized neutrosophic present worth analysis with interval-valued parameters for energy-investment decision making. It was observed that the proposed method offered great flexibility to experts and gave effective and efficient results. In the same year, Rashno et al. [26] proposed an effective clustering method based on data indeterminacy in the neutrosophic set domain. Siddique et al. [27] defined the notion of generalized complex neutrosophic graphs of type 1 and discussed some of their properties, including regularity and completeness. Ahmed et al. [28] developed a new method for solving linear programming problems based on bipolar single-valued neutrosophic sets.
Moreover, some representative studies regarding the entropy function have also been reviewed. Sahin [29] defined two kinds of interval neutrosophic cross-entropy based on the extension of fuzzy cross-entropy and single-valued neutrosophic cross-entropy.
Thereafter, Ye and Du [30] proposed some entropy measures on interval-valued neutrosophic fuzzy sets by using the relationship between distance and entropy measures. Qin and Wang [31] proposed a kind of axiomatic definition of entropy on single-valued neutrosophic values and extended the definitions by using some aggregation operators. Meanwhile, Singh [32] proposed a kind of neutrosophic entropy and used this entropy to solve medical diagnosis problems. Gafar et al. [33] proposed a model to integrate entropy measures with particle swarm optimization by using a neutrosophic variable. Besides, studying the relationship between the score function and the entropy function, this study also pays attention to the latest research results on the score function. For more details, please see Farhadinia [34]; Garg [35]; Fakhar et al. [36]; Deniel et al. [37]; Zeng et al. [38]; etc.
By referring to the aforementioned research results, the NFS and SAA are combined organically in this study. For convenience, the objectives and contributions of this study are introduced in the coming subsection.

Objectives and Contributions.
In theory, classical SAA, which uses probability to guide the evolution of individuals, has a certain blindness in the process of iteration. To strengthen the management of individuals and enhance the connection between individuals of the same generation, this study proposes NSAA. The core of the proposed NSAA is to deal with multisource uncertain information using fuzzy decision-making technology. The objective of this study is to combine fuzzy technology with SAA. Comparatively speaking, the classical SAA emphasizes randomness and is dominated by probability; the novel NSAA focuses on the fuzziness of the evolutionary effect, which is more suitable for some relatively complex optimization problems and can complement random probability optimization. Next, for the convenience of description, some basic definitions of SAA and NFS are introduced.

Preliminaries
Simulated annealing originated from the idea of solid annealing in industrial production. The essence of SAA is to accept a certain degree and a certain number of poor solutions in the process of iteration, which endows SAA with a certain ability to escape from the local optimal solution. The definition of SAA is as follows.
Definition 1. (classical SAA [39]). Suppose that there is an optimization problem OP with an objective function f to minimize and a series of constraint conditions R. The SAA process starts with a set of initial feasible solutions X_0 and an initial temperature T_0. At each iteration t, there is a newly generated set of feasible solutions X_t, which is generated from X_{t−1}. For any given j, an acceptance probability p(t) is used to decide whether x_{t−1,j} should be replaced by x_{t,j}. In the initial stage of the iteration, the temperature is thought to be high, and the probability p(t) is set to a large value. As the iteration goes on, just as the temperature decreases in industrial production, p(t) is set to a smaller value. When the iteration goes on to a certain extent, poor feasible solutions are no longer accepted, which means lim_{t→∞} p(t) = 0. In classical SAA, the aforementioned process is realized by a Monte Carlo stochastic process. Through iteration after iteration, the optimal solution of OP is obtained as x*. For convenience, denote the maximum number of iterations as T and the population size as M. Furthermore, a schematic diagram of SAA is shown in Figure 1.
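As a concrete illustration of Definition 1, the following Python sketch implements the classical SAA loop with the usual Metropolis acceptance rule. The geometric cooling schedule, the parameter values, and the exponential form of p(t) are illustrative assumptions, not choices prescribed by this study.

```python
import math
import random

def simulated_annealing(f, x0, neighbor, T0=100.0, alpha=0.95, T_iters=200):
    """Minimal sketch of classical SAA (Definition 1).

    f        -- objective function to minimize
    neighbor -- function that generates a new feasible solution from the current one
    T0, alpha, and T_iters are illustrative parameter choices.
    """
    x, T = x0, T0
    best = x
    for t in range(T_iters):
        y = neighbor(x)
        delta = f(y) - f(x)
        # Metropolis acceptance: always accept improvements; accept a worse
        # solution with probability p(t) = exp(-delta / T), which shrinks as T cools.
        if delta <= 0 or random.random() < math.exp(-delta / T):
            x = y
        if f(x) < f(best):
            best = x
        T *= alpha  # geometric cooling schedule
    return best
```

For example, minimizing f(x) = (x − 3)² from a starting point of 10.0 with a uniform random-step neighborhood reliably moves the incumbent toward the minimizer.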
Next, the concept of the classical single-valued neutrosophic set (SVNS) is introduced.

Definition 2. (SVNS [40]). Assume Y is a space of points with a representative element denoted by y. An SVNS A on Y is structured by using a truth membership function T_A(y), an indeterminacy membership function I_A(y), and a falsity membership function F_A(y), where T_A(y), I_A(y), and F_A(y) are all mapping functions from Y to [0, 1].

For convenience, the single-valued neutrosophic element in A is denoted as a = ⟨T_A, I_A, F_A⟩. For more details, please see Qin and Wang [31]. In the following, the properties that an entropy on SVNS should follow (Theorem 1, properties (i)-(iv)) are introduced; roughly, (i) the entropy is minimal on crisp elements, (ii) it is maximal at ⟨0.5, 0.5, 0.5⟩, (iii) it is invariant under the complement, and (iv) it is monotonic with respect to fuzziness, e.g., for T_A(y_1) ≤ T_A(y_2) ≤ 0.5, it holds that the entropy of A(y_1) does not exceed that of A(y_2).

It is noteworthy that Property (iv) is not a sufficient condition for Theorem 1. For convenience, only properties (i)-(iii) are adopted in this study. In the following section, based on the aforementioned concepts and definitions, a novel NSAA is proposed to solve the assignment problem in warehouse operation.
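The entropy properties discussed above can be checked numerically. Since the paper's own entropy Z_α (equation (3)) is not reproduced here, the sketch below uses the well-known entropy of Majumdar and Samanta for SVNSs as a stand-in; that particular formula is an assumption of this example, not the measure proposed in this study.

```python
def svns_complement(a):
    """Complement of a single-valued neutrosophic element <T, I, F>."""
    T, I, F = a
    return (F, 1.0 - I, T)

def ms_entropy(elements):
    """Majumdar-Samanta entropy of an SVNS given as a list of <T, I, F> tuples.

    It is 0 on crisp elements (property (i)), 1 at <0.5, 0.5, 0.5>
    (property (ii)), and invariant under the complement (property (iii)).
    """
    n = len(elements)
    return 1.0 - sum((T + F) * abs(I - (1.0 - I)) for T, I, F in elements) / n
```

For instance, a crisp element ⟨1, 0, 0⟩ yields entropy 0, the maximally indeterminate element ⟨0.5, 0.5, 0.5⟩ yields entropy 1, and a set and its complement yield equal entropy.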

Novel Entropy Measures on SVNS.
Theoretically, entropy measures can be structured from their corresponding similarity measures. Assume that A is an SVNS on Y and S(·) is a similarity measure on A.
Then, an entropy function Z_s = S(A(y), A(y^c)) is obtained. Obviously, Z_s satisfies properties (i)-(iv) in Theorem 1. Under the guidance of this idea of functional transformation, a novel entropy measure on SVNS is proposed.
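The functional transformation Z_s = S(A(y), A(y^c)) can be sketched directly: pick any similarity measure and evaluate it between an element and its complement. The similarity below (one minus the mean absolute coordinate difference) is only an illustrative choice, not the measure proposed in this study.

```python
def similarity(a, b):
    """An illustrative similarity on neutrosophic elements:
    1 minus the mean absolute difference of the three memberships."""
    return 1.0 - sum(abs(u - v) for u, v in zip(a, b)) / 3.0

def entropy_from_similarity(a):
    """Z_s = S(A(y), A(y)^c): an element is maximally fuzzy exactly when
    it coincides with its own complement."""
    T, I, F = a
    complement = (F, 1.0 - I, T)
    return similarity(a, complement)
```

Under this choice, ⟨0.5, 0.5, 0.5⟩ equals its own complement and gets entropy 1, while the crisp element ⟨1, 0, 0⟩ is maximally far from its complement ⟨0, 1, 1⟩ and gets entropy 0.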

Journal of Mathematics
By simplification, Z_α(A(y)) = Z_α(A(y)^C) is obtained (equation (4)). Because this property is not a sufficient condition for Theorem 1, this issue is not discussed.
In order to further deepen the understanding of neutrosophic information and bring more detailed factors into the definition of entropy, a generalized entropy function is proposed as follows. □ Definition 5. Suppose A(y) is a single-valued neutrosophic element on Y, which is denoted as follows: Then, a kind of generalized entropy function on A(y) is denoted as follows: where w_11, w_12, w_21 …

NSAA for the Combinatorial Optimization Problem.
Based on combining NFS and SAA theories, a novel heuristic algorithm is proposed. For convenience, the cost-oriented objective is taken as the research object; i.e., the smaller the objective function value is, the better the feasible solution is.

Problem Introduction.
In production practice in enterprises, there are mainly three attributes to be considered, i.e., risk, efficiency, and cost. Suppose that there are n objects p_i (1 ≤ i ≤ n) to be assigned to n locations q_j (1 ≤ j ≤ n), such that each object is designated to exactly one location. Any three given assignments, i.e., (p_i, q_j), (p_k, q_l), and (p_s, q_t), influence each other. Under the aforementioned three attributes, the objective of the AP is to minimize the comprehensive cost of the assignment. Denote the objective of the assignment as c = Σ_{i,j,k,l,s,t=1}^{n} f(x_ij, x_kl, x_st). Then, the AP problem is expressed as follows: In theory, when n is large enough, it is difficult to find accurate feasible solutions for equation (15) in a limited time.
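The comprehensive cost c can be evaluated for a candidate assignment coded as a permutation. In the sketch below, the interaction cost f on three assignments is a user-supplied placeholder; its real form comes from the risk, efficiency, and cost data of equation (15), which are not reproduced here.

```python
import itertools

def ap_cost(perm, f):
    """Comprehensive cost c = sum over all index triples of f(x_ij, x_kl, x_st),
    where perm[i] = j encodes the assignment of object p_i to location q_j."""
    n = len(perm)
    return sum(f((i, perm[i]), (k, perm[k]), (s, perm[s]))
               for i, k, s in itertools.product(range(n), repeat=3))
```

For small n, an exact optimum can still be found by enumerating all n! permutations; it is the factorial growth of that search space that motivates heuristic algorithms such as SAA.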
Then, the existing heuristic algorithms seek near-optimal solutions at a low cost, and SAA is one of these heuristic algorithms. The novel NSAA is introduced to solve this AP.

Problem Analysis.
To solve the given problem using classical SAA, one step is to decide whether the newly generated offspring of each generation should be accepted. Accordingly, one of the characteristics of SAA is to accept a certain number of relatively poor solutions according to a certain probability. In particular, the probability is generated by using a Monte Carlo stochastic process and decreases gradually as the number of iterations increases.
In practice, the classical SAA can be divided into several stages, and the beginning and end of each stage are affected by a series of parameters, such as the initial temperature parameter, the neighborhood parameter of a feasible solution, and the acceptance probability parameter for poor solutions. These parameters are distributed across different stages and isolated from each other. In addition, in the classical SAA, the iteration of feasible solutions is a form of parallel computing, and there is no connection between individuals. The isolation between the above parameters and the isolation of the feasible-solution iterations result in the strong dependence of the SAA on its parameters.
From the perspective of information processing, the root of the aforementioned problems lies in the lack of effective integration of multisource information. To solve this problem, this study creatively introduces the theory of NFS into SAA and puts a series of parameters into the framework of NFS. Thereafter, a novel entropy function is introduced to control the iteration process in SAA. To illustrate the neutrosophic thought better, for the iterative process of individuals, please see Figure 2.
Figure 2 shows that, through one iteration, any three given feasible solutions A_n, B_n, and C_n have moved to A'_{n+1}, B'_{n+1}, and C'_{n+1}, respectively. Based on system analysis, this study holds the following two viewpoints: (1) Whether the feasible solutions can be accepted is not only related to whether the corresponding internal energy is reduced but also related to the extent of the increase of internal energy of the entire population.
In the classical SAA, the acceptable critical amplitude of internal-energy growth is given by parameter calculation. To reduce the dependence of simulated annealing on the parameters and increase the connection between individuals, the critical threshold in this study is determined by the increase of the internal energy of each individual, the average internal-energy change of the feasible solutions of the current generation, and the number of iterations. In essence, the uncertain information in the iterative process of the proposed NSAA is mined to reduce the dependence on the parameters. In general, taking Figure 2 as an example, large acceptance probabilities should be given to A'_{n+1}, B'_{n+1}, and C'_{n+1}. The theoretical basis of this view is as follows: when the difference between the increment of a feasible solution and the average increment of the population is large, whether it is far more than the average increment or far lower than it, an inflection point is likely to appear on the curve, that is, the search breaks away from the control of the local optimal solution. (2) In the search process of a feasible solution, compared with the average search effect, when the increase of the objective function value is large, the local search of the corresponding solution should be strengthened to explore for the optimal solution nearby. On the contrary, when the increase of the objective function value is small, the algorithm should favor global search and try to jump out of the narrow search scope.
In practice, the following plans are settled according to the aforementioned two points. When the comprehensive entropy brought by the activities of the solution is small, the neighborhood search radius should be small to make the search more detailed. When the comprehensive entropy brought by the activities of the solution is large, the neighborhood search radius should be increased to search for the whole solution space. Up to this point, the idea of intelligent adjustment is constructed.
As a result, A_n and B_n should each be given a small search radius. Meanwhile, C'_{n+1} should be given a large search radius.

NSAA.
In this subsection, the uncertain information in simulated annealing is analyzed through the viewpoint of neutrosophic thought. In the framework of NFS, three kinds of uncertain information, i.e., the individuals' evolution effect, the population's evolution effect, and the temperature, are integrated organically. Then, a new NSAA is proposed by combining the initial temperature assignment, probability threshold assignment, and other technologies. By using the proposed NSAA, the introduced model (equation (15)) is solved. Specific steps are as follows: Step 1. Initialization: For any given feasible solution of the model, denote it as follows: where (i_1, i_2, . . . , i_N) is a permutation of the vector (1, 2, . . . , N), and i_{k_1} ≠ i_{k_2} for any k_1 ≠ k_2, 1 ≤ k_1, k_2 ≤ N. For convenience, x is coded as follows: Meanwhile, the symbols N, t, and M are as defined in Section 2. Then, M initial feasible solutions are randomly generated and denoted as X_0 = {x_01, x_02, . . . , x_0M}, where x_0j = (i_0j1, i_0j2, . . . , i_0jN).
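Step 1 can be sketched as follows; the permutation coding and the sizes N and M are as in the text, while the use of Python's random.sample is merely one convenient way to draw uniform random permutations.

```python
import random

def init_population(N, M, seed=None):
    """Step 1: generate M feasible solutions, each a random permutation
    x_0j = (i_0j1, ..., i_0jN) of (1, 2, ..., N)."""
    rng = random.Random(seed)
    return [rng.sample(range(1, N + 1), N) for _ in range(M)]
```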

Step 2. Random transformation: For any given feasible solution x_0j in X_0, generate a neighborhood radius r_0j and two interchange positions Iv_0j1 and Iv_0j2 randomly, and denote x'_0j (and correspondingly X'_0) as the transformed result.
Step 3. Neutrosophic fuzzy evaluation: By using equation (15), the objective function values for all the solutions in X_0 and X'_0 are obtained. Denote, for any given j = 1, 2, . . . , M, the following: Then, it gets the following: Then, for any given element x_0j in X_0, an NFS is obtained as follows: By using equation (3), the entropy of A(x_0j) is obtained as Z_α(A(x_0j)).
Step 5. Iteration: For any given X_0, by using neighborhood search and threshold comparison, a new population X_1 = {x_11, x_12, . . . , x_1M} is generated by the following: Step 6. Intelligent control of the neighborhood radius: Denote t = t + 1; if t < T, turn to Step 2, and denote the following: Thereafter, according to Step 3, X_1 is transformed into X'_1. Then, turn to Steps 4 and 5, successively.
Step 7. End: Denote t = t + 1; if t < T, turn to Step 2; otherwise, end the procedure and select the final optimal solution as follows:

Supplement.
In order to use the proposed NSAA model effectively, some explanations are provided in this subsection.
(1) In Step 2, r_nj is obtained by using the entropy measure on NFSs. For convenience, denote the following: The core of this specific neighborhood search rule is as follows. First, let t be fixed. For any given x_tj, when (c(x'_tj) − c(x_tj))/c(x_tj) is large in Int_t, the gradient of the feasible solution is large, and the global or local optimal solution is likely to appear near it. Therefore, x_tj is assigned a small neighborhood search radius. In particular, to prevent a radius equal to zero, a threshold of 2/N is provided here. When (c(x'_tj) − c(x_tj))/c(x_tj) is small in Int_t, the gradient of the feasible solution is small, and the global or local optimal solution is not likely to appear near it. Therefore, x_tj is assigned a large neighborhood search radius.
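A minimal numeric sketch of this radius rule follows, assuming a linear interpolation over the current generation's interval Int_t of relative cost changes; the paper's exact mapping is given by equation (24), which is not reproduced here, and the maximum radius N/2 is an illustrative assumption.

```python
def neighborhood_radius(rel_change, interval, N):
    """A relative change (c(x') - c(x)) / c(x) near the top of Int_t gets a
    SMALL radius (fine local search); near the bottom, a LARGE radius
    (global search). The floor 2/N prevents the radius from reaching zero."""
    lo, hi = interval
    frac = 0.0 if hi == lo else (rel_change - lo) / (hi - lo)
    radius = (1.0 - frac) * (N / 2.0)  # larger change -> smaller radius
    return max(radius, 2.0 / N)
```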
(2) In Step 3, when the iteration number t is fixed, the larger the value |(c(x'_0j) − c(x_0j))/c(x_0j) − Σ_{j=1,...,M} (c(x'_0j) − c(x_0j))/(M · c(x_0j))| is, the larger its corresponding entropy Z_α(A(x_tj)) is. Through this property of Z_α(·), this study focuses on tracking the regions with large changes of function values in the optimization process, and thereby increases the possibility of finding the optimal solution or escaping from a local optimal solution. Besides, in equation (22), the entropy of A(x_tj) decreases with the increase of t (temperature), which means NSAA only accepts new states of individuals with small energy variation, just like the classical SAA. For more details on NSAA, please refer to Figure 3.

Problem Introduction.
In this section, an AP in a dangerous goods warehouse is introduced to verify the proposed NSAA. Assume that there are 14 inbound points for stocking in and 14 outbound points for stocking out. For convenience, the inbound points are denoted as p_i (1 ≤ i ≤ 14) and the outbound points as q_j (1 ≤ j ≤ 14). Assume that 3 forklifts are assigned to inbound and outbound dangerous goods. Here, inbound and outbound activities are carried out simultaneously and in the form of a chain. For any given p_i and q_j, the corresponding chain is denoted L = (l_ij)_{14×14}, where the rows of L correspond to the inbound points and the columns of L correspond to the outbound points. The normalized L is as follows: Denote the fixed cost for operation management per hour as C_r, the normal speed index of the forklifts as τ_v, the time cost of the operation as C_t = C_r · τ_v^{−1} · l′, the influence factor of variable speed on operation time as τ_1, the influence factor of variable speed on fuel cost as τ_2, and the normal fuel cost index for a forklift to travel as C_f. Assume that the problem occurs in the off-season of warehousing activities. Therefore, cost is the only optimization goal. Of course, the cost here includes the opportunity cost caused by storage accidents.
Then, according to a field survey, the comprehensive negative-cost optimization model is obtained as follows:

Optimization Process Using NSAA.
By using the proposed NSAA model, the introduced problem is solved. First of all, a series of parameters are provided, i.e., τ_v = 3.50, C_f = 2.40, τ_1 = 0.05, τ_2 = 0.05, τ_3 = 0.40. Then, by using the proposed NSAA, the introduced problem is solved. Specific steps are as follows.
Step 2. For any given feasible solution x_0j (j = 1, 2, . . . , M) in X_0, generate an initial radius r_0j (i.e., r_tj with t = 0) randomly for x_0j to search for new solutions. Then, Iv_0j1 and Iv_0j2 are generated randomly. By using Iv_0j1 and Iv_0j2, x_0j evolves into x'_0j, and X_0 is transformed into X'_0.
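The transformation in Step 2 amounts to a two-position interchange; a minimal sketch, which ignores the radius constraint that would restrict how far apart Iv_0j1 and Iv_0j2 may be, is:

```python
import random

def swap_neighbor(x, rng):
    """Pick two interchange positions Iv1 and Iv2 at random and swap them,
    producing the transformed solution x' from x."""
    y = list(x)
    i, j = rng.sample(range(len(y)), 2)
    y[i], y[j] = y[j], y[i]
    return y
```

Because exactly one transposition of distinct positions is applied, the result is still a permutation of the same items but differs from the input.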
Step 3. By using equation (15), the objective function values for X_0 and X'_0 are obtained. For any x_0j, by using equation (22), its corresponding NFS is obtained as A(x_0j). By using equation (3), the entropy of A(x_0j) is obtained as Z_α(A(x_0j)).
Step 6. Denote t = t + 1; if t < 500, turn to Step 2, and set the value r_nj for x_nj by using equation (24). Thereafter, according to Step 3, X_1 is transformed into X'_1. Then, turn to Steps 4 and 5, successively.
Step 7. End. Denote t = t + 1; if t < 500, turn to Step 2; otherwise, end the procedure and select the final optimal solution as follows.

Comparative Analysis.
The purpose of this study is to introduce the neutrosophic decision-making idea into the heuristic algorithm. In essence, the existing heuristic algorithms are interlinked. For convenience, this study selects the simulated annealing algorithm as a sample and combines the neutrosophic decision-making idea with it, which has achieved good results. Similarly, neutrosophic decision-making ideas can be combined with other heuristic algorithms. Owing to this similarity, this study does not give a specific introduction.
In this section, based on the example calculation, the NSAA and the classical SAA are compared. Through this comparison, some particular characteristics of NSAA are described specifically.
(1) In NSAA, the new population is generated by neighborhood search and threshold comparison. For the threshold comparison, the entropy of the ensemble SVNS is compared with an exponentially decreasing function that takes the number of iterations as a parameter. At first, to prevent the algorithm from entering a local optimal solution, the value of the decreasing function is larger, and the algorithm accepts more children whose fitness values are worse than their parents'. As the function gradually decreases, the algorithm accepts only the smaller fitness value among the children and parents. Figure 4 shows this difference: at the initial stage of the iteration, there are few contact points between the local minimum fitness value and the global minimum fitness value, but as the number of iterations increases, there are more and more contact points between them, indicating that the children in the later stage of the iteration repeatedly produce individuals with smaller fitness values, which makes it more likely to find the optimal solution. (2) In Figure 5, the performances of NSAA and classical SAA are compared. For example, in one classical SAA algorithm, the acceptance probability is denoted as follows.
When the fitness value of the child is greater than that of the parent, the child is accepted with probability p.
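The two acceptance rules being compared can be sketched side by side. The Metropolis form of p and the exponential threshold schedule c**t are illustrative assumptions of this sketch; the study's exact expressions are given by its own formulas, which are not reproduced here.

```python
import math

def sa_accept_probability(delta, T):
    """Classical SAA: a worse child (delta = f(child) - f(parent) > 0) is
    accepted with probability p; the Metropolis form exp(-delta / T) is the
    usual choice (an assumption here)."""
    return 1.0 if delta <= 0 else math.exp(-delta / T)

def nsaa_accept(ensemble_entropy, t, c=0.99):
    """NSAA-style rule (sketch): keep a worse child while the ensemble SVNS
    entropy exceeds a threshold that decays exponentially with iteration t.
    The base c = 0.99 is an illustrative parameter."""
    return ensemble_entropy > c ** t
```

The contrast is the point of the comparison above: the classical rule gambles on an isolated probability, whereas the NSAA-style rule admits a worse child only when the population-level entropy still exceeds the decaying threshold.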
To overcome the influence of randomness, the NSAA and the SAA have each been run 20 times.

Furthermore, the optimization results are shown in Figure 5, where the blue dots represent the minimum fitness values obtained by using NSAA, and the blue line is the average of the minimum fitness values over the 20 NSAA runs, whereas the red dots represent the minimum fitness values obtained by using SAA, and the red line is the average of the minimum fitness values over the 20 SAA runs. Figure 5 shows that the results obtained by using the novel NSAA are better than those obtained by using the classical one. Notably, this order relation is established in the sense of probability.

Conclusion
In this study, the classical Monte Carlo stochastic process has been replaced by entropy calculation in the framework of NFS, and three major types of uncertain information in SAA have been integrated. Further, a novel NSAA has been proposed, and an example has been given to illustrate its effectiveness. From the illustrative example, the main characteristics of the proposed NSAA are concluded as follows.
Firstly, the proposed NSAA takes the evolution effect of the representative individual as subjective information, takes the overall evolution effect of the population as the benchmark, i.e., objective information, and takes the reciprocal of the number of iterations as nonstructural information. Finally, the multisource information is integrated into a neutrosophic fuzzy element.
Secondly, compared with classical SAA, the proposed NSAA is based on fuzzy calculation rather than probability. With the advantages of fuzzy mathematics, NSAA is more suitable for describing the fuzziness of the iteration effect, which is a breakthrough in optimization theory.
Thirdly, a parameterized threshold function is set in NSAA. By using the time-varying threshold function, detailed control of the iterative direction is realized.
Fourthly, the advantage of the classical SAA is that it can easily escape from the local optimal solution; however, there is blindness in the escape process. Comparatively, the relative relationship between the individual and its corresponding population is considered in NSAA, which can increase the optimization effect of simulated annealing.
Finally, by using an optimal example in warehouse management, the effectiveness of the proposed NSAA is verified.

Data Availability
All of the data used are included in the proposed manuscript.

Conflicts of Interest
The authors declare that they have no conflicts of interest.

Authors' Contributions
Fangwei Zhang conceptualized the study, developed methodology, and wrote the original draft. Zhenrui Chen provided software and validated and investigated the study. Jun Ye reviewed and validated the study. Bing Han edited and validated the study.