Computing the Entropy Measures for the Line Graphs of Some Chemical Networks

Chemical graph entropy plays a significant role in measuring the complexity of chemical structures. It has explicit uses in chemistry, biology, and the information sciences. A molecular structure of a compound consists of many atoms; in particular, hydrocarbons are chemical compounds consisting of carbon and hydrogen atoms. In this article, we discuss the concept of the subdivision of chemical graphs and their corresponding line chemical graphs. More precisely, we discuss the properties of chemical graph entropies and then construct the chemical structures, namely the triangular benzenoid, the hexagonal parallelogram, and the zigzag-edge coronoid fused with starphene. We also estimate the degree-based entropies with the help of the line graphs of the subdivision graphs of the above-mentioned chemical graphs.


Introduction
Mathematical chemistry is a field of theoretical chemistry that uses mathematical approaches to discuss molecular structure without necessarily referring to quantum mechanics [1]. Chemical Graph Theory is a branch of mathematical chemistry in which a chemical phenomenon is theoretically described using graph theory [2,3]. The growth of the organic disciplines has been aided by Chemical Graph Theory [4,5]. In mathematical chemistry, graph invariants or topological indices are numeric quantities that describe various essential features of organic components and are produced from an analogous molecular graph [6,7]. Degree-based indices are among the topological indices used to predict the bioactivity, boiling point, draining energy, stability, and physico-chemical properties of certain chemical compounds [8,9]. Due to their chemical applications, these indices play a significant role in theoretical chemistry. Zhang et al. [10-12] discussed the topological indices of generalized bridge molecular graphs, carbon nanotubes, and products of chemical graphs. Zhang et al. [13-15] provided the physical analysis of the heat of formation and the entropy of ceria oxide. For further study of indices, see [16,17]. Shannon [18] originated the conception of information entropy in communication theory. However, it was later discovered to be a quantity applicable to all objects with a set nature [19,20], including molecular graphs [21-23]. In chemistry, information entropy is now used in two modes. Firstly, it is a structural descriptor for assessing the complexity of chemical structures [24]. Information entropy is useful in this regard for connecting structural and physico-chemical features [25], numerically distinguishing isomers of organic molecules [26], and classifying natural products and synthetic chemicals [27,28]. The physico-chemical interpretation of information entropy is a different mode of application.
As a result, Terenteva and Kobozev demonstrated its utility in analyzing physico-chemical processes that simulate information transmission [29]. Zhdanov [30] used entropy values to study the chemical processes of organic compounds. The information entropy is defined as

ENT_Λ(F) = − Σ_{(lm)∈F_E} [Λ(lm) / Σ_{(l′m′)∈F_E} Λ(l′m′)] log [Λ(lm) / Σ_{(l′m′)∈F_E} Λ(l′m′)].  (1)

Here, the logarithm is taken with base e, while F_V, F_E, and Λ(lm) represent the vertex set, the edge set, and the weight of the edge (lm) in F, respectively. Many graph entropies have been calculated in the literature utilising characteristic polynomials, vertex degrees, and graph order [31-34]. Graph entropies based on independent sets, matchings, and the degrees of vertices [35] have been estimated in recent years. Dehmer and Mowshowitz proposed several graph complexity and Hosoya entropy relationships [23,32,36,37]. For further study, see [19, 21, 38-42, 59, 60]. The graph F is structured into ordered pairs, with one object referred to as the vertex set (F_V) and the other as the edge set (F_E), these vertices being connected by the edges. When two vertices of F share an edge, they are said to be adjacent. The degree of a vertex l is denoted by ℵ(l), and the sum of the degrees of all vertices adjacent to l is denoted by A_l. The subdivision graph S(F) is formed by replacing each edge of F with a path of length two. The line graph is denoted by the symbol L(F); it satisfies |V(L(F))| = |E(F)|, and two vertices of L(F) are adjacent iff their corresponding edges share a common end point in F [43,44].
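The subdivision and line graph constructions above can be illustrated concretely. The following is a minimal Python sketch; the representation of a graph as a set of two-element frozensets and the function names are our illustrative choices, not part of the source material:

```python
# Minimal sketch: a graph is a set of edges, each edge a frozenset {u, v}.

def subdivision(edges):
    """S(F): replace each edge {u, v} of F with a path u - w - v of length two."""
    S = set()
    for e in edges:
        u, v = sorted(e)              # assumes comparable vertex labels
        w = ("s", u, v)               # the new vertex inserted on the edge uv
        S.add(frozenset((u, w)))
        S.add(frozenset((w, v)))
    return S

def line_graph(edges):
    """L(F): vertices are the edges of F; two are adjacent iff they share an endpoint."""
    es = list(edges)
    return {frozenset((es[i], es[j]))
            for i in range(len(es))
            for j in range(i + 1, len(es))
            if es[i] & es[j]}

# Benzene ring C6: L(S(C6)) has |V(L(S(C6)))| = |E(S(C6))| = 2|E(C6)| = 12 vertices.
C6 = {frozenset((i, (i + 1) % 6)) for i in range(6)}
LS_edges = line_graph(subdivision(C6))
```

For C6 the result is the 12-cycle, so both the vertex count and the edge count of L(S(C6)) equal 12, in agreement with |V(L(F))| = |E(F)|.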

1.1. The Randić Entropy [43,44]. If Λ(lm) = (ℵ(l)ℵ(m))^α for a real number α, then (1) gives the Randić entropy.

1.2. Atom Bond Connectivity Entropy [43,44]. If Λ(lm) = √((ℵ(l) + ℵ(m) − 2)/(ℵ(l)ℵ(m))), then Σ_{(lm)∈F_E} Λ(lm) = ABC(F), and thus (1) is converted into the following form, called the atom bond connectivity entropy:

ENT_ABC(F) = log(ABC(F)) − (1/ABC(F)) Σ_{(lm)∈F_E} Λ(lm) log Λ(lm).  (5)

1.3. The Geometric Arithmetic Entropy [43,44]. If Λ(lm) = 2√(ℵ(l)ℵ(m))/(ℵ(l) + ℵ(m)), then Σ_{(lm)∈F_E} Λ(lm) = GA(F), and (1) takes the form given below:

ENT_GA(F) = log(GA(F)) − (1/GA(F)) Σ_{(lm)∈F_E} Λ(lm) log Λ(lm).  (7)

1.4. The Fourth Atom Bond Connectivity Entropy [43,44]. If Λ(lm) = √((A_l + A_m − 2)/(A_l A_m)), then Σ_{(lm)∈F_E} Λ(lm) = ABC4(F), and (1) becomes the fourth atom bond connectivity entropy:

ENT_ABC4(F) = log(ABC4(F)) − (1/ABC4(F)) Σ_{(lm)∈F_E} Λ(lm) log Λ(lm).  (9)

1.5. The Fifth Geometric Arithmetic Entropy [43,44]. If Λ(lm) = 2√(A_l A_m)/(A_l + A_m), then Σ_{(lm)∈F_E} Λ(lm) = GA5(F), and equation (1) is changed to the following form, which is known as the fifth geometric arithmetic entropy:

ENT_GA5(F) = log(GA5(F)) − (1/GA5(F)) Σ_{(lm)∈F_E} Λ(lm) log Λ(lm).  (11)

See [35,44] for further information on these entropy measures.
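For a weight Λ that is constant on each class of an edge partition, (1) simplifies to ENT_Λ(F) = log(Σ Λ) − (1/Σ Λ) Σ Λ(lm) log Λ(lm), which can be evaluated directly from a partition table. A minimal Python sketch (the function names are ours; the ABC and GA edge weights are the standard ones stated above):

```python
from math import sqrt, log

def abc_weight(dl, dm):
    """Standard ABC edge weight for endpoint degrees dl, dm."""
    return sqrt((dl + dm - 2) / (dl * dm))

def ga_weight(dl, dm):
    """Standard GA edge weight for endpoint degrees dl, dm."""
    return 2 * sqrt(dl * dm) / (dl + dm)

def graph_entropy(partition, weight):
    """ENT = log(sum W) - (1/sum W) * sum W_i * log W_i, base-e logarithm.

    partition: list of ((d_l, d_m), count) pairs, one per edge class.
    """
    total = sum(n * weight(dl, dm) for (dl, dm), n in partition)
    s = sum(n * weight(dl, dm) * log(weight(dl, dm)) for (dl, dm), n in partition)
    return log(total) - s / total

# Toy check: in a cycle every edge has the same weight, so the induced
# distribution is uniform over m edges and the entropy equals log(m).
cycle_partition = [((2, 2), 12)]
print(graph_entropy(cycle_partition, ga_weight))  # ≈ 2.4849 (= log 12)
```

Because the weight cancels for a single edge class, any weight function yields log(m) on a regular one-class partition, which is the maximal entropy for m edges.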

Formation of Triangular Benzenoid T_x, ∀x ∈ N

Triangular benzenoids are a family of benzenoid molecular graphs, denoted by T_x, where x is the number of hexagons at the bottom of the graph and (1/2)x(x + 1) is the total number of hexagons in T_x. Triangular benzenoids are a generalization of the benzene molecule C6H6, with benzene rings forming a triangular shape. The benzene molecule is a common molecule in physics, chemistry, and the nanosciences, and synthesizing aromatic chemicals from it is quite fruitful [46]. Raut [47] calculated some topological indices for the triangular benzenoid system. Hussain et al. [48] discussed the irregularity determinants of some benzenoid systems.
Let F = L(S(T_x)), i.e., F is the line graph of the subdivision graph of the triangular benzenoid T_x. We will use the edge partition and vertex counting technique to compute the indices and entropies under consideration. The edge partition of F is based on the degrees of the terminal vertices of each edge. It is easy to see that there are only three types of edges, as shown in Table 1.
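The edge partition technique itself is mechanical and can be sketched in a few lines of Python (a hypothetical helper, not the authors' code): given any graph as a set of two-element frozensets, count the edges by the sorted degree pair of their endpoints.

```python
from collections import Counter

def edge_partition(edges):
    """Partition edges by the (sorted) degrees of their terminal vertices."""
    deg = Counter()
    for e in edges:
        for v in e:
            deg[v] += 1
    part = Counter()
    for e in edges:
        u, v = tuple(e)
        part[tuple(sorted((deg[u], deg[v])))] += 1
    return dict(part)

# Example: the path on 4 vertices.
P4 = {frozenset((0, 1)), frozenset((1, 2)), frozenset((2, 3))}
print(edge_partition(P4))  # two edges of type (1, 2) and one of type (2, 2)
```

Applying such a helper to a constructed L(S(T_x)) reproduces, for each x, the three edge classes recorded in Table 1.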

Entropy Measures for L(S(T_x)).

We calculate the entropies of F = L(S(T_x)) in this section.

The ABC Entropy of L(S(T_x)).

The ABC index and entropy measure, obtained with the help of Table 1 and equation (5), are:

The Geometric Arithmetic Entropy of L(S(T_x)).

The GA index and entropy measure, obtained with the help of Table 1 and equation (7), are:

2.1.4. The ABC4 Entropy of L(S(T_x)). The edge partition of the graph L(S(T_x)), based on the degree sum of the terminal vertices of every edge, is shown in Table 2.

2.1.5. The GA5 Entropy of L(S(T_x)). After some simple calculations, the GA5 index may be computed using Table 2, under the constraint that x ≠ 1.
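The degree-sum-based partition underlying Table 2 can be computed the same way as the degree-based one, replacing the degree ℵ(l) by the neighbor degree sum A_l used by the ABC4 and GA5 measures (a sketch under the same graph representation; names are illustrative):

```python
from collections import Counter, defaultdict

def degree_sum_partition(edges):
    """Partition edges by the (sorted) neighbor degree sums A_l of their endpoints."""
    adj = defaultdict(set)
    for e in edges:
        u, v = tuple(e)
        adj[u].add(v)
        adj[v].add(u)
    # A_l = sum of the degrees of the neighbours of l
    A = {v: sum(len(adj[w]) for w in adj[v]) for v in adj}
    part = Counter()
    for e in edges:
        u, v = tuple(e)
        part[tuple(sorted((A[u], A[v])))] += 1
    return dict(part)

# Example: in the cycle C6 every vertex has A_l = 2 + 2 = 4,
# so all six edges fall in one class of type (4, 4).
C6 = {frozenset((i, (i + 1) % 6)) for i in range(6)}
print(degree_sum_partition(C6))  # every edge of C6 has type (4, 4)
```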

Formation of Hexagonal Parallelogram Nanotubes H(x, y), ∀x, y ∈ N

Hexagonal parallelogram nanotubes are formed by arranging hexagons in a parallelogram fashion. Baig et al. [52] computed counting polynomials of benzenoid carbon nanotubes; see also [53]. We denote this structure by H(x, y), ∀x, y ∈ N, where x and y represent the number of hexagons in any row and column, respectively. The order and size of H(x, y) are 2(x + y + xy) and 3xy + 2x + 2y − 1, respectively. The subdivision graph of H(x, y) and its line graph are shown in Figure 2; see [46]. Let F = L(S(H(x, y))); then |F_V| = 2(3xy + 2x + 2y − 1) and |F_E| = 9xy + 4x + 4y − 5. To compute our results, we use the edge partition technique, which is based on the degrees of the terminal vertices of every edge. It is to be noted that there are only three types of edges; see Figure 2. The edge partition of the chemical graph L(S(H(x, y))) depending on the degrees of the terminal vertices is presented in Table 3.

Table 2: Edge partition of L(S(T_x)).
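The stated order and size of L(S(H(x, y))) can be cross-checked against the general identities |V(L(S(G)))| = 2|E(G)| and |E(L(S(G)))| = |E(G)| + Σ_v C(d(v), 2), assuming (as the figures indicate) that H(x, y) contains only vertices of degree 2 and 3. A consistency sketch, not the authors' derivation:

```python
def check_hexagonal(x, y):
    """Verify |V(L(S(H)))| and |E(L(S(H)))| from the order/size of H(x, y)."""
    V = 2 * (x + y + x * y)             # order of H(x, y)
    E = 3 * x * y + 2 * x + 2 * y - 1   # size of H(x, y)
    # Solve n2 + n3 = V and 2*n2 + 3*n3 = 2*E for the degree-2/3 vertex counts.
    n3 = 2 * E - 2 * V
    n2 = V - n3
    V_LS = 2 * E                        # |V(L(S(G)))| = |E(S(G))| = 2|E(G)|
    E_LS = E + n2 * 1 + n3 * 3          # |E(L(S(G)))| = |E(G)| + sum_v C(d(v), 2)
    assert V_LS == 2 * (3 * x * y + 2 * x + 2 * y - 1)
    assert E_LS == 9 * x * y + 4 * x + 4 * y - 5
    return V_LS, E_LS

# The identities hold for every tested (x, y).
for x in range(1, 6):
    for y in range(1, 6):
        check_hexagonal(x, y)
```

For x = y = 1 this gives (12, 12), the benzene case L(S(C6)) = C12.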
Computational Intelligence and Neuroscience
3.1.2. The ABC Entropy of F. With the use of Table 3 and equation (5), we can calculate the ABC index and entropy measure as follows. Therefore, equation (5), together with Table 3, becomes the following expression, which is called the atom bond connectivity entropy.

3.1.3. The Geometric Arithmetic Entropy of F. We can calculate the GA index and entropy measure using Table 3 and equation (7) as follows.

3.1.4. The ABC4 Entropy of F. The edge partition of L(S(H(x, y))), based on the degree sums of the terminal vertices of every edge, is shown in Table 4. Therefore, the ABC4 index and entropy measure follow from Table 4 and equation (9). Since F has seven kinds of edges, equation (9), by using Table 4, is converted into the corresponding closed form.

Case 2. When x = 1 and y ≠ 1, the same process gives the closed expressions for the ABC4 index and ABC4 entropy.

3.1.5. The Fifth Geometric Arithmetic Entropy of F

Case 3. When x > 1 and y ≠ 1, the fifth geometric arithmetic entropy can be estimated by using equation (11) and Table 4 in the following manner. Thus equation (11), with Table 4, can be written as:

Case 4. When x = 1 and y ≠ 1, by using Table 5 and equation (11), we get the closed expressions for the GA5 index and GA5 entropy as:

Formation from Fusion of Zigzag-Edge Coronoid with Starphene ZCS(x, y, z) Nanotubes
If a zigzag-edge coronoid ZC(x, y, z) is fused with a starphene St(x, y, z), we obtain a composite benzenoid. It is to be noted that |V(ZCS(x, y, z))| = 12(x + y + z) − 54 and |E(ZCS(x, y, z))| = 15(x + y + z) − 63. The subdivision graph of ZCS(x, y, z) and its line graph are illustrated in Figure 3. We can see from the figures that the order and the size of the line graph of the subdivision graph of ZCS(x, y, z) are 30(x + y + z) − 126 and 39(x + y + z) − 153, respectively [46]. Let F represent the line graph of the subdivision graph of ZCS(x, y, z). The edge partition is determined by the degrees of the terminal vertices of every edge; Table 6 illustrates this.

4.1. Entropy Measures for L(S(ZCS(x, y, z))). We calculate the entropies of F = L(S(ZCS(x, y, z))) in this section.
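Writing s = x + y + z and assuming that ZCS(x, y, z) has only vertices of degree 2 and 3 and order 12s − 54 (which equals 36x − 54 when x = y = z), the identities |V(L(S(G)))| = 2|E(G)| and |E(L(S(G)))| = |E(G)| + Σ_v C(d(v), 2) reproduce the stated order and size of the line graph of the subdivision graph. A consistency sketch under those assumptions:

```python
def check_zcs(x, y, z):
    """Verify |V(L(S(ZCS)))| and |E(L(S(ZCS)))| from the order/size of ZCS."""
    s = x + y + z
    V = 12 * s - 54          # assumed order of ZCS(x, y, z); 36x - 54 when x = y = z
    E = 15 * s - 63          # size of ZCS(x, y, z)
    # Solve n2 + n3 = V and 2*n2 + 3*n3 = 2*E for the degree-2/3 vertex counts.
    n3 = 2 * E - 2 * V
    n2 = V - n3
    V_LS = 2 * E                     # |V(L(S(G)))| = |E(S(G))| = 2|E(G)|
    E_LS = E + n2 * 1 + n3 * 3       # |E(L(S(G)))| = |E(G)| + sum_v C(d(v), 2)
    assert V_LS == 30 * s - 126
    assert E_LS == 39 * s - 153
    return V_LS, E_LS

# The identities hold for every tested (x, y, z).
for x in range(2, 6):
    for y in range(2, 6):
        for z in range(2, 6):
            check_zcs(x, y, z)
```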

4.1.2. The ABC Entropy of F.

The ABC index and entropy measure, obtained with the help of Table 6 and equation (5), are:

4.1.3. The Geometric Arithmetic Entropy of F. The GA index and the corresponding entropy, obtained with the help of Table 6 and equation (7), are:

4.1.4. The ABC4 Entropy of F. Table 7 shows the edge partition of the graph L(S(ZCS(x, y, z))), which is based on the degree sums of the terminal vertices of every edge.

Concluding Remarks for Computed Results
The applications of the information-theoretic framework in many disciplines of study, such as biology, physics, engineering, and the social sciences, have grown exponentially over the last two decades. This phenomenal increase has been particularly impressive in the fields of soft computing, molecular biology, and information technology. As a result, scientists may find our numerical and graphical results useful [54,55]. The entropy function is monotonic: as the size of a chemical structure increases, so does the entropy measure, and as the entropy of a system increases, so does the uncertainty regarding its reaction.
For L(S(T_x)), the numerical and graphical results are shown in Tables 8 and 9 and Figures 4-7. In Table 9, the fifth geometric arithmetic entropy is zero, which shows that the process is deterministic for x = 1. When the chemical structure L(S(T_x)) expands, the Randić entropy for α = 1/2 grows more quickly than the other entropy measures of L(S(T_x)), whereas the Randić entropy for α = −1/2 grows more slowly. This demonstrates that different topologies have varied entropy characteristics. For L(S(H(x, y))), the numerical and graphical results are shown in Tables 10-13 and Figures 8-12. When the chemical structure L(S(H(x, y))) expands, the geometric arithmetic entropy grows more quickly than the other entropy measures of L(S(H(x, y))), whereas the ABC4 entropy grows more slowly. Finally, for L(S(ZCS(x, y, z))), the numerical and graphical results are shown in Table 14 and Figures 13-16. When the chemical structure L(S(ZCS(x, y, z))) expands, the geometric arithmetic entropy grows more quickly than the other entropy measures of L(S(ZCS(x, y, z))), whereas the Randić entropy for α = −1 grows more slowly. The novelty of this article is that entropies are computed for three types of benzenoid systems. These entropy measures are useful in estimating the heat of formation and many physico-chemical properties. In the statistical analysis of benzene structures, entropy measures showed more significant results than topological indices. Therefore, we can say that the entropy measure is a newly introduced topological descriptor.

Conclusion
Using Shannon's entropy and the entropy definition of Chen et al. [31], we generated graph entropies associated with a new information function in this research. A relationship is established between indices and information entropies. Using the line graph of the subdivision of these graphs, we estimated the entropies for triangular benzenoids T_x, hexagonal parallelogram nanotubes H(x, y), and ZCS(x, y, z). The thermodynamic entropy of enzyme-substrate complexes [57,58] and the configurational entropy of glass-forming liquids [56] are two examples of thermodynamic entropy employed in molecular dynamics studies of complex chemical systems. Similarly, using information entropy as a crucial structural criterion could be a new step in this direction.

Data Availability
The data used to support the findings of this study are cited at relevant places within the text as references.

Conflicts of Interest
The authors declare that they have no conflicts of interest.