Philosophical Analysis of the Meaning and Nature of Entropy and Negative Entropy Theories

The interpretation of entropy and negative entropy theories on the nature of information is the first scientific paradigm for interpreting the nature of information, dating from the beginning of the rise of contemporary information science. The information entropy or negative entropy theories are relative measurements of the degree of structuralization of a system at a specific level, and they have certain characteristics of relativity, functionality, and quantification. Although the concepts of entropy, negative entropy, information, entropy increase, and entropy decrease often have very different specific meanings in different disciplines and for different scientists, the meanings of these concepts are unified in essence, and the differences between them arise when the same kind of concepts are applied to the study of different aspects of the same kind of phenomena: either to the static degree of structuralization of a system or to the change of its dynamic degree of structuralization. Since entropy and negative entropy theories only measure, from a formal aspect, the relationship of structural differences at a specific level of the system, they are aimed not at the information itself but only at the structuralization characteristics of the information carrier at a specific level. Because of this, it is impossible to deduce the general nature of information directly from such theories.


Introduction
Starting from the works of Claude Elwood Shannon ([1], pages 379-423) and Wiener [2] in the middle of the 20th century, the question of what information is has always been a major theoretical issue for philosophers and scientists. According to some relevant statistics, in the fields of information science, systems science, self-organization theory, complexity theory [3], physics, life science, numerous related interdisciplinary science and technology disciplines [4], as well as the related theories of different philosophical schools, no less than a hundred standard and nonstandard formulations of the nature of information have so far been put forward from different levels and perspectives, but no universally accepted explanation has yet been found.
In truth, if we categorize the existing information concepts into different levels, clarify their origins, and identify their true meanings, then the information concepts with different opinions on the surface will appear less complicated and diverse.
On this basis, it is not very difficult to reveal the general nature of information.
This paper is one of a series of papers in which we try to deal with this problem. Its content specifically analyzes the meaning and nature of the entropy and negative entropy theories related to the discussion of the nature of information. The interpretation of entropy and negative entropy theories on the nature of information is the first scientific paradigm for interpreting the nature of information, one that arose at the beginning of the rise of contemporary information science in the 1940s and has developed to this day. Based on the basic ideas and methods of this scientific paradigm, many theoretical results and a series of technical and applied achievements have been produced.
For example, Arellano Valle et al. [5] have applied the method of generalized skew-normal (GSN) negentropy to the study of the complexity of fish condition factor time series.
Since this interpretation method is proposed in the context of the mutual entanglement and correspondence between the two concepts of entropy and negative entropy, it is necessary to start with the interpretation of the two concepts of entropy and negative entropy and their related theories in order to clarify the specific meaning and nature of this interpretation method.

Entropy Theory
At the beginning, the concept of entropy was proposed in the interpretation of the second law of thermodynamics. At that time, the concept of entropy was not linked to the concept of information.

Clausius Entropy and "Principle of Entropy Increase".
In 1850, the German physicist Rudolf Clausius proposed the second law of thermodynamics. In 1865, he put forward the concept of "entropy" and accordingly expressed the second law of thermodynamics as "the principle of entropy increase." Clausius grasped the concept of entropy only in the sense of "transformation" and "change" (as can be seen from his intentional choice of the Greek word τροπή, "trope," meaning "transformation," to which a prefix was added to parallel the word "energy," constituting the German "Entropie" with the meaning "transformation content" [6]). He only pointed out the "entropy increase phenomenon" and the fact that the state function of entropy can determine the direction and limitation of a physical process in the macro state. However, the absolute value of the entropy of a physical system, and the more general creative meaning and value of entropy, were not clearly defined and explained by Clausius, which made Clausius's "entropy" somewhat mysterious and speculative. Accordingly, academic circles also metaphorically call Clausius entropy "Clausius's demon" or the "entropy demon."

Boltzmann's Statistical Physical Entropy.
In 1877, the Austrian physicist Ludwig Boltzmann gave a probabilistic explanation of the physical meaning of entropy and the principle of entropy increase using statistical methods from the perspective of molecular kinematics ([7], page 34). He pointed out that an isolated system must evolve from a macro state containing a small number of micro states to a macro state containing a large number of micro states; it must inevitably evolve from a state with an uneven probability distribution of its micro states to a state with an even probability distribution of its micro states. The outstanding feature of Boltzmann's work is not only that it introduces a probabilistic method, which provides a feasible solution for the calculation of the absolute value of the system entropy, but also that it reveals the general creative significance and value of the concept of entropy through this calculation. That is to say, what entropy describes is not the existing mode and state of the general mass and energy of the system but the mode and state of the organization, matching, and distribution of that mass and energy.
This links entropy with the fabric order and ordered structure of system elements. For a thermodynamic system, entropy is a measurement of the distribution mode, fabric order, and structural arrangement state of the thermodynamic molecules in the system. Likewise, a change in entropy (entropy increase or decrease) implies a change in the mode and status of the composition, matching, and distribution of system elements. Boltzmann's work not only unveiled the mystery of the concept of entropy and the principle of entropy increase but also revealed that it is the introduction of the concept of entropy that shifted the scientific vision from the study of the mass and energy of general objects to the study of their structure, relationships, and direction of evolution. In addition, Boltzmann's work provided a scientific basis for the generalized development of the entropy concept and entropy theory. If the system elements are no longer just molecules in the sense of thermodynamics, then the structure and relationships of system elements, and the entropy representing them, can acquire a more universal character. This is why the concept of entropy has remained attractive for more than a hundred years, has been linked to concepts such as "information" and "negative entropy," has penetrated widely across many disciplines, and has triggered the formation of a series of emerging, marginal, interdisciplinary, and comprehensive disciplines.
Boltzmann's statistical entropy clearly overcomes many limitations of Clausius entropy and accordingly presents three advantages. First, it can measure the entropy value of the system itself. Second, it directly correlates the entropy value with the number of micro states of the system and their respective probabilities of occurrence, rather than indirectly measuring the entropy change of the system through the intermediate links of heat changes and their effects on the system, as Clausius entropy does. The second advantage also leads to a third: Boltzmann entropy inherently contains the measurement of system entropy values determined by factors other than heat. Because of these three advantages, Boltzmann entropy can easily be extended to a generalized entropy. As long as Boltzmann entropy is not restricted to the fabric mode of a molecular system, it can obtain the broadest universal character.
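The first advantage, computing an absolute entropy directly from the number of micro states, can be illustrated with Boltzmann's relation S = k·ln W for W equiprobable micro states. This is a minimal sketch; the function name and the example micro state counts are ours:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K (exact by SI definition)

def boltzmann_entropy(num_microstates):
    """S = k_B * ln(W): absolute entropy of a macro state realized by
    W equally probable micro states."""
    return K_B * math.log(num_microstates)

# A macro state realized by more micro states has strictly higher entropy.
assert boltzmann_entropy(10**6) > boltzmann_entropy(10**3)

# A single micro state (W = 1) means perfect order and zero entropy.
print(boltzmann_entropy(1))  # 0.0
```

Because the entropy is an absolute value of the system itself, no reference to heat exchange is needed, which is exactly the point of the second and third advantages above.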
Since the value of entropy is directly related to the number of micro states of the system and their respective probabilities of occurrence, the more micro states maintaining the macro state of the system, and the more uniform the probability of occurrence of each micro state, the greater the entropy of the system. The greater the entropy of the system, the greater the degree of disorder, chaos, and uncertainty of the system structure. Thus, in the most general sense, the entropy value is regarded as a measure of the disorder, chaos, and uncertainty of the system structure.
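The claim that entropy grows with the number of micro states and the uniformity of their probabilities can be checked numerically with the probabilistic entropy formula H = −Σ p_i log2 p_i. This is a minimal sketch; the function name and example distributions are ours:

```python
import math

def entropy(probs):
    """H = -sum(p * log2 p): entropy of a distribution over micro states,
    a measure of the disorder/uncertainty of the system structure."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Four equally probable micro states: maximum disorder for n = 4.
uniform = [0.25, 0.25, 0.25, 0.25]
# All probability concentrated in one micro state: maximum order.
ordered = [1.0, 0.0, 0.0, 0.0]

print(entropy(uniform))  # 2.0 = log2(4), the maximum entropy
print(entropy(ordered))  # 0.0, the minimum entropy
```

The uniform distribution attains the maximum log2(n), and any concentration of probability lowers the value, matching the verbal statement above.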

Shannon's "Entropy of Information Sources".
The theory that directly relates the two concepts of entropy and information originated from Shannon's "entropy of information sources" theory, which was proposed more than 80 years after the emergence of Clausius's principle of entropy increase. Shannon's communication information theory was founded in 1948 under the direct enlightenment of Boltzmann's statistical entropy theory. Shannon used two things from Boltzmann's theory: one is the statistical method, and the other is the entropy formula. In Shannon's view, an information source is a set system capable of generating a set of random messages, each with its own probability of occurrence. On this basis, a mathematical formula measuring the information quantity generated by the information source was proposed and titled the "entropy of the information source" ([8], page 8). Shannon's information theory is actually a theory of information entropy, which can also be seen as a theory of entropy.
In fact, the calculation of Shannon's information, like the calculation of physical entropy, reveals the mutual formulations and relations between certain macro and micro states. If we consider the aggregate of all the messages that a source may send as a macro performance of the characteristics of the source, then each message event sent by the source constitutes a micro performance corresponding to this macro performance. It is the comprehensive effect of all these micro performances that reflects the macro characteristics of the source. If every distribution of molecules in a physical system is regarded as a possible message event, then entropy becomes a measure of information in the sense of Shannon. This inherent unity of physical entropy and information entropy illustrates the truth that both entropy and information quantity are measures of a certain state of a material system itself: the popularization of the concept of physical entropy in the field of communication is the uncertainty of the information source, and the embodiment of the concept of information source uncertainty in a molecular physical system is the physical entropy.
In fact, Shannon himself was very clear about the true meaning of the information quantity formula he put forward. He emphasized: "The quantity H = −Σ P_i log P_i ... plays an important role in information theory as a measure of information, choice, and uncertainty. The formula of H is the same as the so-called formula of entropy in statistical mechanics" ([8], page 7). We noticed that many scholars always quote these two sentences when evaluating the nature of Shannon's information quantity: "Information is something used to eliminate random uncertainty"; "Information is eliminated uncertainty." It is pointed out that both of these sentences were spoken by Shannon himself in his article "A Mathematical Theory of Communication." These two sentences have now become the classic definition of Shannon's information quantity. However, after verbatim verification, we did not find such a discourse in Shannon's "A Mathematical Theory of Communication" [1]. On the contrary, what we found was that he repeatedly emphasized that what his information quantity measures is the uncertainty of the information generated by the information source, and that it is the "entropy of the information source" [1].
In fact, the relationship between "uncertainty" and "information" in these two sentences can be traced back to the empiricist school: Locke [9], a British philosopher and thinker, and Hume [10], a British philosopher. Hume once made it clear that more information can be provided by choosing from the greater possibilities. It can be said that this is the source of inspiration for the two sentences above.
Warren Weaver (1894-1978), a well-known American scientist and social activist, wrote detailed comments on Shannon's information quantity in the book "The Mathematical Theory of Communication" [11], coauthored with Shannon in 1949. He emphasized three levels of communication issues: technical issues, semantic issues, and utility issues. He believed that although Shannon's information quantity focuses only on technical issues, this does not mean that the engineering and technical aspects of communication have nothing to do with semantic and utility issues. In his related comments, he particularly emphasized that "information is a measure of one's freedom of choice in selecting a message" ([12], pages 614-616, 619). In this way, according to Weaver's evaluation, the interpretation of information quantity in communications cannot be limited merely to the "entropy of the information source," as it can also be related to issues of meaning and utility, as well as to the subjective activities of a person's selection and reception of information. From this, we notice that Shannon emphasized that his information quantity is "a measure of information, choice and uncertainty" and is a measure of how much "the possibility of choice" is involved in the choice of events, or of how uncertain the result of the choice is ([8], page 7). The term "choice" was already used in his theory. Any kind of "choice" cannot be a purely objective activity of the information source itself and cannot be separated from the corresponding activities of the person as the subject. In this regard, in the stipulation of Shannon's information quantity, there are inevitably factors such as the meaning of the message and its influence on the receiver.
Inferring from this, the argument that Shannon's information quantity "is used to eliminate random uncertainty" and "is the eliminated uncertainty" is not completely false, but if these claims are attributed to him without further thought and differentiation, they may distort his original intention.
In addition, we also noticed that, in Shannon's information theory, the quantitative calculation of information quantity is based on the difference between the probability that a message actually occurs and the probability that it may occur, that is, I = log(1/P1) − log(1/P2) = log(P2/P1), where P1 and P2 are the a priori and a posteriori probabilities, respectively. It can also be concluded from this that Shannon's information quantity is "something used to eliminate random uncertainty" and "the eliminated uncertainty." According to the above discussion, Shannon's formula of information entropy, H = −Σ P_i log P_i, can be interpreted in multiple senses.
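The difference between a priori and a posteriori uncertainty described above can be sketched as an information gain I = log2(P2/P1); this is a standard reading of the formula the text refers to, not Shannon's verbatim notation, and the function name is ours:

```python
import math

def information_gain(p_prior, p_posterior):
    """I = log2(P2 / P1): the uncertainty eliminated when the probability of
    a message rises from the a priori value P1 to the a posteriori value P2."""
    return math.log2(p_posterior / p_prior)

# Before reception, a message is one of 8 equally likely alternatives (P1 = 1/8).
# After error-free reception, it is known with certainty (P2 = 1).
print(information_gain(1/8, 1.0))  # 3.0 bits of eliminated uncertainty
```

When P2 = 1 (the message is received without error), the gain reduces to log2(1/P1), the full a priori uncertainty of the message.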
It can be a measure of the randomness of the message sent by the source, a measure of the a priori uncertainty of the message generated by the source, a measure of the ability of a source to send information, a measure of the uncertainty of choosing among multiple messages, a measure of the average information quantity (average eliminated uncertainty) carried by each message, or a measure of the average degree to which the uncertainty of the information sink is changed. These meanings can be roughly divided into two categories: those relative to the characteristics of the information source, and those relative to the characteristics of the information source changing the state of the information sink. If it is aimed at the characteristics of the information source itself, the information entropy formula can be regarded as a measure of the entropy value of the information source itself. This is what Shannon called "the entropy of the information source," the "measure of uncertainty" of the information generated by the source. If it is aimed at the characteristics of the information source changing the state of the information sink, it is an inference made by some later scholars based on the possible nature of Shannon's information quantity: Shannon's information quantity is "something to eliminate random uncertainty" and "is eliminated uncertainty." According to the latter explanation, this kind of information quantity is no longer "entropy" but has the meaning and value of "negative entropy," which is opposite to "entropy" and eliminates "entropy," that is, eliminates uncertainty. It is from this starting point that we can assert that although Shannon's "entropy of information sources" theory measures the uncertainty of the information generated by information sources, it has paved the way for related theories that use negative entropy to explain information.

Schrödinger's "Life Feeds on Negative Entropy".
In the field of general science, the scientist who first proposed the concept of negative entropy in a relationship corresponding to the concept of entropy was Erwin Schrödinger, the well-known Austrian physicist and one of the founders of quantum mechanics [13]. In 1944, he wrote in What Is Life? the famous saying that "life feeds on negative entropy" and considered negative entropy: "entropy with negative sign is a measure of order." He wrote: "How does the living organism avoid decline to equilibrium? Obviously, it is by eating, drinking, breathing and (plant) assimilation. The special term is 'metabolism' ..., meaning change or exchange. Here comes a question: what to exchange? At first, it was undoubtedly referring to the exchange of matter ... But it is absurd to think that it is essentially the exchange of matter. Any atom of nitrogen, oxygen, sulfur, etc., in the organism is the same as an atom of the same kind in the environment. What advantage could be brought by exchanging them? Later, some people said that we live on energy ... In fact, it is ridiculous, because the energy contained in an adult organism is fixed, just like the matter it contains. Since the value of one calorie in the body is the same as one calorie outside the body, it is really hard to understand the usefulness of pure exchange. What precious thing in our food can save us from death? This is easy to answer ... A living organism is constantly generating entropy (or, it can be said, increasing positive entropy) and gradually approaching the dangerous state of maximum entropy, namely death. The only way to get rid of death and to live is to continuously draw negative entropy from the environment. We will soon understand that negative entropy is very positive. Organisms live on negative entropy. Or, to be clearer, the essence of metabolism is to enable an organism to successfully eliminate all the entropy that it has to generate while it is alive.
"'Life feeds on negative entropy': just as a living organism attracts a string of negative entropy to offset the increase in entropy it generates in life, so it maintains itself at a stable and low entropy level." "The way an organism stabilizes itself at a highly ordered level (equivalent to a fairly low level of entropy) is indeed to continuously draw order from the surrounding environment ... In fact, as far as higher animals are concerned, it has long been known that they live entirely on absorbed order. In the organisms they take as food, with varying degrees of complexity, the state of matter is extremely orderly. After consuming these foods, animals excrete greatly degraded things" ([14], pages 69-70, 72). From these statements by Schrödinger, we realize very clearly that organisms do not devour food, moisture, and air for the purpose of obtaining matter and energy. What the organism really needs to absorb from the environment is "negative entropy," "order," "orderliness," "organization," and so on. The concepts of "negative entropy," "order," "orderliness," and "organization" discussed in communication and control theory, and in some related theories developed later, are interlinked with the functional interpretation of "information" in the most general sense. In this regard, "life feeds on negative entropy" can be interpreted as "life feeds on information." The negative entropy theory of life proposed by Schrödinger actually opened up a research direction of information theory, which is to study the entropy change of a system when it is open, instead of working only under the condition of isolated systems, as the second law of thermodynamics does.
The openness of the system to the environment, and the fact that the system and the environment maintain a certain degree of exchange of matter, energy, and information, are the basic conditions on which all negative entropy theories are established.
Starting from Schrödinger's work, "negative entropy" acquired a nature opposed to the concept of "entropy." If entropy describes the degree of disorder, chaos, and uncertainty of a system, then negative entropy describes its degree of order, organization, and certainty. From the perspective of functional relative effect, negative entropy is the elimination of entropy and the elimination of uncertainty. From this, we can more clearly grasp and understand the basic perspective and nature of a series of related formulations and interpretations of the concept of information made in the subsequent development of information science.

Wiener's "Information Is Negative Entropy".
Almost at the same time as Shannon founded his theory of communication information entropy, Wiener, the American mathematician and founder of cybernetics, also proposed the theory of information as negative entropy in the process of establishing cybernetics by integrating the theories of communication and automatic control. In his book "Cybernetics" [2], published in 1948, he independently presented Wiener's formula, which is only one minus sign away from Shannon's information quantity formula. He wrote: "The information quantity is the negative number of the logarithm of a quantity that can be regarded as a probability, which is essentially negative entropy" ([15], pages 11, 65). From this, we can also reasonably answer the question: why do Wiener's information formula and Shannon's formula differ by a negative sign? It is because the former measures "negative entropy," while the latter measures "entropy." Perhaps an analysis from the perspective of differences in cognitive methods can help us find the root cause of the difference between the information quantities of Shannon and Wiener ([16], pages 33-34).
We know that, in the field of mechanical communication, the set of message primitives a source may send and the probability of sending each message are predetermined, and the information sink is fully aware of this determination. The a priori estimation by the sink of the uncertainty of the message sent by the source is also derived from this predetermination. In this way, the uncertainty of what kind of message the source sends can be considered both as a feature of the sink's estimation of the information source's state and as a feature of the source itself. The difference of a minus sign between Shannon's and Wiener's information quantity formulas can be regarded as the result of examining these two different perspectives. The information quantity of communication can be deduced and calculated according to the principle of relativity of interaction and mutual stipulation between source and sink. This leads to the stipulation of "a priori probability" and "a posteriori probability." If the information quantity formula is deduced from the perspective of the state characteristics of the source itself, according to the principle of Shannon, then the contribution of the prior probability to the information quantity is reversed, because it provides the "uncertainty" of the source estimated by the sink, and its direction is opposite to the direction of the source's own characteristics. The posterior probability contributes positively to the information quantity, because it provides the information state of the source that actually occurs at the moment, and its direction is consistent with the direction of the source's own state characteristics. The expression in logarithmic function is I = log P2 − log P1. If, like Wiener, the information formula is derived from the perspective of the sink's understanding of the source, then the contribution of the prior probability to the information quantity is positive.
On the contrary, the contribution of the posterior probability to the information quantity is reversed. Thus, Wiener's formula of information quantity differs from the Shannon formula by a minus sign: I′ = log P1 − log P2. The fact that the information quantity formulas of Shannon and Wiener can be deduced from the two opposite angles and directions of the mutual interaction and determination of information source and sink indicates that the difference of a negative sign between these two formulas has a profound root in epistemology. To a certain extent and significance, this reflects the difference and unity of philosophical ontology and epistemological methods. Regrettably, this has not been clearly noticed in the past.
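The two opposite derivations above can be made concrete in a few lines. The function names are ours, and the formulas simply follow the sign conventions just described:

```python
import math

def shannon_quantity(p_prior, p_posterior):
    # Source-centred view: the prior probability contributes negatively
    # (it encodes the sink's estimated uncertainty), the posterior positively.
    return math.log2(p_posterior) - math.log2(p_prior)

def wiener_quantity(p_prior, p_posterior):
    # Sink-centred view: the prior contributes positively, the posterior
    # negatively, yielding the same magnitude with the opposite sign.
    return math.log2(p_prior) - math.log2(p_posterior)

p1, p2 = 0.25, 1.0  # a priori and a posteriori probabilities of a message
print(shannon_quantity(p1, p2))  # 2.0
print(wiener_quantity(p1, p2))   # -2.0
assert shannon_quantity(p1, p2) == -wiener_quantity(p1, p2)
```

The single assertion at the end is the whole epistemological point in miniature: the two formulas always differ by exactly a minus sign, whatever the probabilities.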
It should be said that Wiener's thinking is the same as Schrödinger's. Schrödinger's negative entropy of life is used to calculate the ability to resist the spontaneous entropy increase in the living body, while Wiener's information quantity is used to calculate the amount of new knowledge brought to the receiver by the message. Both have two basic points in common: ① the system is open, and ② it can eliminate its own chaos by means of the environment. Here, what Wiener's information quantity calculates is exactly what Schrödinger's negative entropy calculates. It is no wonder that Wiener repeatedly emphasized the idea that the information quantity is negative entropy. Again, we see that the crux of the problem lies not in the names of the concepts used but in the kind of problems that these concepts are used to explore.

Negative Entropy Flow of Prigogine.
While theories of entropy, information, and negative entropy were being applied and developed in more and more disciplines, classical thermodynamics, which takes entropy theory and the principle of entropy increase as its basic characteristics, also kept developing.
This development finally broke through the limitations that those basic characteristics had imposed on classical thermodynamics itself.
The Brussels school, represented by the Belgian physicist and chemist Prigogine [17], reinterpreted the second law of thermodynamics based on a series of experiments and proposed the famous negative entropy theory of dissipative structures in the 1960s [18]. It pointed out that the principle of entropy increase holds only in isolated systems. For an open system, two factors must be considered: the external entropy flow caused by the exchange between the system and the environment, and the entropy generation within the system. On this basis, Prigogine proposed a generalized second law of thermodynamics, which is applicable to both open systems and isolated systems.
Prigogine pointed out that the entropy change of a system is caused by two factors: the entropy exchanged between the system and the environment during their interaction (deS, the external entropy flow) and the entropy generated spontaneously within the system (diS, the internal entropy production), so that dS = deS + diS. For an isolated system, since there is no exchange of matter and energy between the system and the environment, there can be no exchange of entropy. Therefore, in an isolated system, deS = 0, so dS = diS ≥ 0. This is the (narrow) second law of thermodynamics proposed by Clausius. For an open system, there is an exchange of entropy accompanying the exchange of matter and energy between the system and the environment. Therefore, in an open system, the total entropy change of the system presents a complex scenario. When the external entropy flow is negative and its absolute value is greater than the internal entropy production, the system will move towards order along the direction of decreasing entropy. It can be said that Clausius's second law of thermodynamics is just a special case of the generalized second law of thermodynamics in an isolated system.
It is the generalized second law of thermodynamics proposed by Prigogine that reveals the inevitability of an orderly evolution of a system along the direction of decreasing entropy under a suitable and open background. In dissipative structure theory, the system introduces negative entropy flow from the outside to resist the increase of internal entropy, which is completely consistent with the basic ideas of Schrödinger's "negative entropy theory of life" and Wiener's "negative entropy theory of information." However, dissipative structure theory has extended the scope of negative entropy to general physical and chemical systems. The essence of this expansion is to bring the entropy, negative entropy, and information theories into the all-embracing objective world, since every system follows the general laws of physics and chemistry.
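The entropy bookkeeping of the generalized second law, dS = deS + diS, can be sketched numerically. The values below are illustrative only, not measured data:

```python
def total_entropy_change(d_e_s, d_i_s):
    """Generalized second law: dS = deS + diS, where the internal entropy
    production diS is never negative."""
    assert d_i_s >= 0, "internal entropy production is never negative"
    return d_e_s + d_i_s

# Isolated system: no exchange, so deS = 0 and dS = diS >= 0 (Clausius's case).
print(total_entropy_change(0.0, 0.5))   # 0.5

# Open system with a negative external entropy flow whose absolute value
# exceeds the internal production: dS < 0, the system moves towards order.
print(total_entropy_change(-1.2, 0.5))  # -0.7
```

The second call is precisely the dissipative-structure scenario: |deS| > diS with deS < 0 yields a net entropy decrease, something an isolated system can never achieve.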

Philosophical Interpretation of the Significance and Nature of Information Entropy and Information Negative Entropy Theories
In the analysis of the related traditional literature, entropy and negative entropy are two concepts that correspond to each other with opposite meanings. However, if we study them further, we will see that the two concepts have the same meaning and mutually formulated properties. Generally speaking, the concept of entropy is a measure of the degree of uncertainty of the fabric mode of the micro states of a system. It can reveal, from a specific level and angle, a quantitative measurement of the degree of disorder of the system's organization. Boltzmann's statistical physical entropy and Shannon's "entropy of information sources" are both established in this sense.
As for the concept of negative entropy, it can be formulated in two different senses in the related general theories: one is the degree to which the organization mode of a system's structure deviates from the standard value (the maximum entropy value), and the other is the degree to which the entropy value of a system decreases in the process of the change of the system's organization mode. If a formal description of the fabric mode of a system is needed, two quantities are required: one is the number of possible micro states of the system, and the other is the probability with which each micro state may occur. If \(A = \{a_1, a_2, \ldots, a_n\}\) is a set that represents the possible micro states of the system and \(P = \{p_1, p_2, \ldots, p_n\}\) is a set that represents the probability of occurrence of each micro state, then the organization mode of the formal structure of the system (M) can be expressed by a matrix as follows:

\[
M = \begin{pmatrix} A \\ P \end{pmatrix} = \begin{pmatrix} a_1 & a_2 & a_3 & \cdots & a_n \\ p_1 & p_2 & p_3 & \cdots & p_n \end{pmatrix}. \tag{4}
\]

The organization mode of the system's structure described by this matrix may be in two extreme circumstances: one is the state of maximum entropy, in which case \(p_1 = p_2 = \cdots = p_n = 1/n\) and \(s_{\max} = \log n\); the other is the state of minimum entropy, in which case \(p_1 = 1\), \(p_2 = p_3 = \cdots = p_n = 0\), and \(s = \log 1 = 0\).
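The two extreme fabric modes just described can be checked numerically. The sketch below is our own illustration, not from the source: it computes the Shannon-style entropy of a distribution over n micro states in natural-log units, and the function name `shannon_entropy` is an assumption of this sketch.

```python
import math

def shannon_entropy(p):
    """Entropy s = -sum(p_i * log(p_i)) in natural-log units.
    Terms with p_i == 0 contribute nothing, by the usual convention."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

n = 4
uniform = [1.0 / n] * n          # maximum-entropy fabric mode: p_i = 1/n
certain = [1.0, 0.0, 0.0, 0.0]   # minimum-entropy fabric mode: one state certain

s_max = shannon_entropy(uniform)  # equals log n
s_min = shannon_entropy(certain)  # equals log 1 = 0
```

Running this for n = 4 gives s_max = log 4 and s_min = 0, matching the two extremes in the text.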
If we determine that the case of \(s_{\max}\) is the standard value to which the organization mode of the system's structure should be referenced, then all cases where the system's entropy is less than \(s_{\max}\) can be regarded as a deviation from this standard value. What causes this deviation? To what extent does it deviate? Obviously, there should be a concept to specify this factor, and there should be a calculation to measure the extent of this deviation. A very natural idea is that the effect of this factor is the opposite of entropy, which is negative entropy, and that the calculation should be the difference between the standard entropy value and the actual entropy value. Based on this idea, we can obtain the following formula for negative entropy ( [19] (pages 67-74)):

\[
\text{negative entropy} = s_{\max} - s.
\]
Obviously, there are two extreme circumstances of this formula: one is that when \(s = s_{\max}\), the negative entropy value of the system is 0; the other is that when \(s = 0\), the negative entropy value of the system is maximal and equal to \(s_{\max}\).
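The formula negative entropy = s_max − s and its two extremes can be illustrated with a short sketch (again our own illustration, with s_max taken as log n for an n-state system):

```python
import math

def entropy(p):
    """Shannon-style entropy in natural-log units; zero terms are skipped."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def negative_entropy(p):
    """Deviation of the system's entropy from the maximum-entropy standard
    value for the same number of micro states: s_max - s."""
    s_max = math.log(len(p))
    return s_max - entropy(p)

# The two extreme circumstances discussed in the text:
neg_at_equilibrium = negative_entropy([0.25, 0.25, 0.25, 0.25])  # s = s_max, so 0
neg_at_certainty = negative_entropy([1.0, 0.0, 0.0, 0.0])        # s = 0, so log n
```

For the four-state system, the equilibrium distribution yields negative entropy 0, while the fully determined distribution yields the maximum value log 4.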
Negative entropy not only can be specified in the sense that the entropy value of the organization mode of a specific system deviates from the standard value (the maximum entropy) but also can be specified in the sense that the entropy value decreases in the process of the change of the organization mode of the system. Schrödinger's negative entropy theory of life, Wiener's negative entropy theory of information, Prigogine's negative entropy theory of dissipative structure, and so forth are, in essence, all defined in the sense of entropy decrease. Negative entropy as a measure of the degree of entropy decrease and entropy (Clausius entropy) as a measure of the degree of entropy increase are not measures of the system's absolute negative entropy or entropy value; they are measures of some kind of "change" or "transformation" of quantity, that is, measures of relative quantity. Whether it is Schrödinger's "life feeds on negative entropy," Wiener's "how much new information is given to us by hindsight," or Prigogine's factors that resist the spontaneous entropy increase in the system, all are developed from the perspective of the relative functions that lead to changes in the organization mode (degree of uncertainty) of the system. Just as the entropy increase effect does not simply depend on how much heat is absorbed by the system but also on the relative degree of change that the heat brings to the organization mode of the system's original structure, the entropy decrease effect also does not simply depend on what kind of message the system receives or what kind of mass or energy with a certain value of entropy or negative entropy the system absorbs but also on the relative degree of change (in the degree of uncertainty) that the message, mass, or energy brings to the organization mode of the original structure of the system. This brings up a very interesting phenomenon.
The same mass, energy, or message, acting on systems in different structural states, will play very different roles for different receivers: it may lead to entropy increase or entropy decrease, add new information, cause ideological disorder, or do nothing (leaving the original structural mode and the original cognitive state unchanged). This is why the Clausius entropy increase formula has 1/T as the integral factor, and Wiener's information formula has a prior probability as the reference factor. Although concepts such as entropy, negative entropy, information, entropy increase, and entropy decrease often have very different specific meanings in different disciplines and for different scientists, these concepts are essentially consistent in nature because they all study the same kind of phenomena in a unified sense, and the differences between them emerge when the same kinds of concepts are applied to the research of different directions of the same kind of phenomena.
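The role of the prior as a reference factor can be made concrete. As a hedged sketch (not Wiener's own formula, but a common quantitative reading of it), measure the information a message delivers to a receiver as the receiver's prior uncertainty minus the posterior uncertainty; the same message then carries a different amount for differently prepared receivers:

```python
import math

def entropy(p):
    """Shannon entropy in bits; zero-probability terms are skipped."""
    return -sum(pi * math.log(pi, 2) for pi in p if pi > 0)

# The same message acts differently on receivers with different priors:
prior_ignorant = [0.25, 0.25, 0.25, 0.25]  # receiver initially knows nothing
prior_informed = [0.97, 0.01, 0.01, 0.01]  # receiver is already nearly certain
posterior = [1.0, 0.0, 0.0, 0.0]           # the message settles the question

gain_ignorant = entropy(prior_ignorant) - entropy(posterior)  # large gain
gain_informed = entropy(prior_informed) - entropy(posterior)  # small gain
```

The ignorant receiver gains 2 bits; the nearly certain receiver gains only a fraction of a bit from the very same message, illustrating why the prior serves as the reference factor.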
It is reasonable to distinguish the formulations of these concepts into two categories: one is a formulation given in a static sense and the other is a formulation given in a dynamic sense. In this way, we can clearly see that the ambiguous interpretation of these concepts is often caused by the confusion of these two types of formulations.
In essence, Boltzmann's statistical entropy, Shannon's information entropy, and the negative entropy indicating the degree of deviation from the standard entropy value of the system pointed out above are all quantitative formulations of entropy (information entropy) and negative entropy (information) in the sense of the static state and absolute value of the system. The basic meaning of this formulation is to calculate the degree of indeterminacy (uncertainty) of the micro state of a certain system and the extent to which this degree of indeterminacy deviates from the maximum possible degree of indeterminacy. This point can be clearly seen from the previous comparative interpretation of the statistical entropy formula and the information entropy formula, as well as from our explanation of "negative entropy = s_max − s." Some texts believe that the statistical entropy formula calculates the "entropy (change) of the system in a certain process," while Shannon's information quantity calculates "the information quantity (change) of the system in a certain process" ( [20] (pages 20-27)); this statement is incorrect. We also want to emphasize one point: whether it is Boltzmann's statistical entropy, Shannon's information entropy, or the negative entropy indicating the degree of deviation from the standard value of the system, each is still just a quantitative concept, and none of them can precisely define the general nature of the abstract meaning of entropy, negative entropy, and information. In terms of methodology, the definition of the abstract general nature of such concepts is not a task of these specific sciences but only a philosophical subject. If the concepts are to be used accurately, we should speak instead of the quantity of entropy, negative entropy, or information.
The dynamic formulations of the concepts of entropy and information are developed in two directions: one is the direction of entropy increase based on the second law of thermodynamics, and the other is the direction of entropy decrease within the framework of the various negative entropy theories constructed in the sense of resisting the entropy increase of the system. A very interesting fact is that the research on the dynamics of entropy and information came earlier than the research on its statics: Clausius was already quantifying the changes of entropy when people did not yet really understand what entropy was.
The various forms of negative entropy theory are dynamic measures of changes in information (entropy) from the direction opposite to the second law of thermodynamics. Schrödinger, Wiener, and Prigogine all share a common idea: the system can input external entropy (information) flow from the environment to resist the increasing trend of entropy within the system. They measure the amount of external entropy (information) flow by the amount of change in entropy (information) within the system that it causes. Because the external entropy (information) flow may cause an entropy decrease effect within the system, the quantity of this external flow can be measured by the degree of the entropy decrease effect it causes, and it is simultaneously and relatively defined as negative entropy.
It seems that the function \(\sum_i p_i \log p_i\) has some unique value. In the static state, its absolute value indicates the uncertainty of the micro state in which the system is located. Dynamically, the change of the function value indicates the change of that uncertainty. This change is caused by a change in the value of n, which indicates the number of micro states of the system, and by a change in the probability distribution of the \(p_i\). In general, an increase in the value of n and a tendency of the \(p_i\) towards equality result in a process of entropy increase, while a decrease in the value of n and a tendency of the \(p_i\) towards differentiation result in a process of entropy decrease. As for the general idea that information and entropy are regarded as opposites, it is more like an artificial formulation. The statement that the entropy decrease effect of the system is caused by external information is equivalent to the statement that the system's entropy decrease effect is caused by the entropy flow introduced from the external environment. Prigogine uses external entropy flow, Schrödinger uses negative entropy, and Wiener uses both information and negative entropy; in fact, they are measuring the quantity of the same type of change in the same process. We have every reason to regard the various theories of entropy, information, and negative entropy as theories about the quantity of entropy, and, at the same time, we have every reason to regard them as theories about the quantity of information. Based on this, we can establish a kind of generalized entropy theory, or a generalized information theory, to unify the discussions of the quantity of entropy and information that have been and are being carried out in different disciplines.
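The two dynamic tendencies described above (entropy increase with growing n and equalizing p_i; entropy decrease with shrinking n and differentiating p_i) can be checked with a small numerical sketch of our own; the particular distributions are arbitrary examples:

```python
import math

def entropy(p):
    """Shannon-style entropy in natural-log units; zero terms are skipped."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

# Entropy increase: n grows and the probabilities tend towards equality.
before_increase = [0.7, 0.2, 0.1]
after_increase = [0.25, 0.25, 0.25, 0.25]

# Entropy decrease: n shrinks and the probabilities differentiate.
before_decrease = [0.25, 0.25, 0.25, 0.25]
after_decrease = [0.9, 0.1]

delta_up = entropy(after_increase) - entropy(before_increase)    # positive
delta_down = entropy(after_decrease) - entropy(before_decrease)  # negative
```

The sign of the change of the function value, not its absolute level, is what the dynamic formulations of entropy increase and entropy decrease track.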

Comments and Conclusions
At this point, we are able to evaluate and summarize the nature of the relevant information entropy or information negative entropy theories. Firstly, the information entropy or negative entropy theories are relative measurements of the structuralization degree at a specific level of the system, and so have certain characteristics of relativity. The information entropy theory measures the diversity of structural organization and the degree of uncertainty difference at a specific level of the system, while the information negative entropy theory measures the diversity of structural organization modes and the degree of uncertainty reduction at a specific level of the system.
Secondly, we notice that, in the general theory, the concept of information is defined and explained in a special sense as negative entropy. There are two related statements in communication and control theory: "information is the eliminated uncertainty" and "information is negative entropy." However, these statements only emphasize, from a specific perspective, the role of information to the receiver; they are functional definitions of information, and such definitions do not reveal what information is. It is impossible to reveal the nature of information from such an interpretation.
Moreover, the information entropy or negative entropy theories only measure the state of a certain aspect of the system, and the degree of change of that state, by a certain method of calculating quantity. In this regard, the information entropy or negative entropy theories have a specifically defined quantitative characteristic.
In terms of such characteristics of relativity, functionality, and quantification, the theory of information entropy or negative entropy is, in essence, only a technical quantitative processing method for mechanical communication and control processes, not a theory about the nature of information.
It is necessary to mention here that, as early as 1928, Hartley (1890-1970), an American communications expert, pointed out in the article "Transmission of Information" that "Information refers to the message with new content and knowledge" ( [21] (page 535)).
This is also the acquired definition of information recognized and expressed by people in their daily life and in the general literature. Obviously, this definition is in line with the meanings of "information is the eliminated uncertainty" and "information is negative entropy" mentioned earlier, since it is formulated in the sense of whether the message can bring new content to the receiver. Such a definition is likewise relative and functional and cannot serve as an essential explanation of what information is.
People usually regard "information is negative entropy" as Wiener's standard definition of information and interpret the general nature of information from it. However, they have not seriously conducted discrimination and analysis but have extended at will an explanation that Wiener made only in the sense of a relative, functional, quantitative description for technical processing to a universal scope. In fact, Wiener's statement that "information is negative entropy" is just a practical interpretation of communication and control information from the perspective of technical processing, using the existing calculation methods of entropy, and is only a measure of the amount of practical information. What he sought was only a method of quantitative processing realized by technology, not to reveal the general nature of information at all. By the same token, the statement that "information is the eliminated uncertainty" focuses only on a quantitative processing method realized by technology. As some scholars pointed out in interdisciplinary research on information, "Wiener's types of mathematical definitions of information related to mathematical or physical concepts of negative entropy cannot adequately encompass the experiential embodied pragmatic semantic meaningful content of ordinary sign games of living systems and the language games of embodied conscious humans" ( [22] (pages 622-633)).
In fact, Wiener himself was very clear in what sense his "information is negative entropy" was used, because when he put forward this statement, he also made a corresponding discussion of the general nature of information. He has two influential formulations. One is "information is information, not matter or energy. No materialism which does not admit this can survive in the present day" ( [15] (page 133)). The second is "information is the name of the content that we exchange with the external world in the process of adapting to it and making this adaptation felt by the external world" ( [23] (page 4)).
Obviously, the first sentence of Wiener emphasizes the ontological status of information. Although he failed to correctly define the nature of information from the positive side in this sentence, he correctly emphasized the independent value and significance of information compared with matter (quality) and energy, and he also put forward a warning about the relevant materialism theory that failed to make a reasonable interpretation of the ontological status of information.
Wiener's second sentence further emphasizes the need to clarify the general nature of information. Instead of simply focusing on the form of the information carrier or the function of the information, we should grasp information on the basis of what we "exchange with the outside world." Since it is an "exchange," there should be both in and out. In this way, there is information not only within our subject but also in the external environment, and the corresponding doctrines of objective information and subjective information should be valid. This also shows the true charm of the saying that "information is information, not matter or energy," which Wiener emphasizes. It is regrettable that, for a long time, Wiener's clear warning to philosophy has not attracted the attention of more philosophers and scientists. Not only has the revolutionary value of information for the development of philosophy not been clearly revealed, but unified information science has also not been established, because the establishment of unified information science must be based on the general theory of information philosophy.
In addition, we should also note that the statements "information is the eliminated uncertainty" and "information is negative entropy" are single-faceted even in the sense of a functional definition. In the real world, the role of information is multifaceted and multilayered: it can not only eliminate uncertainty but also increase it; it can play the role of negative entropy as well as the role of entropy. For example, when a person is sick, he should take medicine to eliminate the disorder caused by the disease in his body, but what happens if he takes the wrong medicine? Obviously, the medicine will provide him with the corresponding information, but this information does not always play a role in eliminating uncertainty or in acting as negative entropy. In some cases, it may play the opposite role, increasing uncertainty or entropy.
An ancient Chinese literature, "Stratagems of the Warring States, Qin Stratagem II," tells a parable of "terrifying rumor." It was said that Zeng Zi's mother was weaving at home, and a neighbor came to tell her that "Zeng Zi has killed someone." Zeng Zi's mother did not believe and said, "I know my son, he will not kill people." She continued to weave calmly. After a while, another neighbor came to tell her that "Zeng Zi has killed someone." Zeng Zi's mother still did not believe it and said, "He won't kill anyone" and continued to weave. However, when the third neighbor came to tell her "Zeng Zi has killed someone," finally, Zeng Zi's mother could not sit still, and she put down her work and fled across the wall ...
In this parable, what effect does the information that "Zeng Zi has killed someone" have on his mother? Is it entropy or negative entropy? Is it entropy increase or decrease? Is uncertainty increased or eliminated?
Also, if we generalize the functional definition of "information is the eliminated uncertainty," then we will see some very ridiculous scenarios. In a book published as early as 1987, Wu once wrote: "The role of information is fundamentally different from what information itself is. The nature of information can only be sought from the inner basis of its own content, but cannot be formulated simply by its effect on a certain aspect of the sink. Just as the definition of food cannot be 'eliminated hunger,' the definition of information cannot be 'eliminated uncertainty'" ( [24] (page 8)).
Finally, here comes the most essential aspect that should be emphasized: entropy and negative entropy theories measure only, from the aspect of form, the relationship of structural differences at specific levels of the system; they do not aim at the information itself but merely at the structural characteristics of the information carrier. Because of this, it is impossible to deduce the general nature of information directly from such a theory. It is no wonder that some western scholars have clearly and reasonably pointed out that "Information theory deals with the carrier of information, symbols and signals, not information itself" and "Information theory does not deal with the information itself but the carrier of the information" ( [25] (page 150)).
Since the calculation of the quantity of entropy and negative entropy involves the probability distribution of the micro states of the system being measured, it is reasonable that relevant viewpoints such as the degree of orderly or disorderly organization (order) of the system, "degree of variation," "differences and constraints," "symmetry breaking," "difference that makes a difference," "form and structure," and "state of things" are directly derived from the theory of entropy and negative entropy. Since related views such as these are directly deduced from the theories about the quantity of entropy and negative entropy, it is also impossible to obtain the formulation of the general nature of information through them.
Obviously, to reveal the essence of information, we should not just focus on the differential relationships of the carrier forms; we must understand the contents of the relevant properties, characteristics, modes of existence, and states of the things themselves presented by the information.
In an article published as early as 1986, Wu wrote the following sentences: "information is the formulation of something itself displayed in another that alienated by something itself, it is the indirect existence of something itself, which exist in other things. . . Information is the formulation of something revealed in the relationship between something and other things. Something is information when it displays itself as internal formulation in an external relationship, which is expressed in the form of externalization of the characteristics of the object" ( [26] (page 19)).
Based on the content of information and the dynamic mechanism of natural occurrence of information, Wu once clearly defined information as follows: "Information is a philosophical category indicating indirect being. It is the self-manifestation of the existing mode and status of matter (direct being)" in a paper entitled "Philosophical Classification of Information Forms," which was published in 1984 ( [27] (page 33)). In 2019, Wu expanded the definition of information that was only restricted to the level of philosophical ontology based on the historical evolution of information forms classified by him: "Information is a philosophical category indicating indirect being. It is the self-manifestation and re-manifestation of the existing mode and status of matter (direct being), as well as the subjective grasp and creation of information by the subject of cognition and practice, including the cultural world that has been created" ( [28] (page 143)).
The relevant discussion in this paper was not meant to negate the great success of entropy and negative entropy theories in physics, communication and information science and technology, artificial intelligence science and technology, life science and technology, and other related fields of science and technology.
The main purpose of the article was to reveal the specific properties of the entropy and negative entropy theories.
That is, what those theories reveal are only quantitative formulations of the static or dynamic relative differences in the formal structure of the information carrier. Such a formulation does not involve the essence of the information itself.
This scenario also conditions the many comparative interpretations of the nature of information based on entropy and negative entropy theories, which are likewise unable to guide us to truly grasp and understand the nature of information. In addition, from the perspective of methodology, entropy and negative entropy theories focus only on the relationships between the material structures of the information carrier; the method used is still that of dealing with material phenomena and relationships. Such material structure processing methods remain technically feasible and successful, since the material relationships between information and its carrier structure correspond to each other. However, the theories and methods of entropy and negative entropy do not directly concern the mode of existence of information itself, or the meaning and value of information. To truly reveal the nature of information and its fundamental difference from material phenomena, we need to find another way: a research level and research method based on a more comprehensive and general meta-science or meta-philosophy, focusing on the mode of existence of information itself and on its meaning and value.

Data Availability
The data used to support the findings of this study are available from the corresponding author upon request.

Conflicts of Interest
The authors declare no conflicts of interest.