Computational complexity is examined using the principle of increasing entropy. Considering computation as a physical process from an initial instance to the final acceptance is motivated by the facts that information requires physical representations and that many natural processes complete in nondeterministic polynomial time (

Currently it is unclear whether every problem whose solution can be efficiently checked by a computer can also be efficiently solved by a computer [

It appears, although it has not been proven, that the traveling salesman problem [

In this study insight into the

The recent formulation of the 2nd law as an equation of motion based on statistical mechanics of open systems has rationalized diverse evolutionary courses that result in skewed distributions whose cumulative curves are open-form integrals [

The adopted physical perspective on computation is consistent with the standpoint that no information exists without its physical representation [

According to the 2nd law of thermodynamics a computational circuit, just as any other physical system, evolves by diminishing energy density differences within the system and relative to its surroundings. The consumption of free energy [

Computation is, according to the principle of increasing entropy, a probable physical process. The sequence of computational steps begins when an energy density difference, representing an input, appears at the interface between the computational system and its surroundings. Thus, the input, by its physical representation, places the automaton at the initial state of computation, that is, physically speaking, of evolution. A specific input string of alphabetic symbols is presented to the circuit as a particular physical influx, for example, as a train of voltages. Importantly, no instance is without a physical realization.

The algorithmic execution is an irreversible thermalization process in which the energy absorbed at the input interface begins to disperse within the circuit. Eventually, after a series of dissipative transformations from one state to another, more probable one, the computational system arrives at a thermodynamic steady state, the final acceptance, by emitting an output, for example, by writing a solution on a tape. No solution can be produced without a physical representation. Although it may seem a secondary concern, the condition of termination must ultimately be the physical free energy minimum state; otherwise, there would still be free energy to drive the computational process further.
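This dissipative leveling can be caricatured by a toy model (an invented illustration, not the paper's formalism): energy densities on a chain of nodes flow down their local differences until all gradients have vanished, at which point the system has reached its stationary, free energy minimum state.

```python
# Toy sketch of dissipative thermalization (assumed linear-rate model):
# densities on a chain of nodes relax by flows proportional to local
# differences, while the total energy is conserved.

def relax(densities, rate=0.25, steps=200):
    """Iteratively level density differences between neighboring nodes."""
    u = list(densities)
    for _ in range(steps):
        # flows computed from the current state, then applied symmetrically
        flows = [rate * (u[i] - u[i + 1]) for i in range(len(u) - 1)]
        for i, f in enumerate(flows):
            u[i] -= f        # donor node loses energy
            u[i + 1] += f    # acceptor node gains the same amount
    return u

final = relax([8.0, 2.0, 1.0, 1.0])
# All differences have (nearly) vanished: every node approaches the mean 3.0
print(final)
```

The total energy stays fixed while the differences decay, mirroring the statement that computation halts only when no free energy remains to drive it further.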

Physically speaking, the most effective problem solving is about finding the path of least action, which is equivalent to the maximal energy transduction from the initial instance down along the most voluminous gradients of energy to the final acceptance. However, the path of optimal conductance, that is, of the most rapid reduction of free energy, is difficult to find in a circuit with three or more degrees of freedom, because flows (currents) and forces (voltages) are inseparable. In contrast, when the process has no additional degrees of freedom in dissipation, the minimal-resistance path corresponding to the solution can be found in a deterministic manner.

In the general case the computational path is intractable because the state space keeps changing due to the search itself. A particular decision to move from the present state to another depends on the past decisions and will also affect the states accessible in the future. For example, when the traveling salesman decides on the next destination, the decision depends on the path taken so far, except at the very end, when there is no choice but to return home. The path is directed because revisits are not allowed (or are eventually restricted by costs). This class, referred to as
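The history dependence described above is what forces exhaustive search in the worst case. A minimal brute-force sketch (the 4-city distance matrix is invented for illustration) shows that, since each choice constrains the remaining ones, in general all (n-1)! tours must be examined:

```python
from itertools import permutations

# Hypothetical symmetric 4-city distance matrix, purely illustrative.
D = [[0, 2, 9, 10],
     [2, 0, 6, 4],
     [9, 6, 0, 3],
     [10, 4, 3, 0]]

def tour_length(order):
    """Length of the closed tour 0 -> order -> 0."""
    path = [0, *order, 0]
    return sum(D[a][b] for a, b in zip(path, path[1:]))

# Every choice of the next city depends on the cities already visited,
# so brute force enumerates every permutation of the remaining cities.
best = min(permutations([1, 2, 3]), key=tour_length)
print(best, tour_length(best))   # (1, 3, 2) 18
```

Already at modest n the factorial growth of the permutation set makes this enumeration infeasible, which is the practical face of the intractability discussed above.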

Computation is considered a dissipative process. The input, as an influx of energy, disperses from the input interface (top) through the network, which evolves during the computation, according to the 2nd law of thermodynamics, by dissipative transitions that acquire high (blue) and yield low (red) density in energy, toward the stationary state (bottom). Reversible transitions, that is, conserved currents (purple), do not bring about changes of state and do not advance the computation. Driving forces (free energy between the nodes) and flows (between the nodes) are inseparable when there are additional degrees of freedom (

In the special case the computational path is tractable as decisions are independent of computational history. For example, when searching for the shortest path through a network, the entire invariant state space is, at least in principle, visible from the initial instance, that is, the problem is deterministic. A decision at any node is independent of traversed paths. This class, referred to as
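Because a decision at any node is independent of the traversed path, such shortest-path problems yield to a greedy, deterministic algorithm. A standard Dijkstra sketch over a hypothetical network (node names and weights are invented) illustrates this:

```python
import heapq

def dijkstra(graph, source):
    """Shortest distances from source. Each node is settled once; the
    decision at a node never depends on how the node was reached."""
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float('inf')):
            continue                      # stale heap entry
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float('inf')):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

# Hypothetical network of cities with edge weights as distances.
G = {'a': [('b', 2), ('c', 5)],
     'b': [('c', 1), ('d', 4)],
     'c': [('d', 1)],
     'd': []}
print(dijkstra(G, 'a'))   # {'a': 0, 'b': 2, 'c': 3, 'd': 4}
```

The invariant state space is "visible" from the initial instance in the sense that the greedy settling order is provably optimal, so no path needs to be followed to its end before being discarded.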

Finally, it is of interest to note the particular case in which a physical system has no mechanisms to proceed from one state to any other by transforming absorbed quanta into any emission. Since the dispersion relations of physical systems are revealed only when interacting with them [

The physical portrayal of problem processing according to the principle of increasing entropy is based on the hierarchical and holistic formalism [

According to the self-similar formulation of energy transduction, the nodes of the network are themselves networks. Any two densities

Each node of a transduction network is a physical entity associated with energy

The computational system proceeds from one state to another, more probable one, when energy flows down along gradients through the network from one node to another with concurrent dissipation to the surroundings. For example, a

It is convenient to measure the state space of computation by associating each k_BT

According to the scale-independent formalism the network is a system in the same way as its constituent nodes are systems themselves. Any two networks, just as any two nodes, are distinguishable from each other when there is some influx sequence of energy so that exactly one of the two systems is transforming. In computational terms, any two states of a finite automaton are distinguishable when there is some input string so that exactly one of the two transition functions is accepting [

In the general case the calculation of measure

Conversely, in the special case, when the reduction of a difference does not affect other differences, that is, when there are no additional degrees of freedom, the changes in occupancies remain tractable. The conservation of energy requires that, when there are only two degrees of freedom, the flow from one node will inevitably arrive exclusively at the other node. Therefore, it is not necessary to explore all such integrable paths to their very ends. The outcome can be predicted, and the particular path in question can be found efficiently. Moreover, when there are no differences
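The two-degrees-of-freedom case can be integrated in closed form. In the linear-rate toy model below (an assumption for illustration, not the paper's equation of motion) the difference between the two nodes decays exponentially toward the common mean, so the end state is predictable without following the path step by step:

```python
import math

# Two nodes exchanging energy at rate k: du1/dt = -k*(u1 - u2),
# du2/dt = +k*(u1 - u2). The difference obeys d(u1-u2)/dt = -2k*(u1-u2),
# so the state at any time t can be written down in closed form.

def two_node_state(u1, u2, k, t):
    mean = (u1 + u2) / 2.0
    half_diff = (u1 - u2) / 2.0 * math.exp(-2.0 * k * t)
    return mean + half_diff, mean - half_diff

u1, u2 = two_node_state(8.0, 2.0, k=0.5, t=10.0)
print(u1, u2)   # both nodes close to the predicted mean 5.0
```

With only two degrees of freedom the flow from one node arrives exclusively at the other, the total is conserved, and no exploration is needed: the trajectory is integrable, which is precisely what tractability means here.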

The overall transduction processes, both intractable and tractable, direct toward more probable states, that is,

The physical portrayal of computational complexity reveals that it is the noninvariant, evolving state space of class

When computation is described as a probable physical process, the additive logarithmic probability measure
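The additive logarithmic measure invoked here is presumably of the standard Gibbs form; as a hedged reconstruction (the paper's own expression sits in the truncated equation), additivity follows from taking the logarithm of a multiplicative probability:

```latex
S \;=\; k_B \ln P \;=\; k_B \sum_j \ln P_j ,
\qquad\text{where}\qquad
P \;=\; \prod_j P_j .
```

The logarithm renders the product of the nodes' probabilities an additive sum, which makes the measure suitable for bookkeeping over a network of nested systems.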

The definition of entropy

During the computational process the state space accessible by

The free energy minimum partition

In general the trajectories of natural processes cannot be solved analytically because the flows

According to the maximum entropy production principle [

In the special case when the currents are separable from the driving forces, the energy transduction network will remain invariant. In terms of physics the Hamiltonian system has invariants of motion and Liouville’s theorem is satisfied. The deterministic computation as a tractable energy transduction process will solve the problem in question because the dissipative steps are without additional degrees of freedom. The conceivable courses can be integrated (predicted). Hence the solution can be obtained efficiently, for example, by an algorithm that follows the steepest descent and does not waste time in wandering along paths that can be predicted to be futile.
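On an invariant landscape the steepest-descent strategy mentioned above is straightforward. A minimal sketch on an assumed quadratic free-energy surface (the surface and step size are invented for illustration) follows the negative gradient to the minimum without wandering:

```python
# Steepest descent on an invariant (fixed) landscape, assuming the toy
# free-energy surface f(x, y) = (x - 1)^2 + 2*(y + 2)^2. In the tractable
# case the landscape does not change while it is being descended.

def grad(x, y):
    """Gradient of the assumed quadratic surface."""
    return 2.0 * (x - 1.0), 4.0 * (y + 2.0)

def steepest_descent(x, y, step=0.1, iters=500):
    for _ in range(iters):
        gx, gy = grad(x, y)
        x -= step * gx
        y -= step * gy
    return x, y

x, y = steepest_descent(5.0, 5.0)
print(x, y)   # converges to the minimum at (1, -2)
```

Because the surface is invariant, every step is predictable from local information alone, and no time is wasted on paths that can be foreseen to be futile.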

Further insight into the distinction between computations in the classes

The continuum equation of motion corresponding to (

The curved energy landscape, covered by triangles, represents the state set of intractable computation. The non-Euclidean manifold is evolving, by the contraction process itself, toward the optimal path of maximal conduction (red arrows) corresponding to the solution. During the contraction the path with additional degrees of freedom (exemplified at a branching point) from the initial instance (top) toward the final acceptance (bottom) is shortening but remains nonintegrable (unpredictable) due to the dissipation. In contrast, the paths (blue arrows) on the invariant Euclidean plane (grey) do not mold the landscape, and thus they do not have to be followed to their ends but can be integrated (predicted).

The equation for the flows of energy can also be obtained from Newton’s familiar 2nd law [

A particular flow

Finally, when all density differences have vanished, the manifold has flattened to the stationary state (

According to the geometric description of computational processes, the flattening (evolving) non-Euclidean landscape represents the state space of the class

The argument for the failure to map the larger

The transduction path between two nodes can be represented by only one edge, hence there are

For example, the problem of maximizing the shortest path by two or more interdicts (

The

In summary, the class

The computational complexity classification to

The class

The class

In the general case when the forces are inseparable from the flows, the execution time of the DFA array grows super-polynomially as a function of the input length

A circuit (O) containing nodes with degrees of freedom (

The class

Since the class

To measure the difference between the classes

To maintain a connection to practicalities, it is worth noting that tractable problems are often idealizations of intractable natural processes. For example, when determining the shortest path for a long-haul trucker to take through a network of cities to the destination, it is implicitly assumed that, when the computed optimal path is actually taken, the traffic itself would not congest the flow and cause a need for rerouting, that is, for finding the new best possible route under the changed circumstances.
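A toy numerical caricature of this remark (the route names, weights, and congestion penalty are all invented) shows how the act of taking the "optimal" route can invalidate the very computation that chose it:

```python
# Static travel times for two candidate routes (illustrative numbers).
routes = {'via_highway': 10.0, 'via_backroads': 12.0}

# On the static map the highway wins...
best = min(routes, key=routes.get)

# ...but taking it congests it: the chosen route's cost rises.
routes[best] += 5.0

# Re-solving under the changed circumstances flips the optimum.
rerouted = min(routes, key=routes.get)
print(best, rerouted)   # via_highway via_backroads
```

The idealized (tractable) problem assumes an invariant state space; the natural process, in which the current reshapes its own landscape, does not grant that assumption.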

The state space of a finite energy system is represented by elements

A system is a pair (Φ

if

A process of (

when the system has transformed from the state

when the system has transformed from

when the system has transformed from the initial state

(a) The system evolves, according to Definitions

Let

When (

Define Λ to be the set of functions

The step of evolution along the oriented and piecewise smooth curve from

After a series of successive applications of

A family Σ of subsets of the state space Φ is an algebra, if it has the following properties:

the algebra Σ is closed under countable intersections and subtraction of sets,

if

A function

if

if

An energy density manifold is a set

the range of

for every

for every

Entropy is defined as

The change in occupancy

The condition for the stationary state of the open system is that its entropy reaches its maximum.

From Definitions

The proof is in agreement with

The state space Φ contracts in dissipative transformations.

As a consequence of Definition

When entropy

The definition for the class

The definition for the class

The

One has

It follows from Definitions

The difference between the classes can also be measured by

The class

The network representing the class

Venn diagram for the computational complexity classes

At first sight it may appear strange to some that the distinction between the computational complexity class

The natural law may well be an invaluable ingredient in rationalizing the distinction between the computational complexity classes

Furthermore, the crossing from class

The practical value of computational complexity classification by the natural law of the maximal energy dispersal is that no deterministic algorithm can be found that would complete the class

The author is grateful to Mahesh Karnani, Heikki Suhonen, and Alessio Zibellini for valuable corrections and instructive comments.