Introducing Randomness into First-Order and Second-Order Deterministic Differential Equations

We incorporate randomness into deterministic theories and compare, analytically and numerically, some well-known stochastic processes: the Liouville process, the Ornstein-Uhlenbeck process, and a process that is Gaussian and exponentially time-correlated (Ornstein-Uhlenbeck noise). Different methods of obtaining the marginal densities for correlated and uncorrelated noise are discussed. Analytical results are presented for a deterministic linear friction force combined with a stochastic force that is either uncorrelated or exponentially correlated.


Introduction
Stochastic theories model systems which develop in time and space in accordance with probabilistic laws [1]. Essential in stochastic theories is how randomness is accounted for. For Markov processes [2], which are an important class of stochastic processes, the state value X(t + Δt) at time t + Δt is given by the state value at time t, plus the value of a "random variable" at time t [3]. A random "disturbance" in a Markov process may influence all subsequent values of the realization, although the influence may decrease rapidly as the time point moves into the future. Five methods to account for randomness are: (1) to assume deterministic equations that determine the stochastic process for, say, a particle, and to apply Monte Carlo simulations or analytical methods [2, 3] to find probability distributions; (2) to use the Liouville equation, with or without added terms, for the probability density per se, for example, for a particle; (3) to use ordinary differential equations for the statistical moments of the probability distribution; (4) to use the "hydrodynamic approach," which is to specify constitutive relations in an equation set akin to what is used in hydrodynamic formulations of the so-called Liouville equation [4]; and (5) to use quantization rules or Nelson's approach [5] (see Appendix E). But a traditional picture of applying realistic randomness is through collision integrals that fulfill local conservation laws of mass, momentum, and energy [20, 21]. It is usually considered in direct reference to the Boltzmann kinetic theory. Microscopic collision rules are established based on time symmetry, but then a rapid decay of correlations amounts to assuming friction [22, 23]. For a second-order process driven by dichotomous noise with exponential correlation, Masoliver [15] found a third-order partial differential equation for the joint density distribution. For a second-order process driven by Gaussian noise with exponential correlation, Heinrichs [16] found a Fokker-Planck equation with time-variable coefficients for the joint distribution. In the phase-space formulation of
quantum physics the Wigner quasijoint distribution is commonly used [24]. The equation for the joint distribution includes additional terms compared to the bidimensional Liouville equation. See the review articles by Hillery et al. [25] and Lee [26] for quantum phase-space distributions. No corresponding second-order stochastic differential equation is constructed.
A system with time-uncorrelated noise is usually just a coarse-grained version of a more fundamental microscopic model with time correlation. It is therefore of interest to study models with correlation. Bag et al. [18] introduced correlation by increasing the number of differential equations and applying uncorrelated noise throughout. This approach obviously increases the system complexity. This article shows that time-correlated noise can be mimicked by time-uncorrelated, time-dependent noise without increasing the number of equations in the equation set. To provide a benchmark we start in Section 2 by considering a one-dimensional system based on recurrence relations without correlation. We first show how the recurrence relation, which is usually applied for Gaussian noise only (corresponding to the Ito integral), can be used to develop equations for a more general noise that is multifractal. We develop the equations for variances and covariances. For Gaussian processes, higher-order moments follow from second-order moments. Our first-order system has not been analyzed in the manner proposed in this paper, which is needed to develop alternative accounts of introducing correlation. We compare different methods of obtaining the main equations and study when time-dependent uncorrelated noise can mimic exponentially correlated noise. Section 3 studies first-order systems with correlation, which are compared with the systems without correlation in Section 2.
Such a comparison has not been made in the earlier literature. Section 4 considers second-order stochastic processes driven by time-uncorrelated or correlated noise. Proceeding to second order allows capturing a larger fraction of real-life processes, which makes the approach more realistic. We show that a second-order system driven by an exponentially time-correlated noise force can be mimicked by adding time-uncorrelated noise both to the position and to the velocity [5]. Instead of expanding the equation set, noise is added separately to each of the two dimensions, exemplified with the position and velocity of the two-dimensional system. Section 5 concludes.

A First-Order Time-Uncorrelated Process with Additive Noise
This section provides a first-order process which accounts for randomness. Assume that the differential equation Ẋ(t) = f(X(t)) has been used to describe a physical phenomenon. Assume that this equation is found to no longer hold due to results from a more developed experimental set-up. The question is then as follows: how should this equation be (a) modified or (b) reinterpreted, to be more realistic? Assume that we use (b), so that the original equation is reinterpreted to mean ∂E(X(t))/∂t = E(f(X(t))), where E means the expectation of a stochastic process. Another possibility is E(Ẋ(t)) = E(f(X(t))). These two interpretations are in general different, since the latter demands a differentiable path. The next question is then how to construct a stochastic theory such that expectation, variance, and higher-order moments can be calculated. Stochastic or nonstochastic integrals can rarely be solved in analytic form, making numerical algorithms an important tool. Assume that dX(t) is the change in this quantity during a small time interval from t to t + h (we will let the time spacing approach zero). Assume that this change is proportional to the time interval dt = h and to f(X(t)), which gives the recurrence relation (2.1), where "mod" means that this is a model assumption and "def" means definition. When constructing a more developed theory accounting for noise, we can, for instance, make the initial values stochastic as in the Liouville approach, or we can change the recurrence relation in (2.1). In the study of nonlinear recurrence relations Glass and Mackey [27] showed that it is possible to construct an infinite number of deterministic relations which are chaotic, but which describe a given density distribution. Thus a given density distribution has no unique recurrence relation. It appears that the broad class of Markovian theories incorporating Gaussian white noise input provides a satisfactory approximation for a large variety of phenomena. We consider the more general stochastic equation, which we consider
as a stochastic differential equation with additive noise of the Ito form (2.2a)-(2.2c). Conceptually, we can easily generate realizations by applying (2.2a)-(2.2c) on a computer. Assume that we perform M (which approaches infinity) different runs up to time t_k, with a constant time spacing h (which approaches zero). Let H(x, t) be an arbitrary function that we apply to each number. Thus we achieve the set of numbers, to read (2.3). Applying a Taylor expansion, the expectation E is achieved when M goes to infinity, that is, (2.4), where the "dot" above a variable means the time derivative ∂/∂t, that is, Ḣ = ∂H/∂t, and "D" means the space derivative D = ∂/∂X(t), that is, DH(X(t), t) = ∂H(X(t), t)/∂X(t). We can conceptually collect all tracks that pass through a given X(t). We next assume (a) that the variance is E((d*ς)²) = dσ₂ =(mod) g₂(t, X(t)) dt = g₂(t, X(t)) h for small time steps, where g_i(t, X(t)), i = 2, 3, 4, . . ., is some well-behaved function to be specified exogenously, and (b) that higher-order moments also have the same powers of h, akin to multifractal phenomena [28].
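As an illustration, the Ito recurrence (2.2a)-(2.2c) can be simulated directly. The sketch below, a minimal Monte Carlo implementation assuming a linear friction force f(x) = −ax and a constant noise amplitude g (both parameter choices are illustrative assumptions, not from the text), generates M realizations of X_{k+1} = X_k + f(X_k) h + g √h N(0, 1) and estimates the long-time variance:

```python
import math
import random

def simulate_ito(f, g, x0, h, steps, n_paths, seed=0):
    """Generate n_paths realizations of the Ito recurrence
    X_{k+1} = X_k + f(X_k)*h + g*sqrt(h)*N(0,1)."""
    rng = random.Random(seed)
    sqrt_h = math.sqrt(h)
    paths = [x0] * n_paths
    for _ in range(steps):
        paths = [x + f(x) * h + g * sqrt_h * rng.gauss(0.0, 1.0)
                 for x in paths]
    return paths

a, g = 1.0, 1.0                      # assumed friction and noise parameters
final = simulate_ito(lambda x: -a * x, g, x0=0.0, h=0.01,
                     steps=800, n_paths=4000)
mean_T = sum(final) / len(final)
var_T = sum((x - mean_T) ** 2 for x in final) / len(final)
# for f(x) = -a*x the stationary variance is g^2/(2a) = 0.5
```

With a = g = 1 the sample variance settles near g²/(2a) = 0.5, consistent with the variance equation derived for the linear friction force.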
The terms E(DH(X(t), t) dς) and E((1/2!) D²H(X(t), t) 2f(X(t)) h dς) equal zero due to (2.2c).
Thus for n > 1, dσₙ = E((d*ς)ⁿ) =(mod) gₙ(t, X(t)) h, so that σ̇ₙ = gₙ(t, X(t)). This gives

2.5
A_f is the forward infinitesimal operator of the process. By setting H(X, t) = X as a special case, we easily find that ∂E(X(t))/∂t = E(f(X(t))). We introduce the probability density ρ(t, x) of X(t). We choose H time-independent and can then write (2.5) by definition as

Advances in Mathematical Physics
Only natural boundary conditions are used in this article, and the space integration limits are suppressed. Integrating (2.6) by parts and assuming natural boundary conditions gives (2.7). Equation (2.7) is valid for all H(x), and thus

2.8
In fact, if our uncorrelated random term is Gaussian, all odd moments are zero, and even moments of order higher than two are of order higher than h (see Appendix A). This gives σ̇₃(t, x) = g₃(t, x) = σ̇₄(t, x) = g₄(t, x) = σ̇₅(t, x) = g₅(t, x) = · · · = 0, implying the partial differential equation called the forward Fokker-Planck equation or forward Kolmogorov equation [29]. From (2.8) we can easily calculate the derivative of the variance of X(t), which for a time-dependent and uncorrelated random term with σ̇₂ =(mod) g₂(t) becomes

2.9
The general Liouville solution of 2.8 is easily found, to read

2.10

A is an arbitrary function such that the integral is equal to 1. Say that the force is a linear friction force f(x) =(mod) −ax and that σ̇₂ =(mod) ξ₀²(1 − Exp(−t/τ)), where ξ₀ and τ are parameters. The time derivative of the variance then becomes, according to (2.9),

2.11
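For the linear friction force this variance equation reads dVar/dt = −2a Var + ξ₀²(1 − Exp(−t/τ)), whose long-time limit is ξ₀²/(2a). A small deterministic sketch (the parameter values a = 1, ξ₀ = 1, τ = 0.5 are illustrative assumptions) integrates this ordinary differential equation with the explicit Euler method:

```python
import math

def variance_ode(a, xi0, tau, dt, t_end):
    """Euler integration of dVar/dt = -2a*Var + xi0^2*(1 - exp(-t/tau))."""
    var, t = 0.0, 0.0
    while t < t_end:
        var += dt * (-2.0 * a * var + xi0 ** 2 * (1.0 - math.exp(-t / tau)))
        t += dt
    return var

a, xi0, tau = 1.0, 1.0, 0.5          # illustrative parameters
var_inf = variance_ode(a, xi0, tau, dt=1e-3, t_end=20.0)
# long-time limit: xi0^2/(2a) = 0.5
```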
Say that we will formulate a continuous equation in time that corresponds to (2.12), where δ is the Dirac delta function that accounts for the lack of correlation. This gives

2.13
Equation (2.13) is equal to (2.9) if 2E(∫₀ᵗ f(X(u)) du ξ(t)) = 0, which we will find is correct if f is a linear force. As another example, suppose that the continuous process in space X(t) is in fact an approximation to a process that is discrete in space. Assume as an example that X(t) ≈ N(t) is the number of cells that die randomly. We find the drift term (the first-order term) of the Fokker-Planck equation (2.8)

2.16
We have used that the covariance of the initial value X(0) and the noise is zero, to read (2.17). We use our example λ(t) =(mod) 1 − Exp(−t/τ) (we will compare this process with a correlated process in the next section). This gives from (2.17), when t > t′,

2.18
The variance follows for t′ = t (actually t′ = t demands a more careful analysis, but it turns out that setting t′ = t in (2.18) is correct), to read

2.19
The time derivative of the variance becomes

2.20
Thus (2.19) gives the same solution as (2.20), since 2 Cov(f(X(t)), X(t)) = −2a Var(X(t)) for our linear friction force. The expectation is given by E(X(t)) = E(X(0)) e^(−at). For Gaussian processes (correlated or uncorrelated) the variance is important, since higher-order moments follow from the second-order moment (the variance). By comparing (2.13) and (2.20) we see that 2E(∫₀ᵗ f(X(u)) du ξ(t)) = 0 for a linear deterministic force f. However, for a linear force this can be found more easily by using (2.14), since the corresponding integral ∫∫ Exp(av) σ̇₂(t + z) δ(z) dz dv vanishes.

2.21
We further find from (2.18), for the special case that λ = 1, that (2.22). Thus in this special case X(t) is, for large times, exponentially correlated. We can write (2.8) as (2.23), where v(t, x) is a current velocity, which can be arbitrary in a general theory. Generally, we can let the increment depend on the probability density at time t, to read

2.24
Realizations are easily generated by a computer. All realizations have to be calculated in parallel, such that the density can be "counted up" at each time t before a new time step is calculated. This process is not Markovian.
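A sketch of such a density-dependent scheme: all realizations are advanced in parallel, and at each step the increment uses a statistic "counted up" from the current ensemble. As an illustrative assumption (not taken from the text), the drift here reverts each path toward the ensemble mean rather than toward the origin:

```python
import math
import random

def density_dependent(a, g, x0, h, steps, n_paths, seed=1):
    """Advance all realizations in parallel; the drift of each path
    depends on the empirical ensemble (here its mean), so the process
    is not Markovian for a single realization."""
    rng = random.Random(seed)
    sqrt_h = math.sqrt(h)
    paths = [x0] * n_paths
    for _ in range(steps):
        m = sum(paths) / n_paths         # "count up" the ensemble statistic
        paths = [x - a * (x - m) * h + g * sqrt_h * rng.gauss(0.0, 1.0)
                 for x in paths]
    return paths

paths = density_dependent(a=1.0, g=1.0, x0=0.0, h=0.01,
                          steps=800, n_paths=2000)
m = sum(paths) / len(paths)
spread = sum((x - m) ** 2 for x in paths) / len(paths)
# deviations from the ensemble mean behave like an OU process,
# so spread -> g^2/(2a) = 0.5 up to O(1/n_paths)
```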

First-Order Stochastic Processes with Exponential Correlation
In Section 2 we analyzed first-order uncorrelated processes with additive noise. The Fokker-Planck equation was applicable for the uncorrelated processes. It turns out that in some cases uncorrelated processes with additive noise can in fact mimic correlated processes.
Assume as an example that the random term is exponentially correlated, with correlation E(ξ(t)ξ(t′)) = (ξ₀²/(2τ)) Exp(−|t − t′|/τ), where τ is a correlation time parameter. In the limit when τ approaches zero we achieve uncorrelated noise, to read Lim_{τ→0} (ξ₀²/(2τ)) Exp(−|t − t′|/τ) = ξ₀² δ(t − t′). Notice that an equation for ξ(t) that will in fact generate this exponentially correlated noise for large times (see (2.16)) follows simply by a scaling of the parameters in (2.14), to read (3.2), where ψ(t) is white noise with E(ψ(t)ψ(t′)) = τ⁻² ξ₀² δ(t − t′).
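A discrete sketch of such a generator: the recurrence ξ_{k+1} = ξ_k − (ξ_k/τ) h + (ξ₀/τ) √h N(0, 1), where the coefficient ξ₀/τ is our reading of the scaled white noise ψ, produces for large times a noise with stationary variance ξ₀²/(2τ) and autocorrelation decaying as Exp(−Δt/τ):

```python
import math
import random

def ou_noise(xi0, tau, h, steps, seed=2):
    """One long realization of exponentially correlated (Ornstein-Uhlenbeck)
    noise via xi' = -xi/tau + (xi0/tau)*white."""
    rng = random.Random(seed)
    amp = (xi0 / tau) * math.sqrt(h)
    xi = rng.gauss(0.0, math.sqrt(xi0 ** 2 / (2.0 * tau)))  # start stationary
    out = []
    for _ in range(steps):
        xi += -(xi / tau) * h + amp * rng.gauss(0.0, 1.0)
        out.append(xi)
    return out

xi0, tau, h = 1.0, 1.0, 0.01
xs = ou_noise(xi0, tau, h, steps=200_000)
var = sum(x * x for x in xs) / len(xs)               # ~ xi0^2/(2*tau) = 0.5
lag = int(tau / h)                                   # one correlation time
acf = (sum(xs[i] * xs[i + lag] for i in range(len(xs) - lag))
       / (len(xs) - lag)) / var                      # ~ exp(-1)
```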

The last line in (2.16) is general. For exponentially correlated noise with linear friction we achieve

3.3
The variance follows when t′ = t, to read (3.4). Equations (3.4d) and (3.4e) show that when t ≫ τ, we achieve the uncorrelated noise with Hurst exponent one half, while when t ≪ τ, we achieve a fractional noise with Hurst exponent one. For the concept of fractional Brownian motion see Peitgen et al. [30, Section 9.5, page 491]. See also M. Rypdal and K. Rypdal [31, 32] for fractional Brownian motion models descriptive of avalanching systems. We can compare the two different stochastic processes: the one with linear friction and uncorrelated time-dependent noise, where E(ξ(t)ξ(t′)) = ξ₀² λ(t) δ(t − t′) = σ̇₂(t) δ(t − t′), and the one with linear friction and exponentially correlated noise of the type E(ξ(t)ξ(t′)) = (ξ₀²/(2τ)) Exp(−|t − t′|/τ). The covariances and variances are given by the solutions (2.18)-(2.19) and (3.2)-(3.4a), (3.4b), (3.4c), (3.4d), (3.4e), and (3.4f), respectively. In the limit when a approaches zero, the variances become equal, but the covariances remain unequal, to read (3.5).

Indeed, the variances can be calculated easily when a = 0; without correlation we obtain

3.6
With exponential correlation we achieve that

3.7
We can calculate higher-order moments for the two stochastic processes. For the special case where the time-dependent uncorrelated process is Gaussian, a Fokker-Planck equation follows, as shown in Section 2, with σ̇₂ = ξ₀²(1 − Exp(−t/τ)). Heinrichs [16] has proved that a Fokker-Planck equation also follows for the probability density when assuming Gaussian exponentially correlated noise (called Ornstein-Uhlenbeck noise; see Appendix A). Thus for a = 0, the probability density of the time-dependent Gaussian uncorrelated noise is equal to the probability density of the exponentially correlated Gaussian noise. For Gaussian processes, higher-order moments follow from the second-order moment. Thus we should expect equality of two Gaussian processes, even a correlated and an uncorrelated one, if the variances are equal.
With friction we can also compare the variances, to read

We observe that the variances are different. Thus the two stochastic processes couple differently to the linear friction term. Assume that aτ ≪ 1. The variance then becomes

3.9
In this limit the variances are equal. Thus pure exponentially time-correlated noise can be mimicked by uncorrelated noise. However, the noise couples differently to a linear deterministic friction force.

3.11
Assuming Gaussian noise, the expectation becomes

3.12
Here we have used the algebra in (A.5)-(A.7). However, it has in the literature been considered desirable to have a process that fulfills the relation ∂E(X(t))/∂t = E(f(X(t), t)), even for multiplicative noise. It turns out that this can be achieved if the time derivative path is abandoned (see Appendix C for the Ito calculus).
We can mimic Gaussian correlated noise by Gaussian uncorrelated noise when using the Stratonovich integral for multiplicative noise also. To achieve this we must, according to (3.11), have that (3.13). However, we have proven this already in (3.6) and (3.7). More generally, assume that the correlated noise is chosen to be E(ξ(u)ξ(v)) = C_uc(u, v, τ). A solution is σ̇₂(t, τ) = 2∫₀ᵗ C_uc(u, t, τ) du, which is fulfilled for the two processes we examine in this article. When using the Ito integral in (3.10), the solution is different. Appendix C shows that the solution is E(X(t)) = E(X(0)) Exp(−at). This solution for the uncorrelated case cannot mimic the correlated case.
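The relation σ̇₂(t, τ) = 2∫₀ᵗ C_uc(u, t, τ) du can be checked numerically for our exponential correlation C_uc(u, t, τ) = (ξ₀²/(2τ)) Exp(−(t − u)/τ), which should reproduce the uncorrelated mimic σ̇₂ = ξ₀²(1 − Exp(−t/τ)). A short quadrature sketch (parameter values are illustrative):

```python
import math

def sigma2_dot(t, tau, xi0, n=20_000):
    """Trapezoidal evaluation of 2 * integral_0^t C(u, t) du with
    C(u, t) = xi0^2/(2 tau) * exp(-(t - u)/tau)."""
    c = lambda u: xi0 ** 2 / (2.0 * tau) * math.exp(-(t - u) / tau)
    du = t / n
    s = 0.5 * (c(0.0) + c(t)) + sum(c(k * du) for k in range(1, n))
    return 2.0 * s * du

t, tau, xi0 = 3.0, 0.7, 1.0
lhs = sigma2_dot(t, tau, xi0)
rhs = xi0 ** 2 * (1.0 - math.exp(-t / tau))   # the uncorrelated mimic
```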

Second-Order Stochastic Processes
Unfortunately, bidimensional first-order processes (or second-order stochastic processes) are more difficult to address than one-dimensional first-order processes. This is so because the position at a given time depends strongly on the velocity, and removing this dependence is tricky. We construct a stochastic interpretation of the bidimensional equation set with additive noise, to read as the Ito stochastic equation (4.1), where the expectations are zero and dσ_2X and dσ_2Y are the variances. We achieve by Taylor expansion of an arbitrary function H(x, y) that (4.2). For time-uncorrelated Gaussian random terms with no cross-correlation, only the terms with (d*ς_Y)² and (d*ς_X)² contribute to order h. Thus, after some simple algebra analogous to the algebra in Section 2, we achieve the well-known Fokker-Planck equation, with σ̇_2X = g_2X(t, x, y), σ̇_2Y = g_2Y(t, x, y).

4.3
In physics or engineering applications, second-order differential equations are often used as models. For physical systems where we use f = ma, we set the mass of the object equal to m = 1. The second-order differential equation can be written as bidimensional first-order equations, to read

4.4
The Langevin model for the Ornstein-Uhlenbeck (1930) process with uncorrelated Gaussian random force is a special case of (4.1)-(4.3), when assuming a random term in the velocity Y(t) equation only (due to a stochastic force), that is, ξ_X = 0, g_2X = 0. As a well-known example, assume that the deterministic force is f(X(t), Y(t)) = −V′(X(t)) − εY(t) and that g_2Y(X(t), Y(t)) = 2εkT in the Ornstein-Uhlenbeck (1930) process. Thus we have a random force, a conservative nonrandom force, and a linear nonrandom friction force. Here k is the Boltzmann constant and T is the temperature. The Fokker-Planck equation, corresponding to an uncorrelated Gaussian random force, becomes according to (4.4) the equation (4.5). It is easily verified, and well known, that a steady-state solution is given by the Boltzmann distribution, to read (4.6), where c_B is a constant. Thus the Boltzmann distribution is achieved as a steady-state solution when assuming linear friction. Notice that when ε = 0, every solution of the type ρ(t, x, y) = ρ(H) is a steady-state solution, where H is now the Hamiltonian, H = (1/2)y² + V(x). This shows the importance of linear friction in achieving the correct steady-state solution for the uncorrelated Gaussian process.
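A Monte Carlo sketch of this Langevin model with a harmonic potential V(x) = (1/2)ω²x² (the parameter values ε = ω = kT = 1 are illustrative assumptions): the equilibrium variances should match the Boltzmann distribution, Var(Y) = kT and Var(X) = kT/ω²:

```python
import math
import random

def langevin(eps, omega, kT, h, steps, n_paths, seed=3):
    """Euler-Maruyama for x' = y, y' = -omega^2 x - eps y + sqrt(2 eps kT)*white."""
    rng = random.Random(seed)
    amp = math.sqrt(2.0 * eps * kT * h)
    xs = [0.0] * n_paths
    ys = [0.0] * n_paths
    for _ in range(steps):
        for i in range(n_paths):
            x, y = xs[i], ys[i]
            xs[i] = x + y * h
            ys[i] = y + (-omega ** 2 * x - eps * y) * h + amp * rng.gauss(0.0, 1.0)
    return xs, ys

xs, ys = langevin(eps=1.0, omega=1.0, kT=1.0, h=0.01, steps=1200, n_paths=2000)
var_x = sum(x * x for x in xs) / len(xs)   # Boltzmann: kT/omega^2 = 1
var_y = sum(y * y for y in ys) / len(ys)   # Boltzmann: kT = 1
```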
Consider now the second-order process in (4.1) with only a random force. We can find the analogous continuous-time solution equation for the position, to read according to (4.7). The question is now whether uncorrelated noise can mimic correlated noise for the position, and not only for the velocity. Assume that we first calculate the variance by only applying an uncorrelated force of our now familiar type

4.8
However, for the correlated process with E(ξ_Y(t)ξ_Y(t′)) = (ξ₀²/(2τ)) Exp(−|t − t′|/τ), we achieve (4.9). Thus we find that the variances of the position are unequal, since Var(X(t))

4.10
Thus for t ≫ τ we find a Brownian motion with Hurst exponent 1.5, while for t ≪ τ we find a Brownian motion with Hurst exponent 2. This is in agreement with the asymptotic solution found by Heinrichs [16].
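For the position driven only by the time-dependent uncorrelated force, Var(X(t)) = ∫₀ᵗ (t − u)² σ̇₂(u) du with σ̇₂(u) = ξ₀²(1 − Exp(−u/τ)); this is our reading of (4.8). A quadrature sketch confirms the two scaling regimes, Var ∝ t⁴ (Hurst exponent 2) for t ≪ τ and Var ∝ t³ (Hurst exponent 1.5) for t ≫ τ:

```python
import math

def var_x(t, tau, xi0=1.0, n=20_000):
    """Var X(t) = integral_0^t (t-u)^2 * xi0^2 (1 - exp(-u/tau)) du (trapezoid)."""
    f = lambda u: (t - u) ** 2 * xi0 ** 2 * (1.0 - math.exp(-u / tau))
    du = t / n
    return (0.5 * (f(0.0) + f(t)) + sum(f(k * du) for k in range(1, n))) * du

def local_slope(t, tau):
    """d log Var / d log t estimated from a factor-2 ratio."""
    return math.log(var_x(2 * t, tau) / var_x(t, tau)) / math.log(2.0)

tau = 1.0
slope_small = local_slope(1e-3, tau)   # t << tau: ~4  (Hurst exponent 2)
slope_large = local_slope(100.0, tau)  # t >> tau: ~3  (Hurst exponent 1.5)
```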

The equation for the joint distribution is more complicated to derive. We achieve, by Taylor expansion with only a random force, that

4.11
We further achieve from 4.11

4.16
This is in disagreement with (4.15). This shows that our time-dependent uncorrelated noise cannot mimic the correlated noise for the position. To check this further, we have for correlated noise that (4.17). This is in agreement with (4.15). However, it is easily observed from (4.1) that (4.14) follows if

4.18
We thus find that noise must be added both to the position and to the velocity in the Ito stochastic differential equation to mimic the correlated noise force. In addition, the noise in velocity and position must be cross-correlated. The time-continuous uncorrelated Stratonovich version, but with cross-correlation in velocity and position, that will mimic the solution in (4.14) is accordingly

Agreement can also be shown by studying the variance directly, to read

4.20
This agrees with the solution in (4.9). More generally, according to method 2 discussed in the Introduction, we can introduce noise simply by adding terms more or less ad hoc to the Liouville equation. Expanding on the results of (4.18) with a random and nonrandom force, consider the equation (4.21), where the force is a noise term plus a deterministic force, to follow from (4.18) when

4.23
The solution for the variance of Y(t) is equivalent to (3.8a) after substituting X → Y, which assumes time-dependent uncorrelated noise. The solution is not equal to the correlated solution in (3.8c). Thus (4.19) (or (4.21)) and Ẍ(t) = f_Y(X(t), Y(t)) + ξ_Y(t) in general model different physical realities, even though the models are the same when f_Y is set to zero. Both models are also equal in the limit where the correlation time approaches zero.
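The cross-correlated noise increments required in position and velocity can be generated from independent Gaussians by a 2×2 Cholesky factorization of the prescribed covariance matrix. A sketch with an assumed covariance (the numerical values are illustrative, not from the text):

```python
import math
import random

def correlated_increments(c_xx, c_xy, c_yy, n, seed=4):
    """Draw n pairs (dX, dY) with covariance [[c_xx, c_xy], [c_xy, c_yy]]
    via the 2x2 Cholesky factor L: (dX, dY) = L @ (z1, z2)."""
    l11 = math.sqrt(c_xx)
    l21 = c_xy / l11
    l22 = math.sqrt(c_yy - l21 ** 2)
    rng = random.Random(seed)
    pairs = []
    for _ in range(n):
        z1, z2 = rng.gauss(0.0, 1.0), rng.gauss(0.0, 1.0)
        pairs.append((l11 * z1, l21 * z1 + l22 * z2))
    return pairs

pairs = correlated_increments(1.0, 0.6, 1.0, n=200_000)
cov = sum(dx * dy for dx, dy in pairs) / len(pairs)   # ~ 0.6 by construction
```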
By integrating with respect to the position and velocity, respectively, we achieve the equation for the marginal density of velocity and position, to read

The same equations apply for the Liouville process. We can find an explicit relation for v_X(t, x), to read (4.25).

Conclusion
This article studies the construction of stochastic theories from deterministic theories based on ordinary differential equations. We incorporate randomness into deterministic theories and compare, analytically and numerically, some well-known stochastic processes: the Liouville process, the Ornstein-Uhlenbeck process, and a process that is Gaussian and exponentially correlated (Ornstein-Uhlenbeck noise). Different methods of obtaining the marginal densities are discussed, and we find an equation for the marginal density of velocity and position for a second-order process driven by an exponentially correlated Gaussian random force. We show that in some situations a noise process with exponential correlation can be mimicked by time-dependent uncorrelated noise. We show that a second-order system driven by an exponentially time-correlated noise force can be mimicked by adding time-uncorrelated noise both to the position and to the velocity. For such a situation the traditional concept of force loses its meaning.

A. Gaussian Exponentially Correlated Noise and the Fokker Planck Equation
Following Heinrichs [16], the expectation and variance become (A.3). This gives (A.4), or alternatively (A.5). Now, by assuming a Gaussian distribution, all odd moments become zero. The even moments are given by

The characteristic function for the density ρ_X(t, x) of X(t) becomes (A.7), which implies (A.8). Hence, with correlation time τ, we achieve a Fokker-Planck or forward Kolmogorov equation [29] with an explicit time variation in the diffusion coefficient. A solution is

B. A Discrete Markov Process
We apply a specific Markov process in continuous time and discrete space. Fixing attention on one cell, we define S_{t,t+h}, stochastically varying as a number j indicating whether this cell does not die (j = 0) or dies (j = 1) during the time interval from t to t + h. It is reasonable to assume that the conditional point probabilities that this cell does not die, or dies, between t and t + h are given by (B.1)-(B.2), where N(t) = n is given at time t, f(n) is an arbitrary function, and |f(n)| means the absolute value. Equation (B.2) applies to each cell (i.e., all cells are "alike"). This gives a recurrence relation (B.3). The conditional point probability is (B.4), where applying the law of total probability gives

B.5
Rearranging and taking the limit as h tends to zero yields (B.6). Proceeding with this Markov process, the right-hand side of (B.6) can be approximated by a continuous partial differential equation. A Taylor expansion up to order 2 gives (B.7). For large n, Δn = 1 is a good approximation to the space step length, where Δ is an arbitrarily small real increment without denomination. Inserting into (B.6) gives (B.8). Assuming p(N(t) = n) = 0 for n > n_max, where n_max is any arbitrary positive integer, and inserting (B.6) into the definition of Ė_M(N(t)) gives (B.9). Applying as an example the special initial condition
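The cell-death process of this appendix can be simulated directly. Assuming, as an illustration, f(n) = −an, each of the n living cells dies in (t, t + h] with probability (|f(n)|/n) h = a h, so the mean Ė_M(N(t)) ≈ −a E(N(t)) decays as n₀ Exp(−a t):

```python
import math
import random

def death_process(n0, a, h, steps, runs, seed=5):
    """Each living cell independently dies with probability a*h per step."""
    rng = random.Random(seed)
    finals = []
    for _ in range(runs):
        n = n0
        for _ in range(steps):
            n -= sum(1 for _ in range(n) if rng.random() < a * h)
        finals.append(n)
    return finals

n0, a, h, T = 200, 1.0, 0.01, 2.0
finals = death_process(n0, a, h, steps=int(T / h), runs=200)
mean_T = sum(finals) / len(finals)
# continuous-time prediction: n0 * exp(-a*T)
```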

C. Ito Calculus for Multiplicative Noise
We do not in this section use the "*" to separate the Stratonovich from the Ito interpretation. Say that we have two solutions of the differential equation, as in (C.1a)-(C.1c). One must decide whether the Langevin equation (C.1a) should be solved by Ito or by Stratonovich integrals. Different numerical schemes for the solution of stochastic differential equations (i.e., Langevin models driven by Gaussian white noise) do not in general give the same solution. It can be shown that the Euler scheme is generally consistent with the Ito formulation, since the leftmost points are used for the intervals that are summed during integration (thus not looking into the future). We have the following numerical scheme in the Ito formulation (C.1c):

C.6
Setting f = −ax, χ = x, and σ̇₂ = σ̇₂(t), we find that Ė(X(t)) = −aE(X(t)) + (1/2)σ̇₂(t)E(X(t)). This is in agreement with the solution in (3.12). The Ito (ν = 0) and Stratonovich (ν = 1) "calculus" can be written for an arbitrary H. According to the development in (2.4) and (2.5) we have (C.7). Thus (C.8). In general the relation between the Ito integral and the Stratonovich integral is (C.9).
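The difference between the two calculi for multiplicative noise can be seen numerically. For dX = −aX dt + σX dW (illustrative parameters), the Euler scheme converges weakly to the Ito mean E(X(t)) = X₀ Exp(−at), whereas a Heun (predictor-corrector) scheme converges to the Stratonovich mean X₀ Exp((−a + σ²/2)t), consistent with the correction term appearing in (3.12):

```python
import math
import random

def ito_vs_strat(a, sig, x0, h, steps, n_paths, seed=6):
    """Euler (Ito) and Heun (Stratonovich) schemes for dX = -aX dt + sig X dW,
    driven by the same Gaussian increments."""
    rng = random.Random(seed)
    f = lambda x: -a * x
    g = lambda x: sig * x
    xi, xs = [x0] * n_paths, [x0] * n_paths
    for _ in range(steps):
        for i in range(n_paths):
            dw = math.sqrt(h) * rng.gauss(0.0, 1.0)
            # Euler: left-point evaluation (Ito)
            xi[i] += f(xi[i]) * h + g(xi[i]) * dw
            # Heun: predictor-corrector (Stratonovich)
            xp = xs[i] + f(xs[i]) * h + g(xs[i]) * dw
            xs[i] += 0.5 * (f(xs[i]) + f(xp)) * h + 0.5 * (g(xs[i]) + g(xp)) * dw
    return sum(xi) / n_paths, sum(xs) / n_paths

a, sig = 1.0, 0.6
mean_ito, mean_strat = ito_vs_strat(a, sig, 1.0, h=0.01, steps=100, n_paths=10_000)
# Ito mean:          exp(-a*t)               at t = 1
# Stratonovich mean: exp((-a + sig^2/2)*t)   at t = 1
```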

D. A Simple Deterministic Noise as an Example
Say that the noise is given by (D.1), where φ is a time parameter. Let the density of φ be given by ρ(φ) = 1/(2π) when 0 ≤ φ ≤ 2π, and 0 otherwise.

D.2
This gives, when we for simplicity set a = 1,

E. Methods 3, 4, 5 for Introducing Randomness
A third method to introduce randomness is by ordinary differential equations for the statistical moments of the probability distribution.These moments can be constructed ad hoc or found by mathematical manipulation of the partial differential equation for the probability density, or simply from a recurrence relation in time.
A fourth method of introducing randomness is the hydrodynamic method. Consider the variables as position and velocity for illustration, although the method applies generally. By integrating the equation for the joint distribution of two stochastic variables with respect to the second variable (the velocity), the well-known equation for the conservation of probability in space is found. This equation, which is only the conservation of probability, can be used without referring to any stochastic theory. The equation includes the so-called current velocity. It is well known that in Boltzmann kinetic theory, or in most Langevin models, the total derivative of the current velocity is equal to the classical force minus a term that is proportional to 1/ρ_X(x, t) times the space derivative of Var(Y(t) | X(t)) ρ_X(x, t), where Var(Y(t) | X(t)) is the variance of Y(t) at time t given the position X(t), and ρ_X(x, t) is the density of X(t) [20]. Now, the equation for the conservation of probability in space is a first partial differential equation. As a second equation, set the total derivative of the current velocity equal to the classical force minus a term that is proportional to 1/ρ_X(x, t) times the space derivative of −ρ_X(x, t) Var(Y(t) | X(t)), as in Boltzmann's kinetic theory or in Langevin models. Thus randomness can be accounted for by constitutive relations for Var(Y(t) | X(t)) without postulating a relation for a joint or quasijoint distribution [4]. For the Liouville process, realizations in the position-velocity space (phase space) cannot cross. In addition, for a conservative classical force, all realizations that start at the same position will have a unique velocity at a given position when applying the Liouville process, which implies Var(Y(t) | X(t)) = 0.
The equation for the total derivative of the current velocity, which now equals the classical force, can be integrated in space to give the familiar Hamilton-Jacobi equation of classical mechanics as a special case. More generally, the total derivative of the current velocity of the Liouville and the Ornstein-Uhlenbeck (1930) processes (assuming uncorrelated Gaussian noise) has also been analyzed when assuming initial conditions in position and velocity that are independent and Gaussian distributed. It has been shown that Var(Y(t) | X(t)) is independent of x, but time-dependent, for the free particle and for the harmonic oscillator [33-35]. We believe that this hydrodynamic method can be useful when experimental data pertain to the variables in the equation set and there is no direct experimental access to microscopic dynamics.
The fifth method of introducing randomness is quantization rules or Nelson's [5] approach. Quantization rules "transform" second-order ordinary differential equations into stochastic equations, assuming that the system of ordinary differential equations follows from the Lagrange formalism. Hojman [36] and Gomeroff and Hojman [37] provide several examples of the construction of Hamiltonian structures without using any Lagrangian. In the double-slit experiment interference appears, and the received theory states that the density solution is impossible to construct from Markovian recurrence relations [38]. It turns out that the density solution is equal to the Liouville density solution, and realizations that follow deterministic trajectories can then be used to "count up" the density. Realizations can always be constructed by using the density solution to construct different realizations by drawing positions from that solution at each time step. Realizations can also be constructed from a Liouville density, but they are certainly different from the classical tracks in that case. A mixture of classical stochastic theory and quantum mechanics has been developed by the use of a quasiclassical Langevin equation. The variables are nonoperator quantities, while the spectrum of the random force relates to the zero-point fluctuation that is proportional to frequency and to the Planck spectrum [39, 40]. Nelson's [5, 41, 42] stochastic version of quantum mechanics is believed to be equivalent to quantum mechanics for predictions of the outcomes of experiments. Blanchard et al.
[43] show that Nelson's approach is able to describe in a unified language the deterministic and random aspects of quantum evolution. The approach has features analogous to the Ornstein-Uhlenbeck (1930) theory. But the increment in position is not written as the classical velocity times the time step, as in the Ornstein-Uhlenbeck (1930) theory, but as a general drift field that depends on time and position, times the time step, plus a random uncorrelated Gaussian term. In addition, the drift is found by setting the average sum of the so-called backward and forward infinitesimal operators, applied on the drift, equal to the classical force.

F. The Duffing Equation
The Duffing equation is a nonlinear second-order differential equation exemplifying a dynamical system that exhibits chaotic behavior; in the simplest form, without forcing, it reads (F.1). Equation (F.1) describes the motion of a damped oscillator with a more complicated potential than in simple harmonic motion. In physical terms it models, for example, a spring pendulum whose spring stiffness does not exactly obey Hooke's law. The Duffing equation cannot be solved exactly in terms of symbols, though for the special case of the undamped (ε = 0) Duffing equation an exact solution can be obtained using Jacobi's elliptic functions. Numerically, Euler's method, the Runge-Kutta method, and various other numerical methods can be used.
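A numerical sketch using the classical fourth-order Runge-Kutta method (the coefficients α = β = 1 and the initial condition are illustrative assumptions): for the undamped case the energy E = (1/2)ẋ² + (1/2)αx² + (1/4)βx⁴ is conserved to high accuracy, while any damping ε > 0 makes it decay:

```python
def duffing_rk4(eps, alpha, beta, x0, v0, h, steps):
    """Classical RK4 for the unforced Duffing equation
    x'' = -eps*x' - alpha*x - beta*x^3."""
    def deriv(x, v):
        return v, -eps * v - alpha * x - beta * x ** 3
    x, v = x0, v0
    for _ in range(steps):
        k1x, k1v = deriv(x, v)
        k2x, k2v = deriv(x + 0.5 * h * k1x, v + 0.5 * h * k1v)
        k3x, k3v = deriv(x + 0.5 * h * k2x, v + 0.5 * h * k2v)
        k4x, k4v = deriv(x + h * k3x, v + h * k3v)
        x += h / 6.0 * (k1x + 2 * k2x + 2 * k3x + k4x)
        v += h / 6.0 * (k1v + 2 * k2v + 2 * k3v + k4v)
    return x, v

def energy(x, v, alpha=1.0, beta=1.0):
    return 0.5 * v ** 2 + 0.5 * alpha * x ** 2 + 0.25 * beta * x ** 4

e0 = energy(1.0, 0.0)
x, v = duffing_rk4(0.0, 1.0, 1.0, 1.0, 0.0, h=0.001, steps=10_000)
e_undamped = energy(x, v)            # conserved for eps = 0
x, v = duffing_rk4(0.2, 1.0, 1.0, 1.0, 0.0, h=0.001, steps=10_000)
e_damped = energy(x, v)              # decays for eps > 0
```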
To demonstrate that the methods in this paper work for the Duffing equation, we propose three methods. First, replace X(t) with x(t) and substitute V(x) into (4.1), with 1 − Exp(−t/τ) to follow.

Laplace wrote: "Given for one instant an intelligence which could comprehend all the forces by which nature is animated and the respective situation of the beings who compose it, an intelligence sufficiently vast to submit these data to analysis, it would embrace in the same formula the movements of the greatest bodies of the universe and those of the lightest atom; for it, nothing would be uncertain and the future, as the past, would be present to its eyes. The human mind offers, in the perfection which it has been able to give astronomy, a feeble idea of this intelligence. Its discoveries in mechanics and in geometry, added to that of universal gravity, have enabled it to comprehend in the analytical expressions the past and the future state of the system of the world." This shows that the Liouville equation supports Laplace's world view: that the future can be foreseen, and the past can be recovered, to any desired accuracy by finding sufficiently precise initial data and sufficiently powerful laws of nature. In the early 1900s Poincaré supplemented this view by pointing out the possibility that very small differences in the initial conditions may produce large differences in the final phenomena. Poincaré further argued that the initial conditions are always uncertain. Poincaré [60] wrote on chaos: "A very small cause which escapes our notice determines a considerable effect that we cannot fail to see, and then we assume that the effect is due to chance. If we knew exactly the laws of nature and the situation of the universe at the initial moment, we could predict exactly the situation of that same universe at the following moment. But, even if it were the case that the natural laws had no longer any secrets for us, we could still only know the initial conditions approximately. If that enabled us to predict the
succeeding situation with the same approximation, that is all we require, and we should say that the phenomenon has been predicted, that it is governed by laws. But it is not always so; it may happen that small differences in the initial conditions produce very great ones in the final phenomena. A small error in the former will produce an enormous error in the latter. Prediction becomes impossible, and we have the fortuitous phenomenon". The marginal probability density can be found by integrating out the other variables of the joint probability density 61, 62.
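The marginalization mentioned above can be sketched numerically. Here a standard bivariate Gaussian is used as a stand-in for the joint density ρ(t, x, y) at a fixed time (an illustrative choice, not a density from this paper); integrating out y on a grid recovers the known Gaussian marginal in x:

```python
import numpy as np

# Joint density on a grid: a standard bivariate Gaussian as a stand-in
x = np.linspace(-5.0, 5.0, 401)
y = np.linspace(-5.0, 5.0, 401)
X, Y = np.meshgrid(x, y, indexing="ij")       # axis 0 is x, axis 1 is y
rho_joint = np.exp(-0.5 * (X**2 + Y**2)) / (2.0 * np.pi)

# Marginal in x: integrate out y (rectangle rule on the grid)
dy = y[1] - y[0]
rho_x = rho_joint.sum(axis=1) * dy
```

On this grid the result agrees with the exact one-dimensional Gaussian exp(−x^2/2)/sqrt(2π) up to tail-truncation error, and it integrates to one.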
5. In quantum physics, Nelson's approach 5 is able to describe in a unified language the deterministic and random aspects of quantum evolution 43. Some students believe that Newton's second law f = ma is a law of nature (a law that can be falsified by experiments), but actually it is only a definition of force. The mathematical structure f = ma, however, suggests that it is easy to find a new relation for the "force"; the equation set thereby becomes closed. It is not obvious that a mathematical theory accounting for randomness or noise is most easily formulated by seeking to model a random force. In this way Nelson's approach suggests that, at least in quantum mechanics, the force is not so easy to find, and that the mathematical structure is most easily built from mathematical principles different from Newton's second law. In principle, the mathematical principle chosen to model randomness or noise is not given a priori. This applies both to the phenomena we call quantum phenomena and to the phenomena we call classical stochastic phenomena (where we usually seek a random or noise force).
E[(d*ς)^2] = dσ^2. E means expectation. H(x, t) is an arbitrary function. The expectation of d*ς(t, h, X_t) in (2.2b) is zero. We assume no correlation. Equation (2.2c) is assumed in order to develop (2.5). Equation (2.2c) is valid when using Ito calculus (no bounded variation) but is not valid when using Stratonovich calculus. When d*ς(t, h, X_t) = 0 we define, as determined by (2.2a)-(2.2c), what we call a Liouville recurrence relation, which defines the Liouville process, where only the initial values are stochastic.
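The Liouville process defined above (a deterministic recurrence with stochastic initial values only) can be sketched for a linear friction drift dX/dt = −aX, an illustrative choice echoing the linear friction example used later. An ensemble of random initial values is propagated with a deterministic Euler recurrence, so all the spread at time t comes from X_0 (parameter values are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
a, h, n_steps = 1.0, 0.001, 1000        # friction, step size, number of steps

# Liouville process: the recurrence X_{t+h} = X_t + f(X_t) h has no noise term;
# randomness enters only through the random initial condition X_0 ~ N(0, 1).
x = rng.normal(0.0, 1.0, size=100_000)
for _ in range(n_steps):
    x = x + (-a * x) * h                # deterministic Euler update

t = n_steps * h                         # final time
```

The exact solution X_t = X_0 exp(−at) implies that the ensemble standard deviation decays as exp(−at); the Euler recurrence reproduces this up to O(h) error.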
d*ς(t, h, X_t) ~ Distribution(0, dσ^2(t, h, X_t)). d*ς(t, h, X_t) is not a differential in the Riemann sense and is therefore denoted by the "*". σ^2(t, h, X_t) is a function, and dσ^2(t, h, X_t) is the differential change of σ^2(t, h, X_t) with respect to h during the time step h. We model d*ς(t, h, X_t) as a stochastic variable, where Distribution is a distribution with expectation zero and variance E[(d*ς)^2]. We let ξ(t) be deterministic (the randomness is then only in the initial values of ξ(t); see Appendix E for an example). The deterministic approach generally gives that ξ(t) can be integrated in the traditional Riemann sense. The Stratonovich integral occurs as the limit of colored noise when the correlation time of the noise term approaches zero. A quite common and different integral is the Ito integral for the uncorrelated situation and Gaussian noise. This integral of ξ(t) cannot be integrated in the Riemann sense due to lack of bounded variation. However, for additive noise the Stratonovich and Ito models give the same answer. To match the noise in (2.2a)-(2.2c) we set the expectation to zero and the noise to be uncorrelated, to read in the Stratonovich sense (indeed, also in the Ito sense, since we achieve the same result for additive noise). This is related to the diffusion term (second-order term) by g^2 = f^2. The D_1 D_2 term is zero for τ = 0, as it should be. Say that we calculate the expectation ∂/∂t E[XY]. This gives, when using (4.14), that the term (1/2)D_1 D_2 H(X_t, Y_t) h Y_t dς_Y is of order h^2. This gives, as it should, the Fokker-Planck equation, and when the correlation time τ of the noise E[ξ(t)ξ(t')] = (1/(2τ)) Exp(−|t − t'|/τ) approaches zero, we obtain the traditional Fokker-Planck equation corresponding to the Gaussian uncorrelated random force. As our example we use the linear friction force, f(Y) = −aY. The Stratonovich solution of (4.19) follows. By manipulating the equation set, the Schrödinger equation follows easily. The work by Nelson has stimulated more recent and refined studies on the
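Gaussian noise with the exponential correlation E[ξ(t)ξ(t')] = (1/(2τ)) Exp(−|t − t'|/τ) is exactly the stationary Ornstein-Uhlenbeck process. A minimal simulation sketch using the exact one-step update (τ, the step size, and the run length are illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(1)
tau, h, n = 0.5, 0.01, 1_000_000     # correlation time, step size, steps (illustrative)
var_st = 1.0 / (2.0 * tau)           # stationary variance E[xi^2] = 1/(2 tau)

# Exact one-step update of the stationary Ornstein-Uhlenbeck process, whose
# autocorrelation is E[xi(t) xi(t')] = (1/(2 tau)) * Exp(-|t - t'| / tau)
decay = np.exp(-h / tau)
kick = np.sqrt(var_st * (1.0 - decay**2))
z = rng.normal(size=n)
xi = np.empty(n)
xi[0] = rng.normal(0.0, np.sqrt(var_st))   # start in the stationary law
for i in range(1, n):
    xi[i] = decay * xi[i - 1] + kick * z[i]
```

The empirical variance approaches 1/(2τ), and the empirical correlation at lag τ approaches (1/(2τ)) e^{-1}, which is how the exponential correlation can be checked numerically.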
stochastic approach.See Albeverio and Høegh-Krohn 44 , Ezawa et al. 45 , Albeverio et al. 46 , Guerra 47 , Carlen 48 , Zheng 49 , Zambrini 50 , Blaquière 51 , Garbaczewski 4 , Blanchard and Garbaczewski 52 , Garbaczewski and Olkiewicz 53 , Garbaczewski and Klauder 33 , Czopnik and Garbaczewski 34 , and Garbaczewski 35 .See also Bratteli and Robinson 54 for the statistical mechanics of continuous quantum systems.For a study of statistical interpretations of quantum mechanics see Ballentine 55 .
which is an analytical expression for the probability density as a function of t, x, and y. As expected, Lim_{x→∞} V(x) = ∞, Lim_{x→∞} ρ(t, x, y) = 0, Lim_{y→∞} ρ(t, x, y) = 0, and Lim_{x→∞, y→0} ρ(t, x, y) = c_B. Second, replacing X_t with x_t, replacing Y_t with y_t, and substituting f_Y(X_t, Y_t) = −εY_t − αX_t − βX_t^3 into (4.21) gives ∂ρ(t, x, y)/∂t = −D_1 ρ(t, x, y) y − D_2 ρ(t, x, y) f_Y(x, y). d*ς(t, h, X_t): stochastic variable. λ(t):
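An analytical density of this kind can be checked numerically. The sketch below simulates the Duffing drift f_Y = −εY − αX − βX^3 with added uncorrelated Gaussian noise by Euler-Maruyama and compares a moment of the empirical x-marginal with the classical Klein-Kramers stationary form ρ ∝ exp(−(y^2/2 + V(x))/T), T = D/ε. That stationary form is a standard textbook result assumed here, not the paper's equation (4.21), and all parameter values are illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)
eps, alpha, beta = 0.5, -1.0, 1.0   # damping; double-well potential V(x) = alpha*x**2/2 + beta*x**4/4
D = 0.25                            # velocity-noise intensity; effective temperature T = D/eps
h, n = 0.01, 2_000_000
sig = np.sqrt(2.0 * D)

x, y = 1.0, 0.0
dW = rng.normal(0.0, np.sqrt(h), size=n)   # Brownian increments
xs = np.empty(n)
for i in range(n):
    # Euler-Maruyama step for dX = Y dt, dY = (-eps*Y - alpha*X - beta*X**3) dt + sqrt(2D) dW
    x, y = x + y * h, y + (-eps * y - alpha * x - beta * x**3) * h + sig * dW[i]
    xs[i] = x
```

A long run samples the stationary density; its second moment in x agrees with quadrature of the Boltzmann-type marginal exp(−V(x)/T) within statistical and discretization error.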