Stochastic Nonlinear Equations Describing the Mesoscopic Voltage-Gated Ion Channels

We propose a stochastic nonlinear system to model the gating activity coupled with the membrane potential for a typical neuron. It distinguishes two different levels: a macroscopic one, for the membrane potential, and a mesoscopic one, for the gating process through the movement of its voltage sensors. Such a nonlinear system can be arranged into a Hodgkin-Huxley-like model, which links those two levels, unlike the original deterministic Hodgkin-Huxley model, which is positioned at a macroscopic scale only. Also, we show that an interacting particle system can be used to approximate our model, an approximation technique similar to the jump Markov processes used to approximate the original Hodgkin-Huxley model.


Introduction
In 1952, Hodgkin and Huxley proposed their famous model for the membrane potential dynamics of a typical neuron, based on observations made in the squid giant axon. Basically, the model rests on the mean behavior of the potassium (K+) and sodium (Na+) ion channels, whose states (open/closed) are determined by a gating process acting inside those channels. The proposed system is a 4-dimensional system of nonlinear equations, fully coupled through the membrane potential (or voltage) and the probability that a representative gate/ion channel of each species is open (see [1]). As it describes the general membrane potential deterministically through such a mean field approach, it may be regarded as a macroscopic process with respect to all the processes involved in the internal neuronal activity.
In order to explain the internal fluctuations, or channel noise, of a neuron (see [2]), stochastic versions of the original Hodgkin-Huxley model have been suggested. Several of them are based on taking into account the underlying stochasticity of the gating activity, by describing the open/closed processes as voltage-coupled jump Markov processes and by using empirical measures instead of the probability measures of the corresponding open times (see [3-6]). In a certain sense, such methods are consistent because, when the number of channels or gates goes to infinity, the original deterministic Hodgkin-Huxley equations are recovered (see [5, 6]).
In this work, beyond considering the inherent stochasticity of such phenomena, we want to include a continuous dynamical model for the gating activity, because the discrete two-state (open/closed) point of view is just an approximation of the corresponding metastable states of the continuous movement of the proteins involved.
The voltage sensors are proteins whose conformational positions are responsible for the states of the gating activity (see [7]). This internal process is known as the voltage-gated ion channel process. A continuous state space stochastic process seems suitable to describe the position of such proteins, and that is what we are going to propose. Due to the nature and scale at which we will tackle this, we say that the dynamics representing such voltage sensors are located at a mesoscopic scale or level. Thus, our proposed system will consider two different levels, the macroscopic and the mesoscopic one, which are fully coupled through a linking voltage-dependent function. We will call our model a Hodgkin-Huxley-like model, which conserves the main structural characteristics of the original voltage equation. Thus, placing the gating phenomenon in a continuous framework is basically the main motivation for conducting this work.
Our model is a stochastic nonlinear system, where the nonlinearity is due to the intervention of the probability law of the stochastic process in its own dynamics (see [8-12]). In our case, such a probability will represent the probability that a representative voltage sensor of either species "pulls up" its corresponding gate, which is equivalent to the probability that the corresponding channel is open in the original or classical model.
The mathematical consistency of our proposal is evaluated by means of an approximation via a particle system, that is, a system with several interacting voltage sensors. It means that our model exhibits features similar to those of the original macroscopic case with respect to the convergence of empirical measures to the corresponding probability law. In our case, this convergence property, obtained by using empirical measures from a stochastic particle system approximating our ideal system, is known as propagation of chaos.
Specifically, our propagation of chaos result is as follows: consider the voltage equation depending on empirical measures instead of the probability that the (potassium, sodium) ion channels are in an open state, as in the original Hodgkin-Huxley equations. Suppose that those empirical measures depend on the behavior of certain continuous state space stochastic processes representing the interacting voltage sensors of typical (potassium, sodium) ion channels. Then, when the number of sensors (or channels) goes to infinity, we recover a Hodgkin-Huxley-like system which has two remarkable fully coupled components: the voltage equation depending on the probability laws of the corresponding voltage sensor positions (macroscopic part) and the stochastic equations for the positions of those voltage sensors (mesoscopic part).
This result extends earlier ideas for the description of the ion channel dynamics: from a discrete state space of open/closed gates to a continuous state space for the position of the voltage sensors. Our extension makes the introduction of the propagation of chaos property necessary, which comes to replace the approximation via voltage-coupled jump Markov processes.
Another nice property of our model is recurrence, which has a biological interpretation. Also, this work generalizes the ideas in [13].
In Section 2, an introduction to those stochastic nonlinear processes, our model, and its main features are presented.
In Section 3, the connection of our model with the original Hodgkin-Huxley model is given, as well as its mathematical justification.
Conclusions are given in Section 4.

Our Model
2.1. Nonlinear Processes and Propagation of Chaos. We are going to introduce some basic facts about a certain class of nonlinear stochastic differential equations, which are useful to describe situations where individuals (particles, components of a system, etc.) interact with each other through a mean field force. Let Y be a continuous-time R^d-valued stochastic process defined on some complete probability space (Ω, F, P) satisfying

dY_t = B(Y_t, ν_t) dt + σ dW_t,  ν_t = Law(Y_t),    (1)

where W is an R^d-Brownian motion and σ > 0. This kind of system has been widely studied under different forms of B and under extensions of the diffusion part (see, e.g., [8-12, 14]). For example, (1) is a special case of the nonlinear system considered in [11], and the existence and uniqueness of a solution (Y, ν) of (1) are proved when Y_0 ∈ L^2 (i.e., ν_0 ∈ P_2(R^d), the space of probability measures with finite second moment) and B satisfies a Lipschitz condition (see Theorem 2.2 therein, which states the existence and uniqueness of a solution in the law sense, although a stronger result is proved). That analysis is based on the contraction of the Wasserstein metric on P_2(R^d), which is defined as

W_2(ν, μ)^2 = inf { ∫ |x − y|^2 π(dx, dy) : π ∈ Π(ν, μ) },

where Π(ν, μ) is the set of probability measures on R^d × R^d whose marginal law with respect to the first coordinate is ν and whose marginal law with respect to the second coordinate is μ. The nice feature of this metric is that the metric space (P_2(R^d), W_2) is complete and separable (see [15]). Here we say that (1) has a unique solution pathwise and in law, which for a usual Itô diffusion is analogous to saying that it has a unique strong solution.
The probability law of (1) follows a special parametrization of the McKean-Vlasov equation. This is a nonlinear equation whose general form is given by

d/dt ∫ φ(x) M_t(dx) = ∫ [ B(x, M_t) · ∇φ(x) + (σ^2/2) Δφ(x) ] M_t(dx),

where M_t is a measure on R^d and φ is a test function with compact support and with derivatives of any order. In this case, it describes the Fokker-Planck equation for the temporal evolution of the law ν of (1) (also called the Fokker-Planck McKean-Vlasov equation), where we have that M_t = ν_t. Other interesting examples, arising in possibly singular cases of B, can be found in [10].
The law ν is considered to describe a mean field force. That is, two individuals from (1), Y^1 and Y^2, driven by independent and identically distributed (iid, for short) Brownian motions W^1 and W^2, are "interacting" through their common law ν. But this is an ideal situation because, probabilistically, they are not interacting (Y^1 and Y^2 are iid). It is a well-known fact that, under some conditions, we can approximate the law of (1) on bounded intervals of time by the N-interacting particle system

dY_t^{i,N} = B(Y_t^{i,N}, μ_t^N) dt + σ dW_t^i,  μ_t^N = (1/N) Σ_{j=1}^N δ_{Y_t^{j,N}},  i = 1, ..., N,    (5)

where the Brownian motions W^1, ..., W^N are iid. Here, the main result is as follows: for each fixed i = 1, ..., N, the process Y^{i,N} converges in law, as N → ∞, to the solution of (1), and the empirical measures μ^N converge to ν (see [11]). This property is known as propagation of chaos, a term attributed to Kac ([16]).
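As a concrete illustration, a particle system of the form (5) can be simulated with an Euler-Maruyama scheme. This is only a sketch: the drift below is a toy attractive mean-field choice (each particle is pulled toward the empirical mean), not a drift B taken from the cited references.

```python
import numpy as np

def simulate_particles(n_particles=200, n_steps=500, dt=0.01, sigma=0.5, seed=0):
    """Euler-Maruyama simulation of an interacting particle system
    dY^i = B(Y^i, mu_N) dt + sigma dW^i, where the empirical measure
    enters the drift only through its mean (a toy mean-field choice)."""
    rng = np.random.default_rng(seed)
    y = rng.normal(0.0, 1.0, n_particles)   # iid initial conditions
    for _ in range(n_steps):
        mean_field = y.mean()                # empirical-measure term
        drift = -(y - mean_field)            # toy attractive drift B
        y = y + drift * dt + sigma * np.sqrt(dt) * rng.normal(size=n_particles)
    return y

final = simulate_particles()
```

As N grows, each coordinate of such a system behaves more and more like an independent copy of the limiting nonlinear process, which is the content of the propagation of chaos property.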
In this work, and under the reference of the Hodgkin-Huxley model ([1]), we are going to suggest that, when the gating process in a typical potassium or sodium ion channel is seen as a continuous state space stochastic process describing the movement of a representative voltage sensor, the law of the opening times can be approximated via interacting particle systems as in (5). This comes to replace the classical view of the two-state voltage-coupled jump Markov process for the gating activity (see [3-6]). Due to the change of scale, we say that the interacting particle system is located at a lower level than the general membrane potential. If the membrane potential is considered as a macroscopic process, which is natural since the voltage equation describes the deterministic and general membrane potential through a mean field approach, the voltage sensor dynamics will be seen as a mesoscopic process (where, for us, the ion dynamics are at a microscopic scale).

The Membrane Potential and the Voltage-Gated Process Seen as a Nonlinear System

To describe the coupled evolution of the membrane potential and the voltage-gated process through the position of the voltage sensors, we distinguish two different scales according to their nature: the first one follows a deterministic general dynamic (macroscopic), and the second one follows a stochastic local dynamic in a continuous state space (mesoscopic). The classical Hodgkin-Huxley equations explicitly describe the mean performance of the Na+ and K+ channels depending on the voltage evolution. The paradigm is that each channel contains 4 gates, each of which can be in one of two states: open or closed. A channel is in an open (conducting) state if all its gates are open. Otherwise, the channel is in a closed (nonconducting) state.
Such gates and their corresponding voltage sensors are proteins, and together they form the so-called voltage-gated process. A K+ channel has its 4 gates of the same type, and a Na+ channel has 3 of one type and 1 of another. Although all of this comes from the classical formulation of the Hodgkin-Huxley model, as technology has advanced (see [17]), more precise descriptions of how those proteins operate have been given (see, e.g., [7]).
For our mathematical treatment, it is enough to consider only a simple system with 1 gate (voltage sensor) type

Figure 1: A simple schematic view of the parts forming a voltage-dependent ion channel (taken from [7]). So far, this is a view of how those proteins called voltage sensors apparently act to yield the gating mechanism (see [17]).
Figure 2: A symmetric double-well potential. The main idea is that one well represents the open or "up" state of a typical voltage sensor and the other well represents the closed or "down" state ("up" and "down" according to Figure 1). For our model, symmetry means the absence of a voltage-dependent force. Thus, this symmetry is broken (and possibly the double-well structure itself) when an additive force is considered.
(extension arises naturally as we will see later).A simple scheme for the states of an ion channel is depicted in Figure 1.
The voltage sensors are responsible for the opening/closing times of their associated gates; therefore, the conformational state of those sensors determines the channel state. The movement of a representative voltage sensor can be seen as a diffusion in a double-well potential, where one well represents the open state and the other represents the closed state (see Figure 2).
A typical symmetric double-well potential diffusion as in Figure 2 is given by

dX_t = −U'(X_t) dt + σ dW_t,

where U(x) = −x^2/2 + x^4/4, W is a Brownian motion, and σ > 0. (Just for simplicity, we chose the double-well potential centered at zero with its minima located at ±1.) To use this kind of process for the position of a voltage sensor, we need to add a continuous voltage-dependent force, say h(V), where V ∈ D ⊂ R is the membrane potential or voltage evolution. That function is the force responsible for the depth variation of the basins of U: according to the values of V, there are periods where the voltage sensor tends to open (or "go up") and others where it tends to close (or "go down").
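A minimal Euler-Maruyama discretization of this double-well diffusion can be sketched as follows (the step size and the noise level σ = 0.4 are illustrative choices):

```python
import numpy as np

def double_well_path(x0=1.0, sigma=0.4, dt=1e-3, n_steps=20000, seed=1):
    """Euler-Maruyama path of dX = -U'(X) dt + sigma dW with
    U(x) = -x^2/2 + x^4/4, so -U'(x) = x - x^3 (wells at x = +-1)."""
    rng = np.random.default_rng(seed)
    x = np.empty(n_steps + 1)
    x[0] = x0
    for k in range(n_steps):
        drift = x[k] - x[k] ** 3              # -U'(x)
        x[k + 1] = x[k] + drift * dt + sigma * np.sqrt(dt) * rng.normal()
    return x

path = double_well_path()
```

With a moderate σ the path spends long stretches near one of the minima ±1, with occasional transitions between the two wells; this is the metastable behavior that the discrete open/closed description approximates.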

For example, when h takes positive values, the basin on the right-hand side (0, ∞) becomes deeper than the one on the left-hand side. Just the opposite occurs when h takes negative values. Thus, the position of the voltage sensor will be described by the coupled process (V, X), where X satisfies

dX_t = (−U'(X_t) + h(V_t)) dt + σ dW_t.    (7)

Note that the new potential is now given by U(x, h(V)) = U(x) − h(V) x. To complete our model, we need to provide the dynamical behavior of V. As the general voltage evolution of a neuron is usually modeled via a mean field approach to the gating activity, following the Hodgkin-Huxley paradigm we need to introduce the law of X. Thus, our ideal complete system (V, X) will be given by

dV_t/dt = f(V_t, ∫_R F(x) ν_t(dx)),
dX_t = (−U'(X_t) + h(V_t)) dt + σ dW_t,  ν_t = Law(X_t),    (8)

where f is an appropriate function describing the voltage evolution. Here, (8) combines two processes at different levels: the membrane potential, or general voltage, at a macroscopic level and the voltage sensor dynamics at a mesoscopic level. The function h plays the role of connecting those two scales. In the next section we are going to relate (8) to the Hodgkin-Huxley model in a more precise way. Now, we are going to impose some assumptions to ensure the existence and uniqueness of a solution of (8). These are as follows.
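The depth-variation mechanism described above can be checked numerically: with the tilted potential U(x) − c x, where the constant c stands for a frozen positive value of the force h(V), the right-hand basin is deeper than the left-hand one. A small sketch (the value c = 0.2 is an arbitrary illustration):

```python
import numpy as np

def tilted_potential(x, c):
    """Tilted double-well U(x) - c*x = -x**2/2 + x**4/4 - c*x, where
    c plays the role of a frozen value of the voltage force h(V)."""
    return -x**2 / 2 + x**4 / 4 - c * x

xs = np.linspace(-2.0, 2.0, 4001)
left, right = xs[xs < 0], xs[xs > 0]

# With c = 0 the two basins have the same depth (symmetric well).
sym_gap = abs(tilted_potential(left, 0.0).min() - tilted_potential(right, 0.0).min())

# With c > 0 the right-hand ("up"/open) basin becomes deeper.
depth_left = tilted_potential(left, 0.2).min()
depth_right = tilted_potential(right, 0.2).min()
```

For c < 0 the situation is mirrored, so the sign of h(V) decides which of the "up"/"down" states is energetically favored.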
Before showing that, under (A.1)-(A.3), (8) has a unique solution pathwise and in law, we are going to show that (8) can be written in the form of (1).
The drift of (8) is locally Lipschitz since U' is locally Lipschitz. Thus, the usual globally Lipschitz assumption for the existence and uniqueness of a solution of (11) is not satisfied. Nevertheless, we can use a stopping argument to ensure that. First, consider

h+ := sup_{V ∈ D} h(V),  h− := inf_{V ∈ D} h(V),

where by (A.3) it is clear that h+ ≥ 0 and h− ≤ 0. Then, by comparison for (7), X_t ∈ [X−_t, X+_t], where each interval limit solves

dX±_t = (−U'(X±_t) + h±) dt + σ dW_t,

with X±_0 = X_0 a.s. and where the Brownian motions involved are indistinguishable from each other. The last two processes are well defined, as is shown in the following proposition.

Proposition 1. Let c ∈ R, σ > 0, and μ_0 ∈ P_2(R). Let X be a real-valued stochastic process defined on some complete probability space (Ω, F, P) and assume that it is given by

dX_t = (−U'(X_t) + c) dt + σ dW_t,    (14)

where U(x) = −x^2/2 + x^4/4, W is a Brownian motion, and X_0 ∼ μ_0. Then, there exists a unique strong solution X of the above equation with finite second moment, E(sup_{t ≤ T} X_t^2) < ∞ for every T > 0. Clearly, the processes X± can be identified with a special case of (14).
Comment 1. In [18] (Chapter 2, Theorem 3.5) a stopping procedure based on a special growth condition, like the one displayed in the preceding proof, is used to demonstrate the existence and uniqueness of a strong solution. Nevertheless, we are additionally going to use another stopping argument, which arises from condition (A.1) and regularity, a property defined below that is part of the recurrence property.
Recurrence of (14) is an interesting feature from a biological standpoint. It means that the process representing the voltage sensor is always operating, unless the heat transfer between the cell and its environment or reservoir is lost (σ → 0 in (11)).
(i) A process X is said to be regular if, for all initial conditions (s, x), it almost surely has no finite explosion time. (ii) Let O ∈ A(R), where A(R) is the Borel σ-algebra on R. A process X is said to be recurrent relative to O (O-recurrent) if it is regular and, for every (s, x), the first time after s at which the process started at x enters O is almost surely finite. A process X is said to be recurrent if it is recurrent relative to any segment containing the origin.
Comment 2. This last definition is inspired by [19].
Before showing the recurrence of (14), note that the Fokker-Planck equation associated with the law μ of such a stochastic differential equation is given by

∂_t μ_t(x) = ∂_x[(U'(x) − c) μ_t(x)] + (σ^2/2) ∂^2_x μ_t(x),

whose stationary probability measure is

π(x) = K^{-1} exp{−2(U(x) − cx)/σ^2},

where K = ∫_R exp{−2(U(x) − cx)/σ^2} dx. By continuity, this implies that the expected proportion of time that X spends in each well (if |c| < 2/(3√3)) can be bounded by the stationary probability that X+ and X− spend in those wells.
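The stationary density can be evaluated on a grid directly from its formula; the sketch below (with illustrative values c = 0.2 and σ = 0.4) also computes the stationary mass of the right-hand well, that is, of the "up" region (0, ∞):

```python
import numpy as np

def stationary_density(xs, c=0.2, sigma=0.4):
    """Grid evaluation of pi(x) = K**-1 * exp(-2*(U(x) - c*x)/sigma**2),
    the stationary density of the tilted double-well diffusion,
    with U(x) = -x**2/2 + x**4/4."""
    u = -xs**2 / 2 + xs**4 / 4 - c * xs
    w = np.exp(-2.0 * u / sigma**2)
    dx = xs[1] - xs[0]
    return w / (w.sum() * dx)  # Riemann-sum normalization of K

xs = np.linspace(-3.0, 3.0, 6001)
dx = xs[1] - xs[0]
pi = stationary_density(xs)
mass_up = pi[xs > 0].sum() * dx  # stationary probability of the "up" region
```

Even a modest positive tilt concentrates almost all stationary mass in the deeper ("up") well when the noise is small, which quantifies how h(V) biases the opening probability.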
Proof. Note that the corresponding generator is given by

L = (−U'(x) + c) (d/dx) + (σ^2/2) (d^2/dx^2).

Consider the function

Ψ(x) = ∫_0^x exp{2(U(y) − cy)/σ^2} dy.

It is easily seen that LΨ = 0 and Ψ(x) → ±∞ when x → ±∞. To conclude, we recall Lemma 3.9 from [19].
Lemma 4. Suppose that a process X almost surely exits from each bounded domain in finite time. Then a sufficient condition for O-recurrence is that there exists a nonnegative function V(t, x) on the domain {t > 0} × O^c such that LV ≤ 0 and V(t, x) → ∞ as |x| → ∞. The required existence comes from Proposition 1. Thus, the function Ψ(⋅) sign(⋅) satisfies the assumptions of the previous lemma for every segment containing the origin, and hence, the proposition holds.
Mathematically speaking, one of the consequences of recurrence is that the solution of the corresponding system does not explode or escape towards infinity: if we define, for any n ∈ N and t ∈ R_+, τ_n^t := inf{s ≥ 0 : |X_s| ≥ n} ∧ t, then P(∃ n* ∈ N : τ_{n*}^t = t) = 1. Indeed, if not, the event that X leaves every bounded set before time t has positive probability, which contradicts the regularity of X. Another consequence is that the corresponding solution almost surely keeps visiting the segments containing the origin, which within our biological setting implies that the voltage sensor described by (7) will never get stuck in one of the wells of U. That is, the states "up" and "down" will always be visited, which is one of the characteristics of the two-state gating approach.
With X_0 given, consider the stopped processes {X_{t∧τ}}_{t∈R_+} = X^τ and {W_{t∧τ}}_{t∈R_+} = W^τ. Then, according to Theorem 2.2 in [11] (or Theorem 1.1 in [10]), there exists a unique solution, pathwise and in law, to the corresponding stopped equation. It can be easily verified that the voltage sensor represented by (7) is also recurrent, since it is continuous and lies between two recurrent processes (the processes X± above).

A Short Remark about the Possible Existence of Different Time Scales Involved in (8) and Some Subsequent Results

From a macroscopic perspective, we can find that in some situations it is possible to justify the presence of different time scales in a membrane potential/gating model, as in the Hodgkin-Huxley model (see, e.g., [20]) and in the FitzHugh-Nagumo model (see [21]). What could happen is that V works faster than X, since they are positioned at different levels. In such a case, we may homogenize the time scales.
The way we will do this is based on a procedure analogous to the one used for the FitzHugh-Nagumo model in [21]. (Despite the fact that such a procedure is similar to the one we will use here, the graduation of the time scales is different because our mesoscopic component X is not considered therein.) Consider a small parameter, say ε (0 < ε ≪ 1), which scales the voltage dynamics so that V evolves on a faster scale, and apply the corresponding time change to the sensor equation; the rescaled noise intensity becomes σ_ε = σ√ε, the dependence of σ_ε on ε being due to the time change acting on the second moment of the Brownian motion. If ε ≈ 0, we might expect that ε f(Ṽ, ∫_R F(x) ν_t(dx)) ≈ 0, and then, from the standpoint of the faster dynamics, h(V) is near to being a constant (more precisely, h(V) varies slowly). Thus, this could tempt us to regard (7) as an equation of the form (14) by freezing the activity of h(V) at some point c. Following this idea, let X be a solution of (14) with c ∈ (−2/(3√3), 2/(3√3)), that is, considering that the double-well structure holds. Replace σ therein by σ_ε and consider X_0 = x_0 ∈ R. The corresponding deterministic solution (i.e., when σ_ε = 0) has three fixed points, x−(c) < x0(c) < x+(c), where x−(c) and x+(c) are stable and x0(c) is unstable. Define the corresponding quasi-potentials Q_1(c) and Q_2(c). Consider the basins of attraction D_1(c) = (−∞, x0(c)) and D_2(c) = (x0(c), ∞), and let τ_1(c) and τ_2(c) be the first exit times from D_1(c) and D_2(c), respectively. Then, for any δ > 0, with probability tending to one as ε → 0, τ_1(c) lies between exp{(Q_1(c) − δ)/σ_ε^2} and exp{(Q_1(c) + δ)/σ_ε^2} if x_0 ∈ D_1(c), and analogously for τ_2(c) if x_0 ∈ D_2(c). This is a result from [22] (Chapter 4, Theorem 4.2), in which the authors used large deviations to derive the classical Kramers' time for the expected transition time from one well to another in the small-noise limit. It was also applied in the neuroscience context in the aforementioned work [21] for the FitzHugh-Nagumo model, using a similar procedure but with a different time-scale perspective.
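The threshold 2/(3√3) for c and the three fixed points x−(c) < x0(c) < x+(c) can be verified numerically, since the fixed points of the frozen deterministic dynamics are the real roots of x^3 − x − c = 0:

```python
import numpy as np

def fixed_points(c):
    """Real roots of x**3 - x - c = 0, i.e., the fixed points of the
    deterministic dynamics x' = x - x**3 + c (the drift of (14))."""
    roots = np.roots([1.0, 0.0, -1.0, -c])
    return np.sort(roots[np.abs(roots.imag) < 1e-9].real)

c_crit = 2 / (3 * np.sqrt(3))        # critical tilt for the double well
below = fixed_points(0.9 * c_crit)   # three fixed points: x-, x0, x+
above = fixed_points(1.1 * c_crit)   # double-well structure is lost
```

For |c| below the critical value there are two stable points separated by an unstable one; past the threshold, a saddle-node bifurcation leaves a single fixed point and the two-state picture no longer applies.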
In this remark we wanted to point out that, when different time scales are involved, approximation methods can be used to obtain some interesting results, which can be difficult to obtain when such different time scales are not present or their existence cannot be justified. Indeed, more detailed results can be obtained for our model in this situation, but that would mean moving away from the main aims of this work.

Connection with the Hodgkin-Huxley Model
The membrane potential in the classical Hodgkin-Huxley model evolves according to

C dV_t/dt = I − g_K n_t^4 (V_t − E_K) − g_Na m_t^3 h_t (V_t − E_Na) − g_L (V_t − E_L),
dx_t/dt = α_x(V_t)(1 − x_t) − β_x(V_t) x_t,  x = n, m, h,

where C is the membrane capacitance and I (µA/cm^2) represents the external stimulus, which will be seen here as a given parameter; g_i (i = K, Na, L) represents the maximum conductance; and E_K, E_Na, and E_L are the Nernst potassium, sodium, and leak equilibrium potentials. The leakage current g_L(V − E_L) is an Ohmic current representing mostly the chloride (Cl−) one. The processes n, m, and h represent the corresponding probability that a representative gate is in an open state. As we mentioned before, in the case of potassium channels the 4 gates are of the same type (n), and in the case of sodium channels there are 3 gates of one type (m) and 1 of another type (h) (not to be confused with the function h(⋅) defined in (7)). The probability that a typical potassium or sodium ion channel is open is given by n^4 and m^3 h, respectively, where the iid assumption for the behavior of each corresponding gate is adopted. The explicit values of the transition rates α_x(V) and β_x(V), as well as the typical values of the constants involved, can be found in [23].
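For reference, the classical Hodgkin-Huxley system can be integrated with a simple forward-Euler scheme using the standard rate functions and constants commonly tabulated in the literature (e.g., [23]); the step size and stimulus below are illustrative choices:

```python
import numpy as np

def hh_simulate(i_ext=10.0, t_max=50.0, dt=0.01):
    """Forward-Euler integration of the classical Hodgkin-Huxley system
    C dV/dt = I - gK n^4 (V-EK) - gNa m^3 h (V-ENa) - gL (V-EL),
    with the standard rate functions (V in mV, t in ms)."""
    C, gK, gNa, gL = 1.0, 36.0, 120.0, 0.3
    EK, ENa, EL = -77.0, 50.0, -54.4

    a_n = lambda v: 0.01 * (v + 55) / (1 - np.exp(-(v + 55) / 10))
    b_n = lambda v: 0.125 * np.exp(-(v + 65) / 80)
    a_m = lambda v: 0.1 * (v + 40) / (1 - np.exp(-(v + 40) / 10))
    b_m = lambda v: 4.0 * np.exp(-(v + 65) / 18)
    a_h = lambda v: 0.07 * np.exp(-(v + 65) / 20)
    b_h = lambda v: 1.0 / (1 + np.exp(-(v + 35) / 10))

    v = -65.0
    # gating variables start at their resting steady-state values
    n = a_n(v) / (a_n(v) + b_n(v))
    m = a_m(v) / (a_m(v) + b_m(v))
    h = a_h(v) / (a_h(v) + b_h(v))
    vs = []
    for _ in range(int(t_max / dt)):
        i_ion = gK * n**4 * (v - EK) + gNa * m**3 * h * (v - ENa) + gL * (v - EL)
        v += dt * (i_ext - i_ion) / C
        n += dt * (a_n(v) * (1 - n) - b_n(v) * n)
        m += dt * (a_m(v) * (1 - m) - b_m(v) * m)
        h += dt * (a_h(v) * (1 - h) - b_h(v) * h)
        vs.append(v)
    return np.array(vs)

trace = hh_simulate()
```

With a sustained stimulus of this magnitude the model fires action potentials, so the voltage trace crosses well above 0 mV during each spike.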
The master equations for n, m, and h also suggest that each gate works according to the following transitional scheme:

closed ⇌ open, with opening rate α(V) and closing rate β(V),

that is, a two-state voltage-coupled jump Markov regime. Because of the independence between ionic species, and according to the mathematical treatment that we want to carry out, it is enough to consider the law of one gate of any type and a generic function f. Such a reduction is given by

dV_t/dt = f(V_t, ℘_t),  d℘_t/dt = α(V_t)(1 − ℘_t) − β(V_t) ℘_t,    (34)

with V_0 ∈ D ⊂ R. System (34), together with the stochastic regime of the gating transitional scheme described above, forms a particular example of a piecewise deterministic Markov process (PDMP), which corresponds to a general class of nondiffusion stochastic models introduced by [24] in 1984. Its construction can be easily described as follows (see [6, 25] for a further development of such a regime in this context). For all s, t ∈ R_+, let Φ_t(s, V_s) be the solution of the voltage equation in (34) at time s + t starting at V_s. Let T_1 be the first time that the gate gets closed and let T̃_1 be the first time after T_1 that the gate opens; conditionally on the voltage trajectory, T_1 has hazard rate β(V_t) and T̃_1 − T_1 has hazard rate α(V_t). Analogously, let T_2 be the first time after T̃_1 that the gate gets closed, again with hazard rate β(V_t), and so on. All this leads us to the conclusion that the classical Hodgkin-Huxley model, together with its corresponding gating regimes, can be clearly identified with a PDMP. With this, we want to stress that in (34) the stochasticity of the gating regime is hidden, since only its probability law intervenes, whereas in our setting the stochasticity of the gating regime is explicitly given. In order to relate our proposed model to some major characteristics of the classical Hodgkin-Huxley model, consider the following feature of the function f from (34): (a) f : D × [0, 1] → R is a continuous function such that, for any p ∈ [0, 1], the equation dV_t/dt = f(V_t, p) is linear in V. We can identify such properties with those of the Hodgkin-Huxley model, where the generalization to three gate types (i.e., n, m, and h) is straightforward: just consider f̄ instead of f, where f̄ : D × [0, 1]^3 → R. We are going to identify those characteristics with our macroscopic/mesoscopic approach and show that, despite the scale reduction for the gating activity, it conserves a structure for the voltage equation similar to that of the classical Hodgkin-Huxley model.
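The two-state transitional scheme for a single gate can be simulated directly; the sketch below freezes the voltage, so that α and β are constants (an obvious simplification of the voltage-coupled regime), and checks that the long-run fraction of open time approaches α/(α + β):

```python
import numpy as np

def gate_occupancy(alpha=0.5, beta=1.5, t_max=2000.0, seed=2):
    """Simulate one two-state (open/closed) gate as a jump Markov
    process with opening rate alpha and closing rate beta (voltage
    frozen for simplicity); return the fraction of time spent open."""
    rng = np.random.default_rng(seed)
    t, open_state, time_open = 0.0, False, 0.0
    while t < t_max:
        rate = beta if open_state else alpha
        dwell = rng.exponential(1.0 / rate)   # exponential holding time
        dwell = min(dwell, t_max - t)
        if open_state:
            time_open += dwell
        t += dwell
        open_state = not open_state
    return time_open / t_max

p_open = gate_occupancy()
```

With α = 0.5 and β = 1.5 the stationary open probability is α/(α + β) = 0.25, and the simulated occupancy fluctuates around that value.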
With all those ingredients, we say that our system (8) satisfies a Hodgkin-Huxley-like model. Here, F represents a cumulative distribution function such that ∫_R F(x) ν_t(dx) is the probability that a typical voltage sensor is in the open or "up" state (thus, it plays the role of ℘ in the macroscopic case (34)). As an example, we have the Laplace cumulative distribution function

F_λ(x) = (1/2) exp{x/λ} if x < 0,  F_λ(x) = 1 − (1/2) exp{−x/λ} if x ≥ 0,

where λ > 0. As F is increasing, it is convenient to think that the open or "up" state is located on the right-hand side of the potential U, that is, (0, ∞). Note that our proposal enlarges the state space of the gating dynamics: from a discrete two-state space dynamic to a continuous real state space dynamic. It is this enlargement which causes a change in the reference level: from the macroscopic 0-1 gating dynamic to the mesoscopic scale reduction regarding the continuous movement of the voltage sensor.
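The Laplace cumulative distribution function is easy to evaluate; the sketch below also illustrates the degenerate limit λ ↓ 0, in which F approaches the indicator of (0, ∞):

```python
import numpy as np

def laplace_cdf(x, lam):
    """Laplace cumulative distribution function (location 0, scale lam):
    F(x) = exp(x/lam)/2 for x < 0 and 1 - exp(-x/lam)/2 for x >= 0."""
    x = np.asarray(x, dtype=float)
    return np.where(x < 0, 0.5 * np.exp(x / lam), 1.0 - 0.5 * np.exp(-x / lam))

# As lam -> 0, F converges pointwise (off 0) to the indicator of (0, inf)
xs = np.array([-1.0, -0.1, 0.1, 1.0])
f_small = laplace_cdf(xs, 0.01)
```

For small λ the values at negative arguments are nearly 0 and those at positive arguments nearly 1, recovering the sharp 0-1 gating description as a limiting case.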
Comment 3. In [13] we suggested the function F(x) = 1_(0,∞)(x), which arises as a limit point of the degenerate case λ ↓ 0 of the previous Laplace cumulative distribution function. So, this work extends the aforementioned one by considering a class of smooth functions to picture the probability that a typical voltage sensor is in the open or "up" state.
As we mentioned in Section 2.3, processes (8) and (34) are in an ideal situation due to the determinism of the gating probability law and the absence of probabilistic interaction among the voltage sensors. The more realistic situation is when (8) and (34) are just limiting cases, which is the topic that comes next.
In [5, 6] the convergence (V^N, ℘^N) → (V, ℘) uniformly on any bounded time horizon, where (V, ℘) satisfies (34), is proved under smoothness and Lipschitz conditions on f, α, and β, the boundedness of V on any finite time horizon, and the uniform boundedness of V^N in N on any finite time horizon. Thus, the stochastic hybrid system (39) forms a PDMP where the jumps of ℘^N go to zero as N → ∞, thereby supporting the paradigm that the Hodgkin-Huxley equations arise as a limit of infinitely many gates.
Further results were also obtained in [6]. For example, a Central Limit Theorem for √N (V^N − V, ℘^N − ℘)^T was provided, which allows one to mathematically justify a diffusion approximation of the gating dynamics driven by a Brownian motion W.
In [13] we proposed the following voltage dynamics in the ideal case:

dV_t/dt = f(V_t, ν_t(O)),  ν_t = Law(X_t),

where X is the voltage sensor dynamics and O = (0, ∞) represents the "open region." The problem is that this allows only the function F(x) = 1_{x ∈ O} as integrand in the mean field approach, trying to emulate the classical 0-1 gating process, which can be recovered as a degenerate limit of a suitable smooth function (see the previous comment). Thus, in this work a variety of smooth functions F can be considered as integrands in the mean field approach in order to describe the probability of the open gate state, or the "up" voltage sensor state. Also, the smoothness of F places our model in a more familiar structure, as pictured in (11), with respect to nonlinear stochastic differential equations such as (1). Under this nonideal setting, a propagation of chaos result was proposed in [13], but only locally in time. In this work, we can develop a propagation of chaos result in a very classical way (see, e.g., Theorem 1.4 in [10] and Theorem 2.3 in [11]) due to the smoothness of F, although an additional step is needed in order to overcome the problem that our drift is locally Lipschitz and not globally Lipschitz, as assumed in the above references (see Theorem 7 below).
For our model, systems such as (5) arise in order to approximate nonlinear equations such as (11). Thus, our paradigm is that we recover a Hodgkin-Huxley-like model when propagation of chaos holds. That is the main difference between the setting of PDMPs approximating the classical Hodgkin-Huxley equations and the proposed setting of an interacting particle system approximating a Hodgkin-Huxley-like model, as defined in the previous subsection: propagation of chaos is the tool that we have to use in order to show consistency.
So, consider N gates and let the processes (V^N, X^{1,N}, ..., X^{N,N}) be given by

dV_t^N/dt = f(V_t^N, (1/N) Σ_{i=1}^N F(X_t^{i,N})),
dX_t^{i,N} = (−U'(X_t^{i,N}) + h(V_t^N)) dt + σ dW_t^i,  i = 1, ..., N,

where the Brownian motions W^1, ..., W^N are iid. Here, each X^{i,N} represents the voltage sensor associated with the i-th gate, the sensors being dependent on each other through the interaction function h(V^N). Thus, propagation of chaos will ensure that the voltage sensors become asymptotically independent of each other, implying that all the gates become independent of each other, as in the macroscopic case. Note that our approximating system lives in D × R^N while the ideal system lives in D × R. That is because, in the limit of infinitely many gates, it is sufficient to consider only one representative ideal gate (voltage sensor) to describe the law ν. Before establishing this fact, we have to show that our approximating system is well posed. Proof. In this proof we proceed similarly as in Proposition 1, because our definition of recurrence was set for one dimension only.
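A toy numerical version of this N-sensor approximating system can be sketched as follows; the functions f, h, and F below are illustrative choices (not the ones singled out by assumptions (A.1)-(A.3)), picked only so that the coupled dynamics are stable:

```python
import numpy as np

def hh_like_particles(n=100, t_max=5.0, dt=1e-3, sigma=0.4, seed=3):
    """Toy interacting particle approximation of the Hodgkin-Huxley-like
    system: n voltage sensors in a tilted double well, coupled to a
    scalar voltage through the empirical open-probability."""
    rng = np.random.default_rng(seed)
    h = lambda v: 0.3 * np.tanh(v)             # voltage-dependent tilt
    F = lambda x: 1.0 / (1.0 + np.exp(-4 * x)) # smooth "open" probability
    v = 0.5
    x = rng.normal(0.0, 0.5, n)
    for _ in range(int(t_max / dt)):
        p_open = F(x).mean()                   # empirical-measure term
        v += dt * (-v + p_open)                # toy linear voltage drift f
        x += dt * (x - x**3 + h(v)) + sigma * np.sqrt(dt) * rng.normal(size=n)
    return v, x

v_final, sensors = hh_like_particles()
```

In this sketch the voltage relaxes toward the empirical open-probability while each sensor diffuses in the well structure tilted by h(V), mirroring the two coupled levels of the model.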