
Neurons are specialized, electrically excitable cells which use electrical and chemical signals to transmit and process information. Understanding how the cooperation of a great many neurons in a grid may modify, and perhaps improve, the quality of information, in contrast to few neurons in isolation, is critical for the rational design of cell-material interfaces for applications in regenerative medicine, tissue engineering, and personalized lab-on-chip devices. In the present paper, we couple an integrate-and-fire model with information-theoretic variables to analyse the extent of information in a network of nerve cells. We provide an estimate of the information in the network, in bits, as a function of cell density and short-term depression time. In the model, neurons are connected through a Delaunay triangulation of non-intersecting edges; in doing so, the number of connecting synapses per neuron is approximately constant, reproducing the early stage of network development in planar neural cell cultures. In simulations where the number of nodes is varied, we observe an optimal value of cell density for which the information in the grid is maximized. In simulations in which the post-transmission latency time is varied, we observe that information increases as the latency time decreases and, for specific configurations of the grid, is greatly enhanced by a resonance effect.

Networks of nerve cells are complex systems in which a large number of components combine to yield collective phenomena with abilities beyond those of the simple components of that system [. In the human brain, approximately 10^{12} neurons cluster in three-dimensional architectures. The unprecedented functions of the human brain, including self-consciousness, language, and the development of memory, may depend less on the specialization of individual neurons and more on the fact that a large number of them interact in a complex network [

The exchange of information between individual neurons is mediated by a cascade of electrical and chemical signals which travel across the gap (synaptic cleft, approximately 20 nm wide) between those neurons [. The arrival of an action potential at the presynaptic terminal opens voltage-gated Ca^{2+} channels. Ca^{2+} ions entering nerve terminals trigger the rapid release of vesicles containing neurotransmitter, which is ultimately detected by receptors on the postsynaptic cell [

Here, we revise the integrate-and-fire model in the version proposed by De La Rocha and Parga [

We consider

A random distribution of nodes in a plane (a); nodes are connected through a Delaunay triangulation, which guarantees that the number of edges varies linearly with the number of nodes (b); the resulting graph after removal of edges longer than a cut-off distance, here 0.2 times the length of the lattice, so that all remaining internodal distances are smaller than the cut-off (c).
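The construction above can be sketched in a few lines, using `scipy.spatial.Delaunay` as a stand-in for the triangulation routine of the paper; the unit-square lattice, the number of nodes, and the random seed are illustrative assumptions, with the cut-off set to 0.2 times the lattice length as in the figure.

```python
import numpy as np
from scipy.spatial import Delaunay

rng = np.random.default_rng(0)
L = 1.0                             # length of the grid
cutoff = 0.2 * L                    # cut-off distance
points = rng.random((100, 2)) * L   # random distribution of nodes in a plane

# collect the unique edges of the Delaunay triangulation
tri = Delaunay(points)
edges = set()
for simplex in tri.simplices:       # each simplex is a triangle (i, j, k)
    for a, b in ((0, 1), (1, 2), (0, 2)):
        i, j = sorted((int(simplex[a]), int(simplex[b])))
        edges.add((i, j))

# prune edges against the cut-off: only short-range connections survive,
# mimicking the limited reach of developing neurites
kept = [(i, j) for i, j in edges
        if np.linalg.norm(points[i] - points[j]) < cutoff]
```

Because the Delaunay triangulation has a number of edges linear in the number of nodes, the mean degree of the pruned graph stays approximately constant as nodes are added.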

Each node in the network sends and receives information; this process is mediated by the integrate-and-fire model and (

Nodes are divided into firing nodes, which deliver a signal, and target nodes, which receive one: a target neuron can receive current pulses from multiple sources; in turn, a firing neuron can deliver current pulses to multiple targets (a). The sum of multiple stimuli modifies the potential across a neuron's membrane until it surpasses a limiting, or threshold, value: in this circumstance, the neuron generates an action potential (b).
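The membrane dynamics just described can be sketched as a leaky integrate-and-fire update in discrete time, using the symbols listed in the Nomenclature (capacitance C, conductance g, resting potential V_rest, threshold V_th); the parameter values, the forward-Euler scheme, and the constant stimulating current are illustrative assumptions, not those used in the paper.

```python
C, g = 1.0, 0.1              # capacitance and conductance of the membrane (illustrative)
V_rest, V_th = -70.0, -54.0  # resting and threshold potentials (mV)
dt, T = 0.1, 200.0           # time step and total simulated time (ms)

def simulate(I):
    """Integrate C dV/dt = -g (V - V_rest) + I(t); whenever V crosses the
    threshold V_th, record a spike and reset the potential to V_rest."""
    V, spikes = V_rest, []
    for step in range(int(T / dt)):
        t = step * dt
        V += dt / C * (-g * (V - V_rest) + I(t))
        if V >= V_th:          # threshold crossing: action potential
            spikes.append(t)
            V = V_rest         # reset after firing
    return spikes

# a constant suprathreshold current drives the neuron to fire periodically
spikes = simulate(lambda t: 2.0)
```

With these values the steady-state potential V_rest + I/g = -50 mV lies above threshold, so the neuron fires a regular spike train; a subthreshold current would produce no spikes at all.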

The temporal sequence of pulses or spikes which propagates along the grid encodes the information transmitted over that grid, which can be represented through Shannon's information entropy alone [

The entire grid is stimulated with a signal that can be random (a) or periodic (b). From a long random sample of stimuli, one may derive the total entropy of the spike train. Similarly, the noise entropy measures the variability of the spike train in response to repeated presentations of the same stimulus. Information is defined as the difference between the total entropy and the noise entropy.

Entropy is a measure of the variability of a signal. In each neural site, the signal is registered as a sequence of 0/1 binary events: signals are encoded in bits (a). The number of occurrences of a word in a sequence yields the probability that the word is generated; from this probability, entropy may be derived (b).
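A minimal sketch of this estimate: the binned spike train is sliced into words and the Shannon entropy is computed from the empirical word probabilities. The word length of three bins and the use of non-overlapping words are assumptions for illustration.

```python
import numpy as np
from collections import Counter

def shannon_entropy(bits, word_len=3):
    """Entropy, in bits per word, of a binary sequence, estimated from the
    empirical probabilities of non-overlapping words of word_len bins."""
    words = [tuple(bits[i:i + word_len])
             for i in range(0, len(bits) - word_len + 1, word_len)]
    counts = Counter(words)
    n = sum(counts.values())
    p = np.array([c / n for c in counts.values()])
    return float(-(p * np.log2(p)).sum())

# a silent site (all zeros) carries zero entropy; a richer repertoire of
# words raises the entropy, up to word_len bits per word
```

The information at a site is then obtained as the total entropy (varied stimuli) minus the noise entropy (repeated stimulus), both computed with this estimator.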

Information is derived in each node of the network. Here, we present information as circles or spheres whose diameter is proportional to the total information conveyed through a node over time.

We consider a grid composed of

Figure

Information delivered through individual nodes can be integrated over a circumference of radius

The information transported at a distance

The overall information

The information integrated over the entire grid as a function of number of neurons

The position of the center of mass of information in the network as a function of the number of neurons. It represents the distance at which information is transmitted. Depending on the effective links among neurons in a grid, there exists an optimal value of

Here, we present results of simulations in which the refractory time

Density of information in a grid of neurons as a function of the number of neurons in the grid, for different refractory times: the shorter the refractory time of spiking neurons, the more information neurons can convey in a grid (a). Position of the center of mass of information in a grid as a function of the number of neurons in the grid, for different refractory times: a combination of network topology and physical characteristics of individual neurons may exist for which the efficiency of transport is greatly enhanced in a resonance effect (b).

The information that travels through the grid occupies successive sites of the grid in a discrete sequence of steps (a). The timing between successive steps yields the frequency of the grid, in contrast to the spiking frequency of individual neurons (b). If the number of steps in the grid assumes an integer value, the spiking frequency of the neuron is an integer multiple of the frequency of the grid; this generates cumulative amplification of the signal transmitted over the grid.
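The resonance condition just stated reduces to an integer-multiple test between the two frequencies; the numerical values below are illustrative, not taken from the simulations.

```python
def is_resonant(f_neuron, f_grid, tol=1e-9):
    """True when the spiking frequency of the neuron is an integer multiple
    of the frequency of the grid: the condition for cumulative amplification."""
    ratio = f_neuron / f_grid
    return abs(ratio - round(ratio)) < tol

# e.g. a 50 Hz neuron on a 10 Hz grid satisfies the condition,
# while the same neuron on a 12 Hz grid does not
```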

The presented results indicate that, in a network of nerve cells, the information transmitted over the network depends on the absolute number of cells in the grid and, for intermediate values of neural density, reaches a maximum. Information is herein represented as the total information (information quantity), that is, the information integrated over all the nodes in the grid, and as the density of information, that is, the total information divided by the number of nodes in the network. Moreover, we provide an estimate of the position of the center of mass of information in the net, that is, the distance over which information is transported; the larger this distance, the higher the efficiency of the grid (information quality).
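The three measures just defined can be sketched as follows, with placeholder node positions and per-node information values (the first node is taken as the central, stimulated node of the Nomenclature); the center of mass of information is computed here as the information-weighted mean radial distance, an assumption consistent with the definitions above.

```python
import numpy as np

positions = np.array([[0.5, 0.5], [0.3, 0.5], [0.5, 0.8], [0.9, 0.1]])
info = np.array([4.0, 2.0, 2.0, 0.5])   # information (bits) conveyed per node
center = positions[0]                    # central, stimulated node

r = np.linalg.norm(positions - center, axis=1)  # radial distance of each node
total_info = info.sum()                         # information quantity
density = total_info / len(info)                # density of information
r_cm = (info * r).sum() / total_info            # center of mass of information
```

A grid that pushes information to distant nodes yields a large r_cm (high quality of information), while a grid whose information stays near the stimulated node yields a small one, even at equal total information.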

Using mathematical modelling and computer simulations (in which an integrate-and-fire model is coupled to a discrete, Shannon-entropy-based description of information in bits), we found that the quantity, density, and quality of information depend on the cooperation of neurons in a grid and on the topology of the network. While the information quantity increases as a monotonic function of the number of cells

These findings are in qualitative agreement with previously reported experiments. Writing in PLoS ONE, Biffi and colleagues [

Our results deserve further discussion. In the simulations, neurons in a plane are connected through a Delaunay triangulation of non-intersecting edges, which guarantees that the number of neurites per neuron is approximately constant and lower than

Consider a certain number of nerve cells seeded on a flat planar surface. Those cells would be uniformly distributed, and thus the cell-cell distance would depend on the number of cells and the cell density in the culture. If the cell density is sufficiently large, the internodal distance is small and neurons will develop connecting synapses. Under these conditions, our model indicates that any further increase in cell density would degrade information. Thus, such an increase would be prevented, and cells would neither proliferate nor migrate on the surface. This prediction is consistent with a number of experiments [

Membrane potential

Capacitance of the membrane

Conductance of the membrane

Resistance of the membrane

Time constant

Resting potential

Threshold potential

Stimulating current, which is a function of time

Action potential

Maximum number of neurons in the grid

Length of the grid

Set of neurons in the grid

Subset of

Number of edges of the Delaunay triangulation of

Degree of the graph

Adjacency matrix

Distance between

Cut-off distance

Internodal distance

Current pulse

Timing of individual pulses

Number of neurotransmitter release events

Dirac delta function

Heaviside function

Attenuation

Refractory or resting time

Time bin

Total entropy

Noise entropy

Temporal window in which the signal is transmitted

Word, sequence of binary events in

Probability of a word

Information

Radial distance from the central node in a network

Information transmitted at a generic distance

Total information integrated over the entire grid

Information density

Center of mass of information

Amplification factor

Frequency

Number of steps in a grid.

The authors declare that they have no competing interests.

This work has been partially funded by the Italian Ministry of Health (Project no. GR-2010-2320665).