
From economic inequality and species diversity to power laws and the analysis of multiple trends and trajectories, diversity within systems is a major issue for science. Part of the challenge is measuring it. Shannon entropy H is one of the most widely used measures for doing so.

Statistical distributions play an important role in any branch of science that studies systems comprised of many similar or identical particles, objects, or actors, whether material or immaterial, human or nonhuman. One of the key features that determines the characteristics and range of potential behaviors of such systems is the degree and distribution of diversity, that is, the extent to which the components of the system occupy states with similar or different features.

As Page outlined in a series of inquiries [

At the outset, motivated by examples of measuring diversity in ecology and evolutionary biology from [

First, in terms of definitions, we follow the ecological literature, defining a diversity index as a quantitative measure that reflects how many different types a system contains and how evenly its components are distributed across those types.

In turn, the true diversity of a distribution is the number of equally abundant types that would be needed to produce the same value of a given diversity index.

More specifically, as we will see later in the paper, we define the diversity of a probability distribution as the number of equivalent equiprobable types required to maintain the same amount of Shannon entropy H as the original distribution.
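This definition can be sketched numerically. The following illustration (our own, not taken from the paper's calculations) computes the true diversity D = e^H of a discrete distribution and checks it against the equiprobable case:

```python
import math

def shannon_entropy(p):
    """Shannon entropy H = -sum(p_i * ln p_i), in nats."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def true_diversity(p):
    """Number of equiprobable types with the same Shannon entropy: D = e^H."""
    return math.exp(shannon_entropy(p))

# Four equiprobable types have diversity exactly 4 ...
print(true_diversity([0.25, 0.25, 0.25, 0.25]))
# ... while a skewed distribution over four types has diversity below 4.
print(true_diversity([0.7, 0.1, 0.1, 0.1]))
```

In other words, D answers the question "how many equally likely types would this distribution be worth?", which is the sense of diversity used throughout.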

Recently, we have introduced a novel approach to representing diversity within statistical distributions [

After developing the concept and formalism for case-based entropy for discrete distributions [

In the following, we continue to explore the use of case-based entropy in comparing systems described by statistical distributions. However, we now go beyond our prior work in the following ways. First, we extend the formalism in order to compute case-based entropy for continuous as well as discrete distributions. Second, we broaden our focus from complexity/complex systems to diversity in general.

Third, the discrete indices we used had a degree of subjectivity to them; for example, how should household income be binned, and what influence does that have on the distribution of diversity? As such, we wanted to see how well a continuous formulation, which removes the need for binning, agrees with the discrete results.

Fourth, we had not emphasized how case-based entropy places very different distributions on a single common scale of comparison.

Hence, the purpose of the current study, as a demonstration of the utility of case-based entropy, is to compute and compare the distribution of diversity for several well-known statistical distributions from physics.

The quantity we propose is motivated by two observations.

First, it is well known that, on a finite measure space, the uniform distribution maximizes entropy: that is, the uniform distribution has the maximal entropy among all probability distributions on a set of finite Lebesgue measure [
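This maximality is easy to check numerically. The sketch below (our own illustration of the standard fact) draws random distributions on a fixed support and verifies that none exceeds the uniform entropy ln(n):

```python
import math
import random

def shannon_entropy(p):
    """Shannon entropy H = -sum(p_i * ln p_i), in nats."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

n = 8
h_uniform = shannon_entropy([1.0 / n] * n)  # equals ln(n)

# No randomly drawn distribution on n points exceeds the uniform entropy.
random.seed(0)
for _ in range(1000):
    w = [random.random() for _ in range(n)]
    total = sum(w)
    p = [wi / total for wi in w]
    assert shannon_entropy(p) <= h_uniform + 1e-12

print("uniform entropy ln(8) =", h_uniform)
```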

Second, a Shannon-equivalent uniform distribution will, by definition, count the number of values (or range of values) of the variable needed to reproduce the same amount of Shannon entropy.

Hence, the uniform distribution renormalizes the effect of varying relative frequencies (or probabilities) of occurrence of the values of the variable.

This calculation (as we have shown elsewhere [

Since the result is a pure number, regardless of the scale and units of the original distribution, distributions of very different kinds can be compared directly.

In [

Our impetus for making an advance over the Shannon entropy H was the following.

Given the probability density function p(x) of a continuous random variable x, the Shannon entropy is H = -∫ p(x) ln p(x) dx.

The problem, however, with the Shannon entropy index H is that it is expressed in abstract units (nats or bits) and does not scale intuitively with the number of types present.

The utility of the true diversity D = e^H is that it counts types directly: a distribution with D = 10 is exactly as diverse as one with ten equally probable types.

In order to measure the distribution of diversity, we next need to determine the fractional contribution to overall diversity up to a cumulative probability c.

The value of c ranges from 0 to 1 as we move through the values of the distribution from lowest to highest.

We can then simply calculate the fractional diversity contribution or case-based entropy as C_c = D_c/D, the ratio of the true diversity of the conditional distribution up to c to the true diversity of the whole distribution.
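A discrete sketch of this calculation (our own illustration: here the fractional contribution is taken as the ratio of the conditional true diversity up to c to the total true diversity, with types in a fixed order; this is an illustrative stand-in, not necessarily the paper's exact formula):

```python
import math

def true_diversity(p):
    """D = e^H: the number of equiprobable types with the same Shannon entropy."""
    return math.exp(-sum(pi * math.log(pi) for pi in p if pi > 0))

def case_based_entropy_curve(p):
    """Return (c, C_c) pairs for a discrete distribution p.

    Illustrative choice: C_c = D_c / D, where D_c is the true diversity of the
    conditional distribution over the first k types and D that of the whole."""
    D = true_diversity(p)
    curve = []
    for k in range(1, len(p) + 1):
        c = sum(p[:k])
        conditional = [pi / c for pi in p[:k]]
        curve.append((c, true_diversity(conditional) / D))
    return curve

# For a uniform distribution the curve is the 45-degree line: C_c = c.
print(case_based_entropy_curve([0.25, 0.25, 0.25, 0.25]))
```

With this choice, a uniform distribution traces the diagonal (equal shares of cases contribute equal shares of diversity), while skewed distributions fall below it.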

It is at this point that the renormalization (

To check the validity of our formalism, we calculate C_c for several well-known distributions from statistical physics.

With our formulation of C_c in hand, we now turn to these examples.

We first illustrate our renormalization by applying it to a relatively simple case: that of an ideal gas at temperature T, whose particle energies follow the Maxwell-Boltzmann distribution.

The entropy of this distribution can be computed in closed form.

The cumulative probability c is obtained by integrating the energy density from zero up to a given energy.

Hence, C_c can be expressed explicitly in terms of c.

Equation (

We note that, in (

We now turn to the calculation of C_c for the ideal gas in three dimensions.

The cumulative probability c is again obtained by integrating the energy density.

As we would hope, (

However, it is difficult to solve (

Thus, we evaluate C_c numerically in this case.

Although the temperature independence of this distribution is not immediately evident from the figure, it follows from the fact that a change of temperature merely rescales the energy axis.
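The temperature independence can be verified numerically: changing T rescales the energy axis, which leaves the diversity curve unchanged. A sketch (assuming the 3D Maxwell-Boltzmann energy density shape p(E) ∝ sqrt(E) exp(-E/kT) and the illustrative ratio C_c = D_c/D used above):

```python
import math

def diversity_curve(f, emax, n=500):
    """Discretize an (unnormalized) energy density f on [0, emax] into n
    equal-width bins and return (c, C_c) pairs, with C_c sketched as D_c / D."""
    de = emax / n
    w = [f((i + 0.5) * de) for i in range(n)]
    total = sum(w)
    p = [x / total for x in w]
    D = math.exp(-sum(q * math.log(q) for q in p if q > 0))
    curve, c, S = [], 0.0, 0.0
    for q in p:
        if q > 0:
            c += q
            S += -q * math.log(q)
            # Conditional entropy up to c: H_c = S / c + ln(c); D_c = exp(H_c).
            curve.append((c, math.exp(S / c + math.log(c)) / D))
    return curve

def mb_3d(E, kT):
    """3D Maxwell-Boltzmann energy density shape: sqrt(E) * exp(-E / kT)."""
    return math.sqrt(E) * math.exp(-E / kT)

curve_T = diversity_curve(lambda E: mb_3d(E, 1.0), 30.0)
curve_2T = diversity_curve(lambda E: mb_3d(E, 2.0), 60.0)  # energy range rescaled with kT
```

Doubling kT while doubling the energy range reproduces the same bin probabilities, so the two curves coincide point for point.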

In addition, it is worth noting that, as we might expect, adding more degrees of freedom increases the average energy: by equipartition each degree of freedom contributes kT/2, so the three-dimensional gas has three times the average energy of the one-dimensional gas.

We now move on to consider the second of our example distributions. The Bose-Einstein distribution gives the energy probability density function for massive bosons above the Bose temperature T_B, in which case the density is proportional to sqrt(E)/(exp((E - μ)/kT) - 1), with chemical potential μ < 0.

For massless bosons such as photons, the energy probability density function is instead proportional to E^2/(exp(E/kT) - 1) [

The conditional probabilities, conditional entropies, true diversities, and case-based entropies for these distributions cannot be calculated analytically but can be calculated numerically. The results of such calculations, performed in MATLAB, are shown in Figure
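Such a numerical computation can be sketched in Python as well (our own illustration, not the paper's MATLAB code). The densities below are standard textbook forms; the value chosen for the chemical potential μ of the massive boson is an arbitrary assumption for illustration, and C_c = D_c/D is again the illustrative stand-in:

```python
import math

def diversity_curve(f, emax, n=2000):
    """(c, C_c) pairs from discretizing density f on [0, emax]; C_c sketched as D_c / D."""
    de = emax / n
    w = [f((i + 0.5) * de) for i in range(n)]
    total = sum(w)
    p = [x / total for x in w]
    D = math.exp(-sum(q * math.log(q) for q in p if q > 0))
    out, c, S = [], 0.0, 0.0
    for q in p:
        if q > 0:
            c += q
            S += -q * math.log(q)
            out.append((c, math.exp(S / c + math.log(c)) / D))
    return out

kT = 1.0
mu = -0.5 * kT  # assumed negative chemical potential (above the Bose temperature)

def massive_boson(E):
    """Massive boson energy density shape: sqrt(E) / (exp((E - mu)/kT) - 1)."""
    return math.sqrt(E) / (math.exp((E - mu) / kT) - 1)

def photon(E):
    """Photon energy density shape: E^2 / (exp(E/kT) - 1)."""
    return E ** 2 / (math.exp(E / kT) - 1)

def C_at(curve, c0):
    """C_c at the first point where the cumulative probability reaches c0."""
    return next(C for c, C in curve if c >= c0)

helium_curve = diversity_curve(massive_boson, 25.0)
photon_curve = diversity_curve(photon, 25.0)
```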

As with the Boltzmann distributions, we find that the distributions of diversity for the two boson systems are independent of temperature. Although the curves for the two types of boson are very similar, it is evident that the distributions of diversity do differ to some extent. For helium-4 bosons, a slightly larger fraction of particles are contained in lower diversity energy states than is the case for photons, with the C_c curve for helium lying slightly below that for photons.

The final distribution we use to illustrate our approach is the Fermi-Dirac distribution, for which the energy probability density function is proportional to sqrt(E)/(exp((E - μ)/kT) + 1), where μ is the chemical potential.

The Fermi-Dirac distribution differs from the previous examples in that it is not simply scaled by changes in energy. Instead, its shape changes, transforming from a skewed-left distribution, with a sharp cut-off at the Fermi energy at low temperatures, to a smooth, skewed-right distribution at high temperatures. Thus, unlike the situation for Boltzmann and Bose-Einstein distributions, one would expect the distributions of diversity for fermions such as electrons to be dependent on temperature. The next figure shows that this is indeed the case.
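This expected temperature dependence can be checked numerically. The sketch below (again with the illustrative C_c = D_c/D, an assumed chemical potential held fixed at the Fermi energy, and arbitrary units) compares a cold and a hot Fermi gas:

```python
import math

def diversity_curve(f, emax, n=1000):
    """(c, C_c) pairs from discretizing density f on [0, emax]; C_c sketched as D_c / D."""
    de = emax / n
    w = [f((i + 0.5) * de) for i in range(n)]
    total = sum(w)
    p = [x / total for x in w]
    D = math.exp(-sum(q * math.log(q) for q in p if q > 0))
    out, c, S = [], 0.0, 0.0
    for q in p:
        if q > 0:
            c += q
            S += -q * math.log(q)
            out.append((c, math.exp(S / c + math.log(c)) / D))
    return out

def fermi_dirac(mu, kT):
    """Electron energy density shape: sqrt(E) / (exp((E - mu)/kT) + 1)."""
    def f(E):
        x = (E - mu) / kT
        return 0.0 if x > 700 else math.sqrt(E) / (math.exp(x) + 1)  # avoid overflow
    return f

def C_at(curve, c0):
    """C_c at the first point where the cumulative probability reaches c0."""
    return next(C for c, C in curve if c >= c0)

cold = diversity_curve(fermi_dirac(1.0, 0.02), 50.0)  # kT << E_F: sharp cut-off
hot = diversity_curve(fermi_dirac(1.0, 10.0), 50.0)   # kT >> E_F: Boltzmann-like
```

Unlike the Maxwell-Boltzmann case, the cold and hot curves differ markedly, reflecting the change in the shape of the distribution itself.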

Diversity curves for sodium electrons at a range of temperatures.

This figure shows that the degree of diversity is highest for fermions at low temperatures.

With our renormalization complete for all three distributions, we sought next to demonstrate, albeit somewhat superficially, the utility of case-based entropy for comparing otherwise very different systems.

In the figures below, we compare the energy density curves of the different systems at selected temperatures.

Energy density curves.

Energy density curves for Maxwell-Boltzmann 3D, Bose-Einstein Helium, Bose-Einstein Photon, and Fermi-Dirac Na 6000 K

Energy density curves for Maxwell-Boltzmann 3D, Bose-Einstein Helium, Bose-Einstein Photon, and Fermi-Dirac Na 15,000,000 K

However, comparison of the diversity distributions suggests that even when the energy probability density functions appear to coincide, significant physical differences remain between the systems. The figure below superposes the diversity curves of all the systems.

Superposition of all diversity curves for Boltzmann 1D, Boltzmann 3D, Bose-Einstein Helium, Bose-Einstein Photon, and Fermi-Dirac Na at 2.7 K, 300 K, 6000 K, and 15,000,000 K.

It is clear from this superposition that the diversity curves of the different systems remain distinguishable.

Thus, the transformation from the usual probability distribution to a distribution of case-based entropy (C_c as a function of the cumulative probability c) preserves physical distinctions that the energy densities alone can obscure.

As we have hopefully shown in this paper, while Shannon entropy H is a valuable starting point for measuring diversity, it has important limitations.

To address these limitations, we introduced a renormalization of probability distributions based on the notion of a Shannon-equivalent uniform distribution.

With our conceptualization of case-based entropy in hand, we were able to compute and compare the distribution of diversity for the Maxwell-Boltzmann, Bose-Einstein, and Fermi-Dirac distributions.

The renormalization obtained will have a different shape for different distributions. In fact, a bimodal, right-skewed, or other kind of distribution will each lead to a different curve of C_c versus c.

The authors declare that there are no conflicts of interest regarding the publication of this paper.

The authors would like to thank their colleagues at Kent State University.