Historical Perspective: Boltzmann's versus Planck's State Counting—Why Boltzmann Did Not Arrive at Planck's Distribution Law

Why does Planck (1900), referring to Boltzmann’s 1877 probabilistic treatment, obtain his quantum distribution function while Boltzmann did not? To answer this question, both treatments are compared on the basis of Boltzmann’s 1868 three-level scheme (configuration—occupation—occupancy). Some calculations by Planck (1900, 1901, and 1913) and Einstein (1907) are also sketched. For obtaining a quantum distribution, it is crucial to stick with a discrete energy spectrum and to make the limit transitions to infinity at the right place. For correct state counting, the concept of interchangeability of particles is superior to that of indistinguishability.


Introduction
Very recently, Sharp and Matschinsky have translated and commented on Boltzmann's famous 1877 paper [1], "On the Relationship between the Second Fundamental Theorem of the Mechanical Theory of Heat and Probability Calculations regarding the Conditions for Thermal Equilibrium" [2]. As a matter of fact, they have done a great service to the scientific community 1 .
Barely any of Boltzmann's original scientific work is available in translation. This is remarkable given his central role in the development of both equilibrium and non-equilibrium statistical mechanics, his statistical mechanical explanation of entropy, and our understanding of the Second Law of thermodynamics. What Boltzmann actually wrote on these subjects is rarely quoted directly, his methods are not fully appreciated, and key concepts have been misinterpreted. Yet his work remains relevant today. (Ibid., pp. 1971f.) The paper "exemplifies several of Boltzmann's most important contributions to modern physics. These include (1) the eponymous Boltzmann distribution, relating the energy scaled by the mean kinetic energy (temperature). . .
(2) Much of the theoretical apparatus of statistical mechanics is developed with great clarity. . . His terminology . . . is incisive, in some ways superior to the two modern terms macro-state and micro-state. . .
(4) . . . Boltzmann also clearly demonstrates that there are two distinct contributions to entropy, arising from the distribution of heat (kinetic energy) and the distribution in space of atoms or molecules. . . It is fitting that Boltzmann was the one to discover the third fundamental contribution to entropy, namely radiation, by deriving the Stefan-Boltzmann Law [3]. . ." (ibid., pp. 1972f.) (5) Boltzmann's "permutability measure," Ω (3/2 of Clausius' entropy), is constructed as an extensive quantity. "Thus Boltzmann never encounters the apparent Gibbs paradox for the entropy of mixing of identical gases. Furthermore, with Boltzmann's Permutabilitätmass method for counting states, there is no need for a posteriori division by N! to 'correct' the derivation using the 'somewhat mystical arguments of Gibbs 2 and Planck [4],' nor a need to appeal to quantum indistinguishability, which has been implausibly described as the appearance of quantum effects at the macroscopic classical level [5]. Subsequently, at least four distinguished practitioners of statistical mechanics have pointed out that correct counting of states à la Boltzmann obviates the need for the spurious indistinguishability/N! term: Ehrenfest and Trkal [6], Van Kampen [7], Jaynes [8], Swendsen [9] (and possibly Pauli [10] 3 ). This has had little impact on textbooks of statistical mechanics. 4 An exception is the treatise by Gallavotti [11]" (Ibid., p. 1974).
Indeed, Jaynes [8] argues similarly: "Some important facts about thermodynamics have not been understood by others to this day, nearly as well as Gibbs understood them over 100 years ago [12]. . . For 80 years it has seemed natural that, to find what Gibbs had to say about this, one should turn to his Statistical Mechanics. For 60 years, textbooks and teachers (including, regrettably, the present writer) have impressed upon students how remarkable it was that Gibbs, already in [13], had been able to hit upon this paradox which foretold, and had its resolution only in, quantum theory with its lore about indistinguishable particles, Bose and Fermi statistics, etc. 5 It was therefore a shock to discover that. . . Gibbs [in "Heterogeneous Equilibrium"] displays a full understanding of this problem, and disposes of it without a trace of that confusion over the 'meaning of entropy' or 'operational distinguishability of particles' on which later writers have stumbled. He goes straight to the heart of the matter as a simple technical detail, easily understood as soon as one has grasped the full meanings of the words 'state' and 'reversible' as they are used in thermodynamics. In short, quantum theory did not resolve any paradox, because there was no paradox. . . .
Today, the universally taught conventional wisdom holds that 'Classical mechanics failed to yield an entropy function that was extensive, and so statistical mechanics based on classical theory gives qualitatively wrong predictions of vapor pressures and equilibrium constants, which was cleared up only by quantum theory in which the interchange of identical particles is not a real event.' We argue that, on the contrary, phenomenological thermodynamics, classical statistics, and quantum statistics are all in just the same logical position with regard to extensivity of entropy; they are silent on the issue, neither requiring it nor forbidding it." Last but not least, Boltzmann's statistical definition of entropy is the first one which applies to nonequilibrium states, thus "opening the door to the statistical mechanics of non-equilibrium states and irreversible processes" [2, p. 1974].
In this paper, I will concentrate on Boltzmann's manner of state counting and its consequences for classical and quantum statistics. Boltzmann accounts for the interchangeability of equal particles; nevertheless, he does not obtain Planck's distribution function, while Planck, starting with similar probabilistic settings [14, 15] or even with the same setting [4], did.
Moreover, I will sketch some calculations by Einstein [16]. In Einstein's pioneering paper about the specific heat of solids, Planck's distribution law emerges from the discreteness of the energy spectrum of Planck's resonators, when compared with the continuous energy spectrum of classical resonators. From this, Einstein concludes quantization to be a selection problem of states.

Level 1: Configurations.
A configuration is the most detailed description of the distribution of the particles over the cells. For each particle, i (i = 1, 2, ..., N), it provides the number, j_i (j_i = 1, 2, ..., K), of the cell in which it is located. This can be realized by means of a matrix, M, where M_ij = 1 (0) if particle i is (is not) in cell j. This matrix can be condensed into the vector j = (j_1, j_2, j_3, ..., j_N), where j_i is the number of the cell in which particle i is located. There are altogether K^N different configurations [17, p. 58].
The configuration is a complete description.However, since the particles are identical, this description is redundant when applied to physical systems like monoatomic gases.The interchange of two atoms does not change the physical properties of the gas.This fact is accounted for in levels 2 and 3.

Level 2: Occupation Numbers.
Occupation numbers represent a condensed description of the distribution of the particles over the cells. It removes the permutation redundancy in the configurations just mentioned: two configuration vectors, j, which contain the same numbers in merely different sequence, such as j = (1, 2, 2, 3) and j = (2, 1, 2, 3), are physically equivalent. In other words, what is relevant is not the complete information, which particle is in a given cell, but only the numbers of particles in the cells. The latter are recorded in the occupation number, k = (k_1, k_2, k_3, ..., k_K), where there are k_j particles in cell j (0 ≤ k_j ≤ N). The occupation numbers are invariant under any permutation of the particles [17, p. 59].
We have the obvious constraint k_1 + k_2 + ⋯ + k_K = N. Altogether, there are (N + K − 1)!/(N!(K − 1)!) different occupation numbers. We are now prepared to compare Boltzmann's and Planck's treatments on an equal footing, in particular, their basic entities, the "complexions."
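The two counting levels can be checked by a small brute-force enumeration. The following sketch uses the notation introduced above with illustrative values of N and K (my own choice, not from the paper):

```python
from itertools import product
from math import comb

# Illustrative values (not from the paper): N labelled particles, K cells.
N, K = 4, 3

# Level 1: configurations j = (j_1, ..., j_N), j_i = cell of particle i.
configurations = list(product(range(1, K + 1), repeat=N))
assert len(configurations) == K ** N          # K^N configurations

# Level 2: occupation numbers k = (k_1, ..., k_K), k_j = particles in cell j.
occupations = {tuple(c.count(cell) for cell in range(1, K + 1))
               for c in configurations}
# "Stars and bars": there are C(N + K - 1, K - 1) occupation numbers.
assert len(occupations) == comb(N + K - 1, K - 1)

print(len(configurations), len(occupations))  # 81 15
```

The run confirms the two closed-form counts: K^N configurations versus the much smaller number (N + K − 1)!/(N!(K − 1)!) of occupation numbers.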

Boltzmann's 1877 Manner of State Counting
Upon colliding, two molecules may exchange energy, but after the collision both molecules still have one of the above energies.
The total kinetic energy of the gas is L = λε = const.

The Kinetic
Then, w_k tells the number of cells hosting k particles. This is the component z_k of the occupancy number, z.
(B) In turn, one can reverse the role of molecules and energy portions and interpret that as the distribution of the n molecules over the p + 1 energy levels, 0, ε, 2ε, ..., pε, the probabilistic scheme being (N particles in K cells) = (n molecules on p + 1 energy levels).
Then, w_k is the number of particles in cell k. This is a component of the occupation number, k.
The set {w_0, w_1, w_2, ..., w_p} is subject to the constraints 6

w_0 + w_1 + ⋯ + w_p = n, (1877-1)
w_1 + 2w_2 + ⋯ + p w_p = λ, (1877-2)

the number of molecules in the gas and the total energy of the gas in units of ε. For case (A), they correspond to the constraints (8a) and (8b). For case (B), (1877-1) is identical to (4), while (1877-2) results additionally from the meaning of the cells as energy levels.

Complexions. Boltzmann continues, "As a preliminary, we will use a simpler schematic approach to the problem, instead of the exact case" (p. 1977). The energy levels are distributed in all possible ways among the n molecules, where L = λε = const. "Any such distribution, in which the first molecule may have a kinetic energy of, e.g., 2ε, the second may have 6ε, and so on, up to the last molecule, we call a complexion. . ." (p. 1977). In other words, a complexion is the set {k_1, k_2, ..., k_n}, where k_i is the energy of molecule i in units of ε (0 ≤ k_i ≤ p).
(A) Literally, this means the distribution of the p + 1 possible amounts of energy, 0, ε, ..., pε, over the n molecules, the probabilistic scheme being (N particles in K cells) = (p + 1 energy amounts over n molecules). Then, w_k means the number of cells hosting k particles. This is the component z_k of the occupancy number, z.
(B) In turn, one can reverse the role of molecules and energy levels, again, and understand that as the distribution of the n molecules over the p + 1 energy levels, 0, ε, 2ε, ..., pε, the probabilistic scheme being (N particles in K cells) = (n molecules on p + 1 energy levels). Then, k_i means that molecule i is in cell k_i. This is the component j_i of the configuration, j. Boltzmann's formula (1877-3) below supports this interpretation. "Since a distribution of states does not determine kinetic energies exactly, the goal is to describe the state distribution by writing as many zeros [0] as molecules with zero kinetic energy (w_0), w_1 ones [1] for those with kinetic energy ε, etc. All these zeros [0], ones [1], etc. are the elements defining the state distribution." (p. 1977)
For the reader's convenience, I add Table 2 with the corresponding values of P.
The rows have been regrouped along increasing value of P, in order to demonstrate the fact that sequences of k_i which differ just in the sequence of numbers have got the same probability, P. There is a "degeneracy" in that different sequences of k_i yield the same value of P, if the product w_0! w_1! ⋯ w_p! is the same; see (1877-3) below.

Calculation of the Number of Complexions, 𝑃. "It is now
immediately clear that the number P for each state distribution is exactly the same as the number of permutations of which the elements of the state distribution are capable, and that is why the number P is the desired measure of the permutability of the corresponding distribution of states. Once we have specified every possible complexion, we have also all possible state distributions. . ." In other words, it is not relevant which molecule has got which (kinetic) energy, but how many molecules have got a given amount of energy. The molecules having got the same energy are interchangeable, while molecules with different energies are not. 7 Thus, the number P is obtained through permutating the molecules as

P = n! / (w_0! w_1! w_2! ⋯ w_p!). (1877-3)

The denominator arises, "since of the n elements, w_0 are mutually identical. Similarly with the w_1, w_2, etc. elements" (p. 1979). Formula (1877-3) is isomorphic with formula (6) for the number of configurations for a given occupation number, k. This indicates that Boltzmann's "state distribution," {w_0, w_1, w_2, ..., w_p}, represents an occupation number, while his complexions are configurations.
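Boltzmann's counting can be verified by brute force for his own example of n = 7 molecules and λ = 7 energy quanta. The following sketch (in the notation used above; the unit ε plays no numerical role) enumerates all complexions, groups them into state distributions, and checks (1877-3):

```python
from itertools import product
from math import comb, factorial, prod
from collections import Counter

n, lam = 7, 7   # Boltzmann's example: 7 molecules, 7 energy quanta

# A complexion assigns molecule i the energy k_i (in units of eps),
# with k_1 + ... + k_n = lam.
complexions = [k for k in product(range(lam + 1), repeat=n) if sum(k) == lam]
assert len(complexions) == comb(n + lam - 1, n - 1)   # 1716 complexions

# The "state distribution" {w_0, ..., w_p}: w_k = molecules with energy k.
def state_distribution(k):
    c = Counter(k)
    return tuple(c.get(j, 0) for j in range(lam + 1))

counts = Counter(state_distribution(k) for k in complexions)
print(len(counts))   # 15 state distributions, as in Table 1

# (1877-3): each state distribution comprises P = n!/(w_0! w_1! ...)
# complexions.
def permutability(w):
    return factorial(sum(w)) // prod(factorial(wk) for wk in w)

assert all(permutability(w) == m for w, m in counts.items())
```

The enumeration reproduces the 15 state distributions of Table 1 and confirms that (1877-3) counts exactly the complexions (configurations) belonging to each state distribution (occupation number).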
However, Boltzmann has already made the limit transition p → ∞ here, since p ≫ L/ε. This contradicts the condition k_i ≤ p. Therefore, I will not go into more detail.
3.6. Entropy [4]. Boltzmann considered the entropy only for the continuum case. Therefore, I refer to Planck's lectures "Theory of Heat Radiation" 8 , in order to show that Boltzmann's formula (1877-3) yields an extensive entropy.
For large values of w_k and n, Stirling's formula allows for simplifying (1877-3). Here, S and U are the entropy and the internal energy of the oscillators, Λ is the intensity of the radiation entropy and K is the intensity of the radiation per polarization direction, ν is the radiation frequency, and c is the speed of light in vacuo; the index "0" indicates equilibrium values. This case leads to Wien's radiation formula [19, 20]. Its simplest generalization reads [20] (1900b-4). Using that relation and Wien's displacement law in the form S = f(U/ν) [20], formula (1900b-4) leads to "the two-parametric radiation formula" (Ibid.). It is in agreement with the then available experimental data, in particular concerning the deviations from Wien's radiation law. Notice that other radiation formulae of that time did so, too [21-23], [20, refs. 2, 4 and 5].
The "−1" in the denominator makes the difference to the formulae by Maxwell, Boltzmann, and Wien.

Planck's Step to Quantum Physics

Immediately after Planck's talk in October 1900, where he presented his novel formula (1900b-6), Rubens and Kurlbaum went to their closely located laboratory to verify his new formula and told him the following Sunday morning that it indeed fits their data clearly better than the formulae by Wien, Thiesen, and Lord Rayleigh ([24], Fig. 2; cf. also [25]). This brought Planck the most strenuous weeks of his life, spent searching for a physical justification of this formula. Having found no other way (although being an atomist, he worked solely on continuum theories), he eventually resorted to Boltzmann's 1877 probability approach. Thus, Planck [14] considers a closed system of linear monochromatic resonators weakly interacting with the electromagnetic radiation surrounding them. N resonators have got the frequency ν, N′ resonators have the frequency ν′, and so on, where N, N′, . . . ≫ 1. The question is how, in a stationary state, the total (field) energy, E_tot, is distributed among the resonators and the electromagnetic field between them (radiation).
Assume that the set of N resonators with frequency ν has got the field energy E, the set of N′ resonators with frequency ν′ the field energy E′, and so on. The field energy of all resonators is E_tot = E + E′ + E″ + ⋯. Now, one has to distribute the energy E among the N resonators of frequency ν, and so on. If E is a continuous quantity, there are infinitely many possibilities for that.
The following quotation describes Planck's step to quantum physics. "We consider, however, and this is the most essential point of the whole calculation, E to consist of a specific number of finite equal parts. For it we use the natural constant h = 6.55 ⋅ 10^−27 (erg⋅s). 11 This constant, multiplied with the common oscillation number (frequency), ν, yields the energy element, ε, in erg. Through division of E by ε we obtain the number, P, of energy elements which are to be distributed among the N resonators." [14]

4.3. Planck's 1900 Probabilistic Treatment. Referring to Boltzmann, Planck calls a "complexion" the concrete distribution of "energy elements" over resonators. For P = 100 "energy elements" on N = 10 resonators, Planck writes down the following example. Obviously, this complexion, that is, the set {P_1, P_2, ..., P_10} = {7, 38, ..., 5}, corresponds to the occupation number, k, in the probabilistic scheme (N particles in K cells) = (P "energy elements" in N resonators).
Two complexions are considered to be different, if the numbers in the second row are the same, but in different sequence.
Then, the number of different complexions for this kind of resonators equals (N ≫ 1, P ≫ 1)

R = (N + P − 1)! / ((N − 1)! P!). (20)

This formula is isomorphic with formula (5) for the number of occupation numbers for the same probabilistic scheme, (N particles in K cells) = (P "energy elements" in N resonators).
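Planck's count (20) is easy to check numerically. The sketch below evaluates it for Planck's own example and verifies it against a brute-force enumeration for a small case (the small case is my own addition):

```python
from itertools import product
from math import comb, factorial

# Planck's number of complexions for P energy elements on N resonators,
# "combinations with repetitions": R = (N + P - 1)! / ((N - 1)! P!).
def planck_complexions(N, P):
    return factorial(N + P - 1) // (factorial(N - 1) * factorial(P))

# Planck's 1900 example: P = 100 energy elements on N = 10 resonators.
print(planck_complexions(10, 100))

# Brute-force check for a small case: count the occupation numbers
# (P_1, ..., P_N) with P_1 + ... + P_N = P.
N, P = 4, 6
brute = sum(1 for k in product(range(P + 1), repeat=N) if sum(k) == P)
assert brute == planck_complexions(N, P) == comb(N + P - 1, P)  # 84
```

The brute-force count agrees with (20): Planck's R is the number of occupation numbers, not the number of configurations.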
Hence, Planck's complexions (occupation numbers) are not Boltzmann's complexions (configurations). Accordingly, Planck's "permutability measure" (20) is the number of occupation numbers, while Boltzmann's "permutability measure" (1877-3) is the number of configurations for a given occupation number. But this is not the key difference. We will see that there are other possibilities of differentiation between classical and quantum results. Since R is a relative probability (see Boltzmann above), the relative probability for all resonators of all frequencies equals the product R_tot = R ⋅ R′ ⋅ R″ ⋯. Then, Planck asks for the maximum value, R_0, of R_tot over all sets {E, E′, E″, . . .} obeying condition (1900c-1).
The energy density of the radiation outside the resonators, u, is determined by that of the oscillators. 13

Given E_tot, this determines the value of R_0, too. The temperature, T, is obtained "by means of a second natural constant, k = 1.346 ⋅ 10^−16 (erg/deg), through the equation 1/T = ∂S/∂U. The product k ln(R_0) is the entropy of the system of resonators; it is the sum of the entropies of all single resonators."

Planck's Radiation Formula II.
Then, a "hassle-free" calculation leads to an expression that corresponds exactly to the earlier spectral formula (1900b-6).
The corresponding calculations are not provided in Planck's talk [14]. I guess that he proceeded as follows. According to (1900c-4) and (1900c-6), the entropy follows from R_tot. On the other hand, the maximization of R_tot (replacing N, N′, . . . with N_ν, and E, E′, . . . with E_ν, and so on) under the condition (1900c-1) leads to an equation with a Lagrangian multiplier. This equation complies with (1900c-5) and (24). By definition, U_ν = E_ν/N_ν. In contrast to his 1900 talk, in his 1901 paper, Planck sets down immediately the entropy of the set of N resonators. This agrees with formula (35) below.

Equilibrium Entropy (Planck, 1913 [4])

According to the second law of thermodynamics, the entropy, S, of a system in equilibrium is maximum for a given (total) energy, E. The numbers N_k (or w_k ≡ N_k/N) 14 are thus obtained by means of the variation of N_k (or w_k) in the entropy S, in the condition for the energy of the oscillators (see Appendix B and cf. (1877-2) above), and in the condition for the total number of energy quanta (as in (1877-1) above). This means [4, p. 141] a constrained maximization; the result, inserted into (1913-173), yields the entropy of the system of oscillators as (1913-222). The thermodynamic relation 1/T = ∂S/∂U leads to Planck's distribution law, now including the zero-point energy. 15 Hence, Planck obtains his quantum distribution law also through using Boltzmann's "classical," though discrete, "permutability measure" (1877-3), Boltzmann's discrete energy spectrum being actually a quantum spectrum. Boltzmann investigated (31a), (31b), and (31c) with finite sums (finite values of p and λ) and set p → ∞ only at the end. He did not arrive at Planck's formulae (1913-220, 221), but at ". . . the probability of having a kinetic energy kε," given by the exponential of −kε divided by the mean energy of a molecule (p. 1986). This became "the eponymous Boltzmann distribution" [2, p. 1972]. Finally, Boltzmann considered his discrete model not to be physically relevant.
The number of complexions at thermodynamic equilibrium is very much larger than the number of complexions at nonequilibrium. Therefore, the number of all possible complexions is a good approximation to the number of complexions at thermodynamic equilibrium and thus to the maximum "thermodynamic probability," W_max. The total number of all possible complexions can be calculated much more easily and directly than the equilibrium value. (See formulae (1877-3), (20), and (1901-5) above.) A complexion is a definite assignment of every individual oscillator to a quantum cell in phase space defined by (1913-210) (see Appendix B). 16 The constraint (1913-219) can be written accordingly, 17 where P is the total number of quanta hν. The number of complexions equals the number of possibilities to distribute these P quanta over the N oscillators through varying the numbers P_k. This task represents "'combinations with repetitions of N elements taken P at a time,' whose total number is" [4, p. 146] 18

(N + P − 1)! / (P! (N − 1)!). (1913-232)

The entropy thus equals, by Stirling's formula, S = k[(N + P) ln(N + P) − N ln N − P ln P]. If one replaces P here according to (1913-231), this agrees exactly with formula (1913-222). 19 Hence, Planck obtains his quantum distribution law also through using his "quantum" "permutability measure" (20) and the same discrete energy spectrum as before. And instead of varying the occupation numbers for maximizing the entropy, S, he replaces P according to the energy conservation condition.
In all three variants, Planck uses, as Boltzmann at the beginning, a discrete energy spectrum. Therefore, let us finally look at Einstein's 1907 derivation of Planck's radiation law. Here, the difference between classical and quantum results is immediately connected with the energy spectrum being continuous or discrete, respectively.

Einstein's 1907 Derivation of Planck's Radiation Law
5.1. Einstein's Probability. Einstein [16] considers a system of molecules, the state of which is determined by the (very many) variables p_1, p_2, ..., p_n. The equations of motion are (I use the index i instead of ν) . . . with 21 . Further, there is a subsystem of that system characterized by the variables p_1, p_2, ..., p_m, a subset of the p_i. The energy of the whole system is approximately the sum of one part, E, which depends solely on p_1, ..., p_m, and a second part, which is independent of them. E, the first part, is much smaller than the whole energy.
The probability for p_1, ..., p_m to lie at some time in the infinitesimal domain dp_1 dp_2 ⋯ dp_m equals (I set R/N = k)

dW = C e^(−E/kT) dp_1 dp_2 ⋯ dp_m.

Then, Einstein assumes that this formula can be written as

dW = C e^(−E/kT) ω(E) dE.

Here, the function ω(E) is defined such that the integration runs over all values of the p_i that correspond to energy values between E and E + dE.
Einstein obtained Planck's formula in the following way. Instead of 0 ≤ E < ∞ and ω(E) = const, he assumes that the energy of the oscillator is restricted to the values 0, ε, 2ε, and so forth. Then, the mean energy of the oscillator becomes

Ē = Σ_n nε e^(−nε/kT) / Σ_n e^(−nε/kT) = ε / (e^(ε/kT) − 1).

With ε = hν, this is Planck's result (24).
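Einstein's step, replacing the continuous spectrum by the discrete values 0, ε, 2ε, ..., can be reproduced numerically. The sketch below works in units with k = 1; the values of ε and T are illustrative, not from the paper:

```python
import math

# Average energy of an oscillator whose energies are restricted to
# 0, eps, 2*eps, ..., weighted with the Boltzmann factor exp(-E/T).
def mean_energy_discrete(eps, T, nmax=2000):
    weights = [math.exp(-n * eps / T) for n in range(nmax)]
    return sum(n * eps * w for n, w in enumerate(weights)) / sum(weights)

eps, T = 1.0, 0.5
planck = eps / (math.exp(eps / T) - 1.0)   # Planck's mean energy (24)
assert abs(mean_energy_discrete(eps, T) - planck) < 1e-9

# Classical limit: for T >> eps the discrete average approaches the
# equipartition value kT of the continuous spectrum.
print(mean_energy_discrete(eps, 50.0))     # close to kT = 50
```

The truncated geometric sums reproduce ε/(e^(ε/kT) − 1) exactly; for T ≫ ε the classical equipartition value kT is recovered, while for T ≪ ε the mean energy is exponentially suppressed.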
The simplest imagination about solids is that they are made of atoms or ions, which vibrate around their equilibrium positions like three-dimensional oscillators.If electrically charged, they interact with the electromagnetic radiation like Planck's resonators.Compatibility with Planck's radiation law implies their energy spectrum to be that of Planck's resonators; see the following.

Specific Heat of Solids.
On the other hand, the vibrating ions carry the heat energy of such a model solid. Then, the classical result (E-4) yields for the specific heat the value c = 3Nk, where N is the number of atoms/ions in the solid, in agreement with the Dulong-Petit rule.
In contrast, the nonclassical result (E-7) yields the formula (E-8). The specific heat is no longer temperature-independent but decreases exponentially for temperatures that are small against hν/k. Formula (E-8) is in very good agreement with the then available experimental data for diamond 22 . In passing, I notice that this brought him the attention of Nernst, who became one of the key persons in getting Einstein to Berlin.
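The two limits just described can be checked with a few lines. The sketch gives the specific heat per oscillator degree of freedom in units of k; the abbreviation x = hν/(kT) is mine:

```python
import math

# Einstein specific heat per degree of freedom, in units of k:
# c(x) = x^2 e^x / (e^x - 1)^2, with x = h*nu/(k*T).
def c_einstein(x):
    return x * x * math.exp(x) / (math.exp(x) - 1.0) ** 2

# High temperature (x -> 0): the Dulong-Petit/equipartition value 1 (i.e. k).
assert abs(c_einstein(1e-4) - 1.0) < 1e-6
# Low temperature (x >> 1): exponential suppression, as stated in the text.
print(c_einstein(10.0))   # far below the classical value
```

At high temperature the classical Dulong-Petit value is recovered; at low temperature the specific heat is exponentially small, which is what distinguished Einstein's formula from the classical, temperature-independent result.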

Quantization as Selection Problem

Before concluding this paper, let me point to the following crucial conclusion by Einstein.

. . . we are now compelled, for vibratory ions of a certain frequency, which can mediate an energy exchange between matter and radiation, to make the assumption, that the mannifold [sic] of states they are able to assume is a smaller one when compared with the bodies of our experience. (p. 184)
This defines quantization as the problem of "selecting" [26, 27, Ch. I, § 15] the set of quantum states "out of" the set of classical states.
(ii) Bohr [28, eq. (9)] has selected a discrete set of energies, too, namely for the assumed circular motion of the electron in the H-atom.
(iii) The additional Sommerfeld-Wilson quantization condition [29, 30] selects discrete values for the action of the radial motion of that electron taken over the period, T, in order to quantize elliptical orbits. More generally, this condition is postulated for multiply periodic motions (i = 1, 2, . . .), where q_i and J_i are appropriate angle-action variables.
(iv) De Broglie [31] assigns to each particle with momentum p the wavelength λ = h/p and selects the "resonant" values, λ = L/n, L being the length of the periodic orbit.
All these examples stem from the "old" quantum theory. Within the "new" quantum theory, beginning with Heisenberg's [32] "matrix mechanics" and Schrödinger's [33] "wave mechanics," there seems to be no place for a quantum selection condition. Schrödinger speaks about the quantum conditions as selection conditions [33, pp. 510f.]. But they are replaced with the "for a physical quantity almost self-evident requirement of the [wave] function, ψ, to be unique, finite and continuous" (Ibid., p. 511). Accordingly, he calls the corresponding boundary conditions "natural" and "intrinsic" to the wave equation (Ibid., p. 512). Nevertheless, an eigenvalue problem represents "classical mathematics" for a classical problem, namely, the vibrations of strings, resonators, and the like (notably, standing waves). However, it is possible to solve the stationary Schrödinger equation as a selection rather than an eigenvalue problem [34, 35]. Thus, Planck and Einstein have selected the right subsets, while Boltzmann did not.

Summary and Conclusions
Boltzmann [1] starts from a correct probabilistic setting of the dispersion of energy quanta over molecules, which can lead to the Planck distribution [4, 14, 15, 19, 20]. He does not arrive at the Planck (and, subsequently, Bose-Einstein) distribution, because he finally considered his discrete model to be not physically relevant. Moreover, he obtained expressions which do not correspond to equilibrium distributions (p. 2003). This could be due to various confusions being common for pioneering work that goes so far ahead. 23 Boltzmann [1] did not succeed with a discrete model of matter and energies for the entropy, but he obtained the all-important Boltzmann factor. Without it, there is no partition function, and Einstein could not have presented his 1907 calculations in such a short form.

A. Permutation Invariance of Newtonian State Quantities: Avoidance of Gibbs' Paradox
According to the definitions and axioms in Newton's Principia, the state of a body (here always considered as being pointlike) is given by its momentum (vector) (cf. [43], § 1). It is conserved as long as no external force is acting upon it. For a system of two bodies, which interact at most with each other, the total momentum is conserved (Newton's 3rd axiom). The state of the system is thus described by that total momentum. It is not changed when the two bodies interchange their momenta. If the two bodies are equal in their mechanical properties, the mechanical properties of the system are not changed if they are interchanged. The generalization to a gas is straightforward. Another conserved quantity for a free body and for an isolated system of bodies is the (total) kinetic energy (envisaged by Leibniz in the form of the "living force").
Hence, generally speaking, in the sense of Newton, the state of a system is described by a complete set of independent conserved quantities, such as total momentum, angular momentum, and energy.In contrast, the positions of the bodies do not enter the state description.
The conserved quantities are not affected when equal bodies are interchanged ("equal" means equal mass, etc., cf. [44]). As a consequence, the Newtonian state quantities are permutation invariant (cf. [17], Remark 2.1.2b, p. 15). This invariance has implicitly been used by Boltzmann and by Planck.
Thus, "identical" should not refer to the (im)possibility of identification, but to the actual impact of interchange on the behaviour of a system.A striking example is provided by the red balls in a snooker game.They are identifiable through their positions on the table; nevertheless, the interchange of two of them does not affect the outcome of the game.
As Feller put it, "Whether or not actual balls are in practice distinguishable is irrelevant for our theory. Even if they are, we may decide to treat them as indistinguishable" ([45, p. 12]; quoted after [17, p. 139]).
Generally speaking, identifiability or distinguishability is not a property of bodies (particles), but of states [17, p. 8]. Accordingly, Bach calls identical particles all those which have got one and the same intrinsic, that is, state-independent, properties (Definition 2.1.1, p. 15). This is Helmholtz's notion of "equal particles" (loc. cit.).
Hence, the confusion in understanding these terms, as stressed by Jaynes [8], arises largely from a confusion of the notions "equal," "identical," and "interchangeable." (This resembles the history of the notions "force" and "energy.") Their correct use avoids Gibbs' paradox automatically.

B. Average Energy of a Quantum Oscillator [4]
In his lectures on heat radiation, Planck [4, Pt. III, Ch. I] follows Boltzmann [1] in the probabilistic treatment of ideal gases. For oscillators interacting with electromagnetic radiation, however, he proceeds completely differently (Ibid., Pt. III, Ch. III). For the reader's convenience, the calculation of the mean energy is sketched here.
The (total) energy of a linear harmonic oscillator is written in the form of a sum of potential and kinetic parts (Planck's notation), where f and m are the appropriate force constant and mass, respectively. The average energy of all oscillators in the kth phase-space region equals the number of oscillators in this region, N_k, times the average energy of an oscillator in this region:

Disclosure
This paper is taken from a special course on statistical mechanics held at the Kazakh National Pedagogical Abai University, Almaty, in 2015.

Table 1 :
The 15 "state distributions" for the case n = 7, λ = 7, p = 7. "The state distributions are so arranged that, read as a number, the rows are arranged in increasing order" (p. 1978; I have added the labels of the configuration numbers, k_1, ..., k_7).

Planck's Manner of State Counting for Resonators/Oscillators

Planck numbers the cells from 1 to K and writes P_k for the occupation numbers. This elegant form yields immediately the entropy, S = k ln W.

4.1. Planck's Radiation Formula I. Planck used the rather exotic 10 quantity [19, eq. (11)]. 9 (Thus, for classical gases, Planck obtains essentially the same result as Boltzmann, where the cells are now finite domains of the volume into which the gas molecules are enclosed. What, then, is the difference to the quantum case?)

4.4. Entropy and Energy Density. The value of R_tot depends on the set {E, E′, E″, . . .}. The corresponding entropy is, up to a constant, S_tot = k ln R_tot = k (ln R + ln R′ + ln R″ + ⋯).