Cumulant
In probability theory and statistics, the cumulants of a probability distribution are a set of quantities that provide an alternative to the moments of the distribution. The moments determine the cumulants in the sense that any two probability distributions whose moments are identical will have identical cumulants as well, and similarly the cumulants determine the moments.
The first cumulant is the mean, the second cumulant is the variance, and the third cumulant is the same as the third central moment. But fourth and higher-order cumulants are not equal to central moments. In some cases theoretical treatments of problems in terms of cumulants are simpler than those using moments. In particular, when two or more random variables are statistically independent, the n-th-order cumulant of their sum is equal to the sum of their n-th-order cumulants. Moreover, the third and higher-order cumulants of a normal distribution are zero, and it is the only distribution with this property.
Just as for moments, where joint moments are used for collections of random variables, it is possible to define joint cumulants.
Definition
The cumulants κn of a random variable X are defined using the cumulant-generating function K(t), which is the natural logarithm of the moment-generating function:

K(t) = log E[e^(tX)].

The cumulants are obtained from a power series expansion of the cumulant generating function:

K(t) = κ1·t + κ2·t²/2! + κ3·t³/3! + ⋯ = Σn≥1 κn·tⁿ/n!.
This expansion is a Maclaurin series, so the n-th cumulant can be obtained by differentiating the above expansion n times and evaluating the result at zero:

κn = K^(n)(0).
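For illustration, here is a minimal sketch using sympy and the cgf of the normal distribution, K(t) = μt + σ²t²/2 (derived in the section on continuous distributions below); differentiating n times at zero recovers the cumulants:

```python
import sympy as sp

t, mu, sigma = sp.symbols('t mu sigma', real=True)
K = mu * t + sigma**2 * t**2 / 2          # normal cgf (see continuous distributions below)

# kappa_n = n-th derivative of K evaluated at t = 0
kappas = [sp.diff(K, t, n).subs(t, 0) for n in range(1, 6)]
print(kappas)                             # [mu, sigma**2, 0, 0, 0]
```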
If the moment-generating function does not exist, the cumulants can be defined in terms of the relationship between cumulants and moments discussed later.
Alternative definition of the cumulant generating function
Some writers prefer to define the cumulant-generating function as the natural logarithm of the characteristic function, which is sometimes also called the second characteristic function:

H(t) = log E[e^(itX)] = K(it).

An advantage of H(t) (in some sense the function K(t) evaluated for purely imaginary arguments) is that E[e^(itX)] is well defined for all real values of t, even when E[e^(tX)] is not well defined for all real values of t, such as can occur when there is "too much" probability that X has a large magnitude. Although the function H(t) will be well defined, it will nonetheless mimic K(t) in terms of the length of its Maclaurin series, which may not extend beyond linear order in the argument t, and in particular the number of cumulants that are well defined will not change. Nevertheless, even when H(t) does not have a long Maclaurin series, it can be used directly in analyzing and, particularly, adding random variables. Both the Cauchy distribution and, more generally, stable distributions are examples of distributions for which the power-series expansions of the generating functions have only finitely many well-defined terms.
Uses in statistics
Working with cumulants can have an advantage over using moments because for statistically independent random variables X and Y,

K_{X+Y}(t) = log E[e^(t(X+Y))] = log(E[e^(tX)]·E[e^(tY)]) = K_X(t) + K_Y(t),

so that each cumulant of a sum of independent random variables is the sum of the corresponding cumulants of the addends. That is, when the addends are statistically independent, the mean of the sum is the sum of the means, the variance of the sum is the sum of the variances, the third cumulant of the sum is the sum of the third cumulants, and so on for each order of cumulant.
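This additivity can be checked numerically with scipy's k-statistics (unbiased cumulant estimators); the distributions, sample size, and seed below are illustrative assumptions:

```python
import numpy as np
from scipy.stats import kstat

rng = np.random.default_rng(0)
x = rng.exponential(scale=2.0, size=200_000)
y = rng.gamma(shape=3.0, scale=1.0, size=200_000)   # independent of x

for n in (1, 2, 3, 4):
    lhs = kstat(x + y, n)                 # cumulant estimate for the sum
    rhs = kstat(x, n) + kstat(y, n)       # sum of the individual estimates
    print(n, round(lhs, 3), round(rhs, 3))   # agree up to sampling error
```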
A distribution with given cumulants can be approximated through an Edgeworth series.
Cumulants of some discrete probability distributions
- The constant random variables X = μ. The cumulant generating function is K(t) = μt. The first cumulant is κ1 = K′(0) = μ and the other cumulants are zero, κ2 = κ3 = ⋯ = 0.
- The Bernoulli distributions (number of successes in one trial with probability p of success). The cumulant generating function is K(t) = log(1 − p + p·e^t). The first cumulants are κ1 = K′(0) = p and κ2 = K″(0) = p·(1 − p). The cumulants satisfy the recursion formula κn+1 = p·(1 − p)·dκn/dp; a symbolic check appears after this list.
- The geometric distributions (number of failures before one success, with probability p of success on each trial). The cumulant generating function is K(t) = log(p / (1 − (1 − p)·e^t)). The first cumulants are κ1 = K′(0) = p⁻¹ − 1 and κ2 = K″(0) = κ1·p⁻¹. Substituting p = (μ + 1)⁻¹ gives κ1 = μ and κ2 = μ·(μ + 1).
- The Poisson distributions. The cumulant generating function is K(t) = μ·(e^t − 1). All cumulants are equal to the parameter: κ1 = κ2 = κ3 = ⋯ = μ.
- The binomial distributions (number of successes in n independent trials with probability p of success on each trial). The special case n = 1 is a Bernoulli distribution. Every cumulant is just n times the corresponding cumulant of the corresponding Bernoulli distribution. The cumulant generating function is K(t) = n·log(1 − p + p·e^t). The first cumulants are κ1 = K′(0) = np and κ2 = K″(0) = κ1·(1 − p). Substituting p = μ·n⁻¹ gives κ1 = μ and κ2 = μ·(1 − μ/n). The limiting case n⁻¹ = 0 is a Poisson distribution.
- The negative binomial distributions (number of failures before r successes, with probability p of success on each trial). The special case r = 1 is a geometric distribution. Every cumulant is just r times the corresponding cumulant of the corresponding geometric distribution. The derivative of the cumulant generating function is K′(t) = r·((1 − p)⁻¹·e^(−t) − 1)⁻¹. The first cumulants are κ1 = K′(0) = r·(p⁻¹ − 1) and κ2 = K″(0) = κ1·p⁻¹. Substituting p = (μ·r⁻¹ + 1)⁻¹ gives κ1 = μ and κ2 = μ·(μ/r + 1). Comparing these formulas to those of the binomial distributions explains the name 'negative binomial distribution'. The limiting case r⁻¹ = 0 is a Poisson distribution.
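The symbolic check promised in the Bernoulli item above, a sketch using sympy; the third cumulant p(1 − p)(1 − 2p) also follows from the recursion formula stated there:

```python
import sympy as sp

t, p = sp.symbols('t p', positive=True)
K = sp.log(1 - p + p * sp.exp(t))          # Bernoulli cumulant generating function

for n in (1, 2, 3):
    kappa_n = sp.factor(sp.diff(K, t, n).subs(t, 0))
    print(n, kappa_n)
# n = 1: p
# n = 2: p*(1 - p)             (up to how sympy arranges the sign of the factors)
# n = 3: p*(1 - p)*(1 - 2*p)   (likewise)
```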
Introducing the variance-to-mean ratio ε = σ²/μ = κ2/κ1, the above probability distributions get a unified formula for the derivative of the cumulant generating function:

K′(t) = μ / (1 + ε·(e^(−t) − 1)).
The second derivative is

K″(t) = K′(t) / (1 + e^t·(ε⁻¹ − 1)),
confirming that the first cumulant is κ1 = K′(0) = μ and the second cumulant is κ2 = K″(0) = με. The constant random variables X = μ have ε = 0. The binomial distributions have ε = 1 − p, so that 0 < ε < 1. The Poisson distributions have ε = 1. The negative binomial distributions have ε = p⁻¹, so that ε > 1. Note the analogy to the classification of conic sections by eccentricity: circles ε = 0, ellipses 0 < ε < 1, parabolas ε = 1, hyperbolas ε > 1.
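The unified formula can be checked symbolically against any of the cases above; a sketch (using sympy) for the binomial case, where μ = np and ε = 1 − p:

```python
import sympy as sp

t, n, p = sp.symbols('t n p', positive=True)
mu, eps = n * p, 1 - p                        # binomial mean and variance-to-mean ratio

K_binomial = n * sp.log(1 - p + p * sp.exp(t))
K_unified_deriv = mu / (1 + eps * (sp.exp(-t) - 1))

print(sp.simplify(sp.diff(K_binomial, t) - K_unified_deriv))   # 0
```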
Cumulants of some continuous probability distributions
- For the normal distribution with expected value μ and variance σ², the cumulant generating function is K(t) = μt + σ²t²/2. The first and second derivatives of the cumulant generating function are K′(t) = μ + σ²·t and K″(t) = σ². The cumulants are κ1 = μ, κ2 = σ², and κ3 = κ4 = ⋯ = 0. The special case σ² = 0 is a constant random variable X = μ.
- The cumulants of the uniform distribution on the interval [−1, 0] are κn = Bn/n, where Bn is the n-th Bernoulli number.
- The cumulants of the exponential distribution with rate parameter λ are κn = λ⁻ⁿ·(n − 1)!; a symbolic check appears after this list.
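The promised sketch: a symbolic verification (with sympy) of the exponential-distribution cumulants from its cgf K(t) = −log(1 − t/λ):

```python
import sympy as sp

t, lam = sp.symbols('t lam', positive=True)
K = -sp.log(1 - t / lam)       # cgf of the exponential distribution with rate lam

for n in range(1, 6):
    kappa_n = sp.diff(K, t, n).subs(t, 0)
    assert sp.simplify(kappa_n - sp.factorial(n - 1) / lam**n) == 0
    print(n, kappa_n)          # (n-1)!/lam**n: 1/lam, 1/lam**2, 2/lam**3, ...
```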
Some properties of the cumulant generating function
The cumulant generating function exists if and only if the tails of the distribution are majorized by an exponential decay, that is,

F(x) = O(e^(cx)) as x → −∞, and 1 − F(x) = O(e^(−dx)) as x → +∞, for some c > 0 and d > 0,

where F is the cumulative distribution function. The cumulant-generating function will have vertical asymptotes at the negative of the supremum of such c, if such a supremum exists, and at the supremum of such d, if such a supremum exists; otherwise it will be defined for all real numbers.
If the support of a random variable X has finite upper or lower bounds, then its cumulant-generating function y = K(t), if it exists, approaches asymptotes whose slopes are equal to the supremum and/or infimum of the support, respectively, lying above both of these lines everywhere.
For a shift of the distribution by c, K_{X+c}(t) = K_X(t) + ct. For a degenerate point mass at c, the cgf is the straight line K(t) = ct, and more generally, K_{X+Y} = K_X + K_Y if and only if X and Y are independent and their cgfs exist.
The natural exponential family of a distribution may be realized by shifting or translating K(t), and adjusting it vertically so that it always passes through the origin: if f is the pdf with cgf K(t) = log M(t), and f|θ is its natural exponential family, then

f|θ(x) = e^(θx)·f(x)/M(θ), and K|θ(t) = K(t + θ) − K(θ).
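For instance (a sympy sketch), tilting the normal cgf K(t) = μt + σ²t²/2 by θ shifts the mean to μ + σ²θ while leaving the variance unchanged, consistent with K|θ(t) = K(t + θ) − K(θ):

```python
import sympy as sp

t, theta, mu, sigma = sp.symbols('t theta mu sigma', real=True)
K = mu * t + sigma**2 * t**2 / 2                   # normal cgf

K_tilted = sp.expand(K.subs(t, t + theta) - K.subs(t, theta))
print(K_tilted)   # mu*t + sigma**2*t*theta + sigma**2*t**2/2,
                  # i.e. a normal cgf with mean mu + sigma**2*theta
```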
If K(t) is finite for a range t1 < Re(t) < t2, and if t1 < 0 < t2, then K is analytic and infinitely differentiable for t1 < Re(t) < t2. Moreover, for t real and t1 < t < t2, K is strictly convex and K′ is strictly increasing.
Some properties of cumulants
Invariance and equivariance
The first cumulant is shift-equivariant; all of the others are shift-invariant. This means that, if we denote by κn(X) the n-th cumulant of the probability distribution of the random variable X, then for any constant c:

κ1(X + c) = κ1(X) + c and κn(X + c) = κn(X) for n ≥ 2.

Homogeneity
The n-th cumulant is homogeneous of degree n, i.e. if c is any constant, then κn(cX) = cⁿ·κn(X).

Additivity
If X and Y are independent random variables then κn(X + Y) = κn(X) + κn(Y).

A negative result
Given the results for the cumulants of the normal distribution, it might be hoped to find families of distributions for which κm = κm+1 = ⋯ = 0 for some m > 3, with the lower-order cumulants (orders 3 through m − 1) being non-zero. There are no such distributions. The underlying result here is that the cumulant generating function cannot be a finite-order polynomial of degree greater than 2.
Cumulants and moments
The moment generating function is given by:

M(t) = 1 + Σn≥1 μ′n·tⁿ/n! = E[e^(tX)],

so the cumulant generating function is the logarithm of the moment generating function:

K(t) = log M(t).
The first cumulant is the expected value; the second and third cumulants are respectively the second and third central moments (the second central moment is the variance); but the higher cumulants are neither moments nor central moments, but rather more complicated polynomial functions of the moments.
The moments can be recovered in terms of cumulants by evaluating the n-th derivative of e^(K(t)) at t = 0:

μ′n = dⁿ/dtⁿ [e^(K(t))] evaluated at t = 0.
Likewise, the cumulants can be recovered in terms of moments by evaluating the n-th derivative of log M(t) at t = 0:

κn = dⁿ/dtⁿ [log M(t)] evaluated at t = 0.
The explicit expression for the n-th moment in terms of the first n cumulants, and vice versa, can be obtained by using Faà di Bruno's formula for higher derivatives of composite functions. In general, we have

μ′n = Σk=1..n Bn,k(κ1, ..., κn−k+1) and κn = Σk=1..n (−1)^(k−1)·(k − 1)!·Bn,k(μ′1, ..., μ′n−k+1),
where Bn,k are incomplete Bell polynomials.
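sympy exposes incomplete Bell polynomials directly, so the moment formula can be checked symbolically; a sketch for n = 4 (bell(n, k, symbols) is sympy's incomplete Bell polynomial):

```python
import sympy as sp

kappa = sp.symbols('kappa1:5')      # kappa1, ..., kappa4
n = 4
mu4 = sum(sp.bell(n, k, kappa) for k in range(1, n + 1))
print(sp.expand(mu4))
# kappa1**4 + 6*kappa1**2*kappa2 + 4*kappa1*kappa3 + 3*kappa2**2 + kappa4
```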
In the like manner, if the mean is given by μ = E[X], the central moment generating function is given by

C(t) = E[e^(t(X−μ))] = e^(−μt)·M(t) = exp(K(t) − μt),
and the n-th central moment is obtained in terms of cumulants as

μn = C^(n)(0) = Σk=1..n Bn,k(0, κ2, ..., κn−k+1).
Also, for n > 1, the n-th cumulant in terms of the central moments is

κn = Σk=1..n (−1)^(k−1)·(k − 1)!·Bn,k(0, μ2, ..., μn−k+1).
The n-th moment μ′n is an n-th-degree polynomial in the first n cumulants. The first few expressions are:

μ′1 = κ1
μ′2 = κ2 + κ1²
μ′3 = κ3 + 3κ2κ1 + κ1³
μ′4 = κ4 + 4κ3κ1 + 3κ2² + 6κ2κ1² + κ1⁴
μ′5 = κ5 + 5κ4κ1 + 10κ3κ2 + 10κ3κ1² + 15κ2²κ1 + 10κ2κ1³ + κ1⁵
The "prime" distinguishes the moments μ′n from the central moments μn. To express the central moments as functions of the cumulants, just drop from these polynomials all terms in which κ1 appears as a factor:
Similarly, the n-th cumulant κn is an n-th-degree polynomial in the first n non-central moments. The first few expressions are:

κ1 = μ′1
κ2 = μ′2 − μ′1²
κ3 = μ′3 − 3μ′2μ′1 + 2μ′1³
κ4 = μ′4 − 4μ′3μ′1 − 3μ′2² + 12μ′2μ′1² − 6μ′1⁴
To express the cumulants κn for n > 1 as functions of the central moments, drop from these polynomials all terms in which μ′1 appears as a factor:

κ2 = μ2
κ3 = μ3
κ4 = μ4 − 3μ2²
κ5 = μ5 − 10μ3μ2
To express the cumulants κn for n > 2 as functions of the standardized central moments, also set μ2 = 1 in the polynomials:

κ3 = μ3
κ4 = μ4 − 3
κ5 = μ5 − 10μ3
The cumulants are also related to the moments by the following recursion formula:

κn = μ′n − Σm=1..n−1 (n−1 choose m−1)·κm·μ′n−m.
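The recursion translates directly into code; a minimal sketch in plain Python that recovers cumulants from raw moments:

```python
from math import comb

def cumulants_from_moments(m):
    """kappa_n = mu'_n - sum_{k=1}^{n-1} C(n-1, k-1) * kappa_k * mu'_{n-k}.
    m[n] holds the raw moment mu'_n for n = 1..N; m[0] is unused."""
    N = len(m) - 1
    kappa = [0] * (N + 1)
    for n in range(1, N + 1):
        kappa[n] = m[n] - sum(comb(n - 1, k - 1) * kappa[k] * m[n - k]
                              for k in range(1, n))
    return kappa

# Raw moments of a Poisson(1) variable are the Bell numbers 1, 2, 5, 15, 52;
# every cumulant of a Poisson distribution equals its mean, here 1.
print(cumulants_from_moments([0, 1, 2, 5, 15, 52]))   # [0, 1, 1, 1, 1, 1]
```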
Cumulants and set-partitions
These polynomials have a remarkable combinatorial interpretation: the coefficients count certain partitions of sets (a brute-force evaluation of the sum appears after the list below). A general form of these polynomials is

μ′n = Σπ ΠB∈π κ|B|,

where
- π runs through the list of all partitions of a set of size n;
- "B ∈ π" means B is one of the "blocks" into which the set is partitioned; and
- |B| is the size of the set B.
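This partition sum is easy to evaluate by brute force; a sketch using sympy's partition generator (choosing all cumulants equal to 1, the Poisson case with mean 1, makes each partition contribute 1, so the moments are the Bell numbers, as noted later in this article):

```python
from sympy.utilities.iterables import multiset_partitions

def moment_from_cumulants(n, kappa):
    """mu'_n = sum over all set partitions pi of {1,...,n} of
    the product over blocks B of kappa[|B|]."""
    total = 0
    for partition in multiset_partitions(list(range(n))):
        prod = 1
        for block in partition:
            prod *= kappa[len(block)]
        total += prod
    return total

kappa = {k: 1 for k in range(1, 7)}     # all cumulants 1: Poisson with mean 1
print([moment_from_cumulants(n, kappa) for n in range(1, 7)])  # [1, 2, 5, 15, 52, 203]
```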
Cumulants and combinatorics
Further connections between cumulants and combinatorics can be found in the work of Gian-Carlo Rota, where links to invariant theory, symmetric functions, and binomial sequences are studied via umbral calculus.

Joint cumulants
The joint cumulant κ(X1, ..., Xn) of several random variables X1, ..., Xn is defined by a similar cumulant generating function

K(t1, ..., tn) = log E[e^(t1X1 + ⋯ + tnXn)].

A consequence is that

κ(X1, ..., Xn) = Σπ (|π| − 1)!·(−1)^(|π|−1)·ΠB∈π E[Πi∈B Xi],
where π runs through the list of all partitions of {1, ..., n}, B runs through the list of all blocks of the partition π, and |π| is the number of parts in the partition. For example,

κ(X, Y) = E[XY] − E[X]·E[Y], and
κ(X, Y, Z) = E[XYZ] − E[XY]·E[Z] − E[XZ]·E[Y] − E[YZ]·E[X] + 2·E[X]·E[Y]·E[Z].
If any of these random variables are identical, e.g. if X = Y, then the same formulae apply, e.g.

κ(X, X, Z) = E[X²Z] − 2·E[XZ]·E[X] − E[X²]·E[Z] + 2·E[X]²·E[Z],
although for such repeated variables there are more concise formulae. For zero-mean random vectors,

κ(X, Y) = E[XY],
κ(X, Y, Z) = E[XYZ], and
κ(X, Y, Z, W) = E[XYZW] − E[XY]·E[ZW] − E[XZ]·E[YW] − E[XW]·E[YZ].
The joint cumulant of just one random variable is its expected value, and that of two random variables is their covariance. If some of the random variables are independent of all of the others, then any cumulant involving two (or more) independent random variables is zero. If all n random variables are the same, then the joint cumulant is the n-th ordinary cumulant.
The combinatorial meaning of the expression of moments in terms of cumulants is easier to understand than that of cumulants in terms of moments:

E[X1 ⋯ Xn] = Σπ ΠB∈π κ(Xi : i ∈ B).
For example:

E[XYZ] = κ(X, Y, Z) + κ(X, Y)·κ(Z) + κ(X, Z)·κ(Y) + κ(Y, Z)·κ(X) + κ(X)·κ(Y)·κ(Z).
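The partition formula for joint cumulants can be used verbatim as an estimator by replacing each expectation with a sample mean; a sketch (the sample sizes and distributions are illustrative assumptions):

```python
import numpy as np
from math import factorial
from sympy.utilities.iterables import multiset_partitions

def joint_cumulant(*xs):
    """kappa(X1,...,Xn) = sum over partitions pi of
    (|pi| - 1)! * (-1)**(|pi| - 1) * prod over blocks B of E[prod_{i in B} X_i],
    with each expectation replaced by an empirical mean."""
    n = len(xs)
    total = 0.0
    for partition in multiset_partitions(list(range(n))):
        b = len(partition)
        prod = 1.0
        for block in partition:
            prod *= np.mean(np.prod([xs[i] for i in block], axis=0))
        total += factorial(b - 1) * (-1) ** (b - 1) * prod
    return total

rng = np.random.default_rng(1)
x = rng.normal(size=500_000)
y = rng.normal(size=500_000)              # independent of x

print(round(joint_cumulant(x, y), 4))     # near 0: cumulant over independent variables
print(round(joint_cumulant(x, x), 4))     # near 1: kappa(X, X) is the variance of X
```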
Another important property of joint cumulants is multilinearity:

κ(X + Y, Z1, Z2, ...) = κ(X, Z1, Z2, ...) + κ(Y, Z1, Z2, ...).
Just as the second cumulant is the variance, the joint cumulant of just two random variables is the covariance. The familiar identity

var(X + Y) = var(X) + 2·cov(X, Y) + var(Y)

generalizes to cumulants:

κn(X + Y) = Σj=0..n (n choose j)·κ(X, ..., X, Y, ..., Y), with j copies of X and n − j copies of Y in the j-th term.
Conditional cumulants and the law of total cumulance
The law of total expectation and the law of total variance generalize naturally to conditional cumulants. The case n = 3, expressed in the language of (central) moments rather than that of cumulants, says

μ3(X) = E[μ3(X | Y)] + μ3(E[X | Y]) + 3·cov(E[X | Y], var(X | Y)).

In general,

κ(X1, ..., Xn) = Σπ κ(κ(XB1 | Y), ..., κ(XBb | Y)),
where
- the sum is over all partitions π of the set {1, ..., n} of indices, and
- B1, ..., Bb are all of the "blocks" of the partition π; the expression κ(XBj | Y) indicates the joint cumulant of the random variables whose indices are in that block of the partition.
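The n = 2 case is the familiar law of total variance, var(X) = E[var(X | Y)] + var(E[X | Y]); a quick numerical sketch (assuming, purely for illustration, X | Y ~ Poisson(Y) with Y gamma-distributed, so that E[X | Y] = var(X | Y) = Y):

```python
import numpy as np

rng = np.random.default_rng(2)
y = rng.gamma(shape=3.0, scale=1.0, size=10**6)   # the conditioning variable Y
x = rng.poisson(y)                                # X | Y ~ Poisson(Y)

lhs = x.var()                  # var(X)
rhs = y.mean() + y.var()       # E[var(X|Y)] + var(E[X|Y]); both conditional pieces equal Y
print(round(lhs, 3), round(rhs, 3))               # agree up to sampling error, about 6.0
```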
Relation to statistical physics
A system in equilibrium with a thermal bath at temperature T has a fluctuating internal energy E, which can be considered a random variable drawn from a distribution. The partition function of the system is

Z(β) = Σi e^(−βEi),

where β = 1/(kT), k is Boltzmann's constant, and the notation ⟨A⟩ has been used rather than E[A] for the expectation value, to avoid confusion with the energy E. Hence the first and second cumulants of the energy give the average energy and the heat capacity:

⟨E⟩ = −∂(log Z)/∂β and ⟨(E − ⟨E⟩)²⟩ = ∂²(log Z)/∂β² = kT²·C,

where C is the heat capacity.
The Helmholtz free energy expressed in terms of Z,

F(β) = −β⁻¹·log Z(β),

further connects thermodynamic quantities with the cumulant generating function for the energy. Thermodynamic properties that are derivatives of the free energy, such as the internal energy, entropy, and specific heat capacity, all can be readily expressed in terms of these cumulants. Other free energies can be functions of other variables such as the magnetic field or chemical potential μ, e.g.
Ω = −β⁻¹·log⟨e^(−βE + βμN)⟩,

where N is the number of particles and Ω is the grand potential. Again the close relationship between the definition of the free energy and the cumulant generating function implies that various derivatives of this free energy can be written in terms of joint cumulants of E and N.
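A sketch for a two-level system (energy levels 0 and ε, an illustrative assumption), showing the first two energy cumulants emerging as derivatives of log Z:

```python
import sympy as sp

beta, eps = sp.symbols('beta epsilon', positive=True)
Z = 1 + sp.exp(-beta * eps)          # partition function: energy levels 0 and epsilon
logZ = sp.log(Z)

E_mean = sp.simplify(-sp.diff(logZ, beta))      # first cumulant: <E>
E_var = sp.simplify(sp.diff(logZ, beta, 2))     # second cumulant: var(E) = k*T**2 * C
print(E_mean)    # epsilon/(exp(beta*epsilon) + 1)
print(E_var)     # energy fluctuations, proportional to the heat capacity
```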
History
The history of cumulants is discussed by Anders Hald. Cumulants were first introduced by Thorvald N. Thiele, in 1889, who called them semi-invariants. They were first called cumulants in a 1932 paper by Ronald Fisher and John Wishart. Fisher was publicly reminded of Thiele's work by Neyman, who also noted earlier published citations of Thiele brought to Fisher's attention. Stephen Stigler has said that the name cumulant was suggested to Fisher in a letter from Harold Hotelling. In a paper published in 1929, Fisher had called them cumulative moment functions. The partition function in statistical physics was introduced by Josiah Willard Gibbs in 1901. The free energy is often called Gibbs free energy. In statistical mechanics, cumulants are also known as Ursell functions, relating to a publication in 1927.
Cumulants in generalized settings
Formal cumulants
More generally, the cumulants of a sequence {mn : n = 1, 2, 3, ...}, not necessarily the moments of any probability distribution, are, by definition,

1 + Σn≥1 mn·tⁿ/n! = exp(Σn≥1 κn·tⁿ/n!),

where the values of κn for n = 1, 2, 3, ... are found formally, i.e., by algebra alone, in disregard of questions of whether any series converges. All of the difficulties of the "problem of cumulants" are absent when one works formally. The simplest example is that the second cumulant of a probability distribution must always be nonnegative, and is zero only if all of the higher cumulants are zero. Formal cumulants are subject to no such constraints.
Bell numbers
In combinatorics, the n-th Bell number is the number of partitions of a set of size n. All of the cumulants of the sequence of Bell numbers are equal to 1. The Bell numbers are the moments of the Poisson distribution with expected value 1.

Cumulants of a polynomial sequence of binomial type
For any sequence {κn : n = 1, 2, 3, ...} of scalars in a field of characteristic zero, being considered formal cumulants, there is a corresponding sequence {μ′n : n = 1, 2, 3, ...} of formal moments, given by the polynomials above. For those polynomials, construct a polynomial sequence in the following way. Out of the polynomial

μ′3 = κ3 + 3κ2κ1 + κ1³

make a new polynomial in these plus one additional variable x:

p3(x) = κ3·x + 3κ2κ1·x² + κ1³·x³,
and then generalize the pattern. The pattern is that the numbers of blocks in the aforementioned partitions are the exponents on x. Each coefficient is a polynomial in the cumulants; these are the Bell polynomials, named after Eric Temple Bell.
This sequence of polynomials is of binomial type. In fact, no other sequences of binomial type exist; every polynomial sequence of binomial type is completely determined by its sequence of formal cumulants.
Free cumulants
In the above moment-cumulant formula for joint cumulants,

E[X1 ⋯ Xn] = Σπ ΠB∈π κ(Xi : i ∈ B),

one sums over all partitions of the set {1, ..., n}. If instead, one sums only over the noncrossing partitions, then, by solving these formulae for the cumulants in terms of the moments, one gets free cumulants rather than the conventional cumulants treated above. These free cumulants were introduced by Roland Speicher and play a central role in free probability theory. In that theory, rather than considering independence of random variables, defined in terms of tensor products of algebras of random variables, one considers instead free independence of random variables, defined in terms of free products of algebras.
The ordinary cumulants of degree higher than 2 of the normal distribution are zero. The free cumulants of degree higher than 2 of the Wigner semicircle distribution are zero. This is one respect in which the role of the Wigner distribution in free probability theory is analogous to that of the normal distribution in conventional probability theory.
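A sketch making this concrete: free cumulants computed by inversion of the moment formula over noncrossing partitions (brute-force partition enumeration via sympy). Feeding in the semicircle moments, Catalan numbers interleaved with zeros, should return κ2 = 1 and all other free cumulants 0:

```python
from itertools import combinations
from sympy.utilities.iterables import multiset_partitions

def crosses(partition):
    """True if two blocks cross: a < b < c < d with a, c in one block, b, d in another."""
    for B1, B2 in combinations(partition, 2):
        for a, c in combinations(sorted(B1), 2):
            for b, d in combinations(sorted(B2), 2):
                if a < b < c < d or b < a < d < c:
                    return True
    return False

def free_cumulants(m):
    """Invert m_n = sum over noncrossing partitions pi of prod_B kappa_{|B|}.
    m[n] holds the n-th moment for n = 1..N; m[0] is unused."""
    N = len(m) - 1
    kappa = [0.0] * (N + 1)
    for n in range(1, N + 1):
        rest = 0.0
        for p in multiset_partitions(list(range(n))):
            if len(p) == 1 or crosses(p):    # skip the full set and crossing partitions
                continue
            prod = 1.0
            for block in p:
                prod *= kappa[len(block)]
            rest += prod
        kappa[n] = m[n] - rest               # the full-set partition contributes kappa_n
    return kappa

# moments of the standard Wigner semicircle law: 0, 1, 0, 2, 0, 5 (Catalan numbers)
print(free_cumulants([0, 0, 1, 0, 2, 0, 5]))   # kappa_1..kappa_6 = 0, 1, 0, 0, 0, 0
```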