Tomás Garza (firstname.lastname@example.org) writes: In TMJ 6(3): 16--17, random numbers with a Poisson distribution are generated using
However, this is rather slow. We can use RandomArray.
RandomArray is useful because, for certain distributions, it is more efficient to produce a whole array of random values. However, trying
we find no speed improvement. Using Random directly is much faster. Consider producing exponentially distributed random numbers with unit mean.
Here RandomArray is much faster.
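The same vectorization principle carries over to other environments. As an illustrative sketch in Python with NumPy (not the Mathematica code used here), one call fills the whole array of unit-mean exponential deviates at once, analogous to RandomArray:

```python
import numpy as np

rng = np.random.default_rng(0)  # seeded for reproducibility

def exponential_array(n, mean=1.0):
    # One vectorized call produces the whole array of exponential
    # deviates, analogous to RandomArray acting on the distribution.
    return rng.exponential(scale=mean, size=n)

sample = exponential_array(100_000)
```

Generating the array in a single call avoids per-value overhead, which is where the speed difference comes from.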
Alternatively, for the exponential distribution with unit mean, the cumulative distribution function (CDF) has the closed-form inverse -Log[1 - u], which can be used to generate exponentially distributed random numbers.
This is about the same speed as using RandomArray.
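In Python terms the inversion method for the unit-mean exponential looks like the following sketch (since 1 - U is itself uniform, -log(U) would do equally well):

```python
import numpy as np

rng = np.random.default_rng(1)

def exponential_by_inversion(n, mean=1.0):
    # F(x) = 1 - exp(-x/mean), so F^{-1}(u) = -mean * log(1 - u).
    u = rng.random(n)
    return -mean * np.log1p(-u)  # log1p(-u) computes log(1 - u) accurately

x = exponential_by_inversion(100_000)
```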
In a Poisson process, the time interval between two successive occurrences is exponentially distributed. This implies a simple relationship between the Poisson and exponential distributions: add as many independent and identically distributed (i.i.d.) unit-mean exponential random variables as needed for the running sum to exceed the desired mean. The number of terms minus 1 then follows a Poisson distribution with that mean. The code
is over five times faster than using RandomArray directly.
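The summing construction can be sketched in plain Python. Since each unit-mean exponential is -log(U) for uniform U, the running sum exceeds the mean lam exactly when the running product of uniforms drops below exp(-lam), so no logarithms are needed inside the loop (names here are illustrative):

```python
import math
import random

random.seed(2)  # for reproducibility

def poisson_by_summing(lam):
    # Count the i.i.d. unit-mean exponentials needed for their sum to
    # exceed lam; equivalently, multiply uniforms until the product
    # falls below exp(-lam).  The count minus 1 is Poisson(lam).
    limit = math.exp(-lam)
    count = 0
    prod = random.random()
    while prod > limit:
        count += 1
        prod *= random.random()
    return count

sample = [poisson_by_summing(4.0) for _ in range(20_000)]
```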
A general and powerful alternative uses the inverse CDF. Note that this method is not restricted to the Poisson distribution (see  for a discussion of the more general problem of random sampling from any discrete distribution). For the Poisson distribution, the CDF is the following.
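For mean lam, the CDF is F(k) = exp(-lam) (1 + lam + lam^2/2! + ... + lam^k/k!). A small sketch that tabulates this sum, exploiting the fact that each probability term is the previous one multiplied by lam/j:

```python
import math

def poisson_cdf(k, lam):
    # F(k) = exp(-lam) * sum_{j=0}^{k} lam^j / j!,
    # built up term by term: term_j = term_{j-1} * lam / j.
    term = math.exp(-lam)  # j = 0 term
    total = term
    for j in range(1, k + 1):
        term *= lam / j
        total += term
    return total
```

Note that poisson_cdf(0, lam) is just exp(-lam), and the values increase in steps toward 1 as k grows.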
For a given mean, here is a plot of the CDF.
To compute the inverse CDF, we make use of the following procedure. Here are the step values of the CDF (cf. the previous plot).
We can extract the cumulative frequencies from this table.
An elegant way to define the "inverse" of a step function is to use Interpolation with InterpolationOrder -> 0 [suggested by Jens-Peer Kuska (email@example.com)].
This method is around 50 times faster than RandomArray!
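The same zero-order-interpolation idea (tabulate the CDF once, then invert it with a step-function lookup) can be sketched with NumPy, where the lookup is a binary search via searchsorted; the truncation point kmax is an illustrative choice far into the tail:

```python
import numpy as np

def poisson_sampler(lam, rng=None, kmax=None):
    # Tabulate the Poisson CDF once, then map an array of uniforms
    # through its step-function inverse: the smallest k with F(k) >= u.
    rng = rng or np.random.default_rng(3)
    if kmax is None:
        kmax = int(lam + 10.0 * np.sqrt(lam) + 10.0)  # far into the tail
    k = np.arange(kmax + 1)
    pmf = np.exp(-lam) * np.cumprod(np.concatenate(([1.0], lam / k[1:])))
    cdf = np.cumsum(pmf)
    def sample(n):
        u = rng.random(n)
        return np.searchsorted(cdf, u, side="left")
    return sample

draws = poisson_sampler(10.0)(50_000)
```

The tabulation cost is paid once, after which each batch of draws is a single vectorized binary search.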
The standard statistical approach for testing equality of two distributions proceeds along the following lines. The cumulative relative frequencies (also known as the empirical distribution function, or EDF) of the direct (directvalues) and the approximate sampling procedures (inversevalues) are given by the following two expressions, respectively.
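An EDF can be sketched in NumPy as the fraction of the sorted sample lying at or below each evaluation point:

```python
import numpy as np

def edf(sample, points):
    # Empirical distribution function: the fraction of the sample
    # that is <= each evaluation point (a binary search per point).
    s = np.sort(np.asarray(sample))
    return np.searchsorted(s, points, side="right") / len(s)

values = edf([0, 1, 1, 2, 5], np.arange(6))  # steps from 0.2 up to 1.0
```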
The following table, displayed using PrettyTable (see the section "Formatted Tables"), summarizes the results. The first column gives the values of the Poisson CDF, while the second and third columns give the EDFs of the values obtained through the direct and the approximate sampling procedures, respectively.
Visual inspection shows good agreement between the experimental results for the two procedures and the Poisson CDF. The maximum distance between the EDF of directvalues and the Poisson CDF is
The corresponding distance for inversevalues is
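The maximum EDF-to-CDF distance (the Kolmogorov--Smirnov statistic for our discrete case) only needs to be checked at the support points, since both curves are step functions that jump there. A sketch:

```python
import numpy as np

def max_distance(sample, lam):
    # Largest absolute gap between the sample's EDF and the Poisson(lam)
    # CDF, evaluated at the support points 0..max(sample).
    kmax = int(max(sample))
    k = np.arange(kmax + 1)
    pmf = np.exp(-lam) * np.cumprod(np.concatenate(([1.0], lam / k[1:])))
    cdf = np.cumsum(pmf)
    edf = np.searchsorted(np.sort(np.asarray(sample)), k, side="right") / len(sample)
    return float(np.max(np.abs(edf - cdf)))
```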
In order to test whether these observed distances simply reflect the randomness inherent in the sampling process, an experiment is conducted to estimate their probabilistic behavior. A sample of 10,000 Poisson random numbers is obtained and its EDF constructed. The maximum distance, say d, between this EDF and the Poisson CDF is then recorded, and the procedure is repeated a large number of times, say 1,000. The EDF of the 1,000 values of d thus obtained provides an estimate of the probability distribution of the maximum distances which arise in the sampling process. One may then judge whether a given distance is too large to be considered a natural product of randomness.
The experiment is conducted as follows: we first compute the exact cumulative frequencies, which (implicitly) set the length of the EDF obtained in each repetition of the experiment.
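The whole experiment can be sketched in NumPy. To keep the run fast, the sample size and repetition count here are smaller than the 10,000 and 1,000 used above, so the resulting quantiles will be larger than the thresholds quoted below:

```python
import numpy as np

def simulate_max_distances(lam, sample_size, repetitions, seed=5):
    # Tabulate the exact Poisson CDF once; then, for each repetition,
    # draw a sample, form its EDF, and record the largest gap.
    rng = np.random.default_rng(seed)
    kmax = int(lam + 10.0 * np.sqrt(lam) + 10.0)
    k = np.arange(kmax + 1)
    pmf = np.exp(-lam) * np.cumprod(np.concatenate(([1.0], lam / k[1:])))
    cdf = np.cumsum(pmf)
    distances = np.empty(repetitions)
    for i in range(repetitions):
        s = np.sort(rng.poisson(lam, sample_size))
        edf = np.searchsorted(s, k, side="right") / sample_size
        distances[i] = np.abs(edf - cdf).max()
    return np.sort(distances)

d = simulate_max_distances(10.0, 2_000, 200)
q90, q95 = d[int(0.9 * len(d))], d[int(0.95 * len(d))]  # estimated thresholds
```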
The behavior of the maximum distances between sampling EDFs and the Poisson CDF,
appears in the graph below.
Here we see that experimentally obtained distances in samples of size 10,000 from the Poisson distribution are seldom larger than 0.0092 (exceeded in only 10% of the cases) or 0.0105 (exceeded in only 5% of the cases). Both distances computed above fall comfortably below these thresholds, which indicates that both samples are compatible with the hypothesis H: they were drawn from a Poisson distribution.
This testing procedure is essentially equivalent to the Kolmogorov--Smirnov test of statistical theory (see, for example, [3, chapt. 9]). However, here we have estimated, through a simulation technique, the EDF of maximum distances appropriate for our specific problem, instead of using a general-purpose table of values as in conventional statistics. For a sample size of 10,000, the Kolmogorov--Smirnov table yields the values 0.0125 and 0.014 as those which would be exceeded in only 10% and 5% of the cases, respectively (cf. the corresponding 0.0092 and 0.0105 displayed above). The test procedure outlined here is therefore more precise, in the sense that it requires a closer fit between the two distributions in order to accept the hypothesis.
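For reference, the tabulated Kolmogorov--Smirnov values quoted above follow from the standard large-sample approximation c(alpha)/sqrt(n) with c(alpha) = sqrt(-ln(alpha/2)/2), which gives figures close to the tabulated 0.0125 and 0.014:

```python
import math

def ks_critical(alpha, n):
    # Asymptotic KS critical value: a distance exceeded with probability
    # alpha is approximately sqrt(-log(alpha / 2) / 2) / sqrt(n).
    return math.sqrt(-math.log(alpha / 2.0) / 2.0) / math.sqrt(n)

d10 = ks_critical(0.10, 10_000)  # about 0.0122
d05 = ks_critical(0.05, 10_000)  # about 0.0136
```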