Poisson Disc Sampling
Imagine that you are playing a game, flying a chopper across the Amazon jungle. What would your view look like? Or, if you are playing Age of Empires, how are those trees so uniformly spaced? When you see an animated furry animal, the hairs are evenly distributed for your vision. How do they animate those trees and uniformly spaced objects in code? The most widely used solution is the "Poisson disc sampling" technique.

Algorithms are a fascinating use case for visualization, and visualization is more than a tool for finding patterns in data: it leverages the human visual system to augment human intellect. Light, the electromagnetic radiation emanating from this screen, traveling through the air, focused by your lens and projected onto the retina, is a continuous signal. To be perceived, light must be reduced to discrete impulses by measuring its intensity and frequency distribution at different points in space. This reduction process is called sampling, and it is essential to vision. It has recently been suggested that dyslexia may manifest as a deficit in the neural synchrony underlying language-based codes, such that the phonological deficits apparent in dyslexia occur as a consequence of poor synchronization of oscillatory brain signals to the sounds of language. Visual coding of text requires large populations of neurons, spread over disparate cortical areas, to synchronize and synthesize information extremely quickly in order to form the coherent percepts that let the eyes recognize an object.

Sampling is also a core concern of computer graphics. For example, to rasterize a 3D scene by ray tracing, we must determine where to shoot rays, and even resizing an image requires sampling. In every case, the samples should be evenly distributed, without aliasing.
The human retina has a beautiful solution to sampling in its placement of photoreceptor cells. The cells cover the retina densely and evenly (with the exception of the blind spot over the optic nerve), and yet their relative positions are irregular. This is called a Poisson-disc distribution because it maintains a minimum distance between cells, avoiding occlusion and thus wasted photoreceptors. Nature is miraculous! What goes on inside our bodies is far more complex than any functionality we add to our code.
Poisson Disc Sampling:
In a Poisson disk sample set, no two samples are too close together. Closeness is defined by the Poisson disk radius, which is half the distance between the two closest samples. Compared to plain random sampling, Poisson disk sample sets provide a much more uniform sample distribution over the sampling domain. Beyond providing a nice distribution of sample points, Poisson disk sample sets also provide superior convergence properties with Monte Carlo sampling: the same number of Poisson disk samples typically produces lower noise than random sampling.
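As a quick illustration (a sketch of my own, not from the article), the Poisson disk radius of a 2D sample set can be computed directly from its definition, as half the distance between the two closest samples:

```python
import math
from itertools import combinations

def poisson_disk_radius(samples):
    """Half the distance between the two closest samples in the set."""
    return min(math.dist(a, b) for a, b in combinations(samples, 2)) / 2

# Four corners of a unit square: the closest pair is 1.0 apart,
# so the Poisson disk radius is 0.5.
corners = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]
print(poisson_disk_radius(corners))  # 0.5
```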
Bridson’s algorithm for Poisson-disc sampling:
Using Pythagoras' theorem, s = r/√2, where s is the side of the small pink square shown in the picture and r is the radius of the disc. Our aim is to generate samples such that they do not overlap with each other and are uniformly spaced, and this grid lets us check the distance from each candidate sample to its neighbors efficiently. The following algorithm, called Fast Poisson Disk Sampling in Arbitrary Dimensions, was proposed by Robert Bridson. It takes as input the extent of the sample domain in Rⁿ, the minimum distance r between samples, and a constant k (the limit of candidate samples to try before rejection; typically k = 30).
Step 0. Initialize an n-dimensional background grid for storing samples and accelerating spatial searches. We pick the cell size to be bounded by r/√n, so that each grid cell will contain at most one sample, and thus the grid can be implemented as a simple n-dimensional array of integers: the default −1 indicates no sample, and a non-negative integer gives the index of the sample located in that cell.
Step 1. Select the initial sample, x0, randomly chosen uniformly from the domain. Insert it into the background grid, and initialize the “active list” (an array of sample indices) with this index (zero).
Step 2. While the active list is not empty, choose a random index from it (say i). Generate up to k points chosen uniformly from the spherical annulus between radius r and 2r around xi.
For each point in turn, check if it is within distance r of existing samples (using the background grid to only test nearby samples). If a point is adequately far from existing samples, emit it as the next sample and add it to the active list. If after k attempts no such point is found, instead remove i from the active list.
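Steps 0 through 2 can be sketched in Python for the two-dimensional case (n = 2, so the cell size is r/√2). This is a minimal illustration; the function and variable names are my own, not Bridson's:

```python
import math
import random

def bridson_sample(width, height, r, k=30, seed=None):
    """2D Poisson disc sampling: no two samples closer than r."""
    rng = random.Random(seed)
    cell = r / math.sqrt(2)                    # step 0: cell size r / sqrt(n)
    cols, rows = int(width / cell) + 1, int(height / cell) + 1
    grid = [[-1] * cols for _ in range(rows)]  # -1 means "no sample in this cell"
    samples, active = [], []

    def grid_pos(p):
        return int(p[1] / cell), int(p[0] / cell)

    def far_enough(p):
        # Distance r spans at most sqrt(2) cells, so +/-2 cells suffice.
        gr, gc = grid_pos(p)
        for rr in range(max(gr - 2, 0), min(gr + 3, rows)):
            for cc in range(max(gc - 2, 0), min(gc + 3, cols)):
                idx = grid[rr][cc]
                if idx != -1 and math.dist(p, samples[idx]) < r:
                    return False
        return True

    def emit(p):
        gr, gc = grid_pos(p)
        grid[gr][gc] = len(samples)
        samples.append(p)
        active.append(len(samples) - 1)

    # Step 1: initial sample chosen uniformly from the domain.
    emit((rng.uniform(0, width), rng.uniform(0, height)))

    # Step 2: grow samples from a randomly chosen active sample.
    while active:
        i = rng.randrange(len(active))
        base = samples[active[i]]
        for _ in range(k):
            theta = rng.uniform(0, 2 * math.pi)
            rho = rng.uniform(r, 2 * r)        # annulus between r and 2r
            p = (base[0] + rho * math.cos(theta),
                 base[1] + rho * math.sin(theta))
            if 0 <= p[0] < width and 0 <= p[1] < height and far_enough(p):
                emit(p)
                break
        else:
            active.pop(i)                      # no valid point after k attempts
    return samples
```

Every pair of returned samples is at least r apart, and each new sample is generated in the spherical annulus between r and 2r around an existing one, exactly as the steps above describe.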
Weighted Sample Elimination:
Generating Poisson disk samples is a genuinely hard task because each sample depends on the positions of the other samples, so the samples cannot be generated independently. Simple yet computationally expensive algorithms for producing Poisson disc samples are available, such as dart throwing. Computationally efficient algorithms, on the other hand, are complex and limited to certain sampling domains. Among the solutions and algorithms that have been proposed, one of the most efficient is the Sample Elimination for Generating Poisson Disk Sample Sets algorithm by Cem Yuksel.
Weighted sample elimination is a simple algorithm for generating Poisson disk sample sets. One additional advantage is that it does not require specifying a Poisson disk radius. Its input is a collection of (randomly generated) samples, and it eliminates a portion of these samples such that the resulting set has the Poisson disk property. In other words, the weighted sample elimination algorithm receives a large set of input samples and returns a smaller set of output samples. It does not generate new samples; it merely selects a subset of the given input set.
The input samples can be generated for any sampling domain in any number of dimensions. The more input samples you provide, the better the quality of the resulting Poisson disk sample set, though beyond a certain point you get diminishing returns. It is not a good idea to use a very small input sample set, regardless of the output set size. For example, if you merely need 10 output samples, 50 input samples may not provide a good input distribution; in that case it would be better to start with a much larger input sample set.
The Sample Elimination for Generating Poisson Disk Sample Sets algorithm performs weighted sample elimination as a simple greedy procedure for picking a subset with a reasonably large Poisson disk radius from a given set of input samples. We begin by assigning each sample a weight based on its distance to its neighbors. At each step we eliminate the sample with the highest weight and adjust the weights of the remaining samples around it. An efficient implementation therefore needs only two relatively common data structures: a spatial partitioning structure for quickly finding neighboring samples, and a priority queue for picking the sample with the highest weight. Concretely, a kd-tree (short for k-dimensional tree) and a heap are used, and with these two structures the elimination of samples that violate the Poisson disk property yields the sample set we want. The following pseudocode proposed by Cem Yuksel can be implemented in any programming language of our choice to obtain Poisson disk samples. A picture comparing samples generated by Bridson's algorithm and by Cem Yuksel's algorithm makes the difference evident: the samples generated by the latter are more evenly distributed.
Pseudocode:
function WeightedSampleElim( samples )
    Build a kd-tree for samples
    Assign a weight w_i to each sample s_i
    Build a heap for samples s_i using weights w_i
    while number of samples > desired
        s_j ← pull the top sample from the heap
        for each sample s_i around s_j
            Remove w_ij from w_i
            Update the heap position of s_i
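The pseudocode above can be sketched in Python for a 2D square domain. This is a simplified illustration of my own, not Yuksel's reference implementation: a brute-force neighbor search stands in for the kd-tree, and a lazy-deletion heap stands in for the priority queue with heap-position updates. The weight function w_ij = (1 − d_ij/(2·r_max))^α with α = 8, and the 2D r_max bound derived from hexagonal packing, follow Yuksel's paper:

```python
import heapq
import math

def weighted_sample_elim(samples, n_out, area):
    """Greedily eliminate samples until n_out remain (2D sketch)."""
    n = len(samples)
    # 2D hexagonal-packing bound on the maximum possible Poisson disk radius.
    r_max = math.sqrt(area / (2 * math.sqrt(3) * n_out))
    alpha = 8.0

    def w(d):
        d = min(d, 2 * r_max)
        return (1.0 - d / (2 * r_max)) ** alpha

    # Neighbors within 2*r_max of each sample (a kd-tree would do this fast).
    nbrs = [[] for _ in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            d = math.dist(samples[i], samples[j])
            if d < 2 * r_max:
                nbrs[i].append((j, d))
                nbrs[j].append((i, d))

    weights = [sum(w(d) for _, d in nb) for nb in nbrs]
    alive = [True] * n
    heap = [(-weights[i], i) for i in range(n)]
    heapq.heapify(heap)

    remaining = n
    while remaining > n_out:
        neg_w, i = heapq.heappop(heap)
        if not alive[i]:
            continue
        if -neg_w != weights[i]:
            # Stale entry: the weight changed since it was pushed;
            # reinsert with the current weight instead of decrease-key.
            heapq.heappush(heap, (-weights[i], i))
            continue
        alive[i] = False                # eliminate the highest-weight sample
        remaining -= 1
        for j, d in nbrs[i]:
            if alive[j]:
                weights[j] -= w(d)      # remove w_ij from w_j
    return [samples[i] for i in range(n) if alive[i]]
```

The lazy-deletion heap is the main shortcut here: instead of updating a sample's heap position in place, stale entries are detected on pop (stored weight no longer matches the current weight) and reinserted, which keeps the sketch short at the cost of some extra heap traffic.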