# Theory & Python examples¶

## 2-point probability / auto-correlation¶

### Theory¶

The most basic statistic determines the typical distance over which two points (pixels, voxels, …) are related to each other. This is best understood by considering a binary 2D image, wherein each pixel is either black or white. This is described by the following indicator function, indicating the ‘greyscale’ of a pixel at position $\vec{x}$:

$$\mathcal{I}(\vec{x}) = \begin{cases} 1 & \text{if } \vec{x} \text{ is white} \\ 0 & \text{if } \vec{x} \text{ is black} \end{cases}$$

The 2-point probability, $S_2$, is the probability that two points, at a certain distance $\Delta \vec{x}$, are both white. I.e.

$$S_2(\Delta \vec{x}) = P\big\{ \mathcal{I}(\vec{x}) = 1 ,\, \mathcal{I}(\vec{x} + \Delta \vec{x}) = 1 \big\}$$

Two limits can directly be identified. If the two points coincide, the probability that both points are white is the same as the probability that one point is white: the (volume) fraction of white, $\varphi$. I.e.

$$S_2(\Delta \vec{x} = \vec{0}) = \varphi$$

If the two points are completely uncorrelated, i.e. when the points are far apart, each point has a probability $\varphi$ to be white, and thus

$$S_2(\Delta \vec{x} \to \infty) = \varphi^2$$

In between these extremes, $S_2$ decays from $\varphi$ towards the asymptotic value of $\varphi^2$.
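These limits can be checked numerically with a brute-force estimate on a small periodic binary image. The sketch below uses plain numpy and is a stand-in for GooseEYE's implementation, not the library itself:

```python
import numpy as np

# small periodic binary image: one white square on a black background
img = np.zeros((20, 20), dtype=int)
img[5:12, 5:12] = 1
phi = img.mean()  # volume fraction of white

def S2(img, dx, dy):
    """Brute-force 2-point probability for a periodic binary image:
    the probability that a pixel and its (dx, dy)-neighbour are both white."""
    shifted = np.roll(np.roll(img, dx, axis=0), dy, axis=1)
    return np.mean(img * shifted)

print(S2(img, 0, 0))  # coinciding points: equals phi (= 0.1225 here)
```

For this simple image one can also verify that the probability never exceeds the zero-distance value.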

The greyscale generalisation is the auto-correlation, which corresponds to a local product:

$$S_2(\Delta \vec{x}) = \big\langle \mathcal{I}(\vec{x}) \, \mathcal{I}(\vec{x} + \Delta \vec{x}) \big\rangle$$

Along the same arguments, limit values can be obtained. In this case:

$$S_2(\Delta \vec{x} = \vec{0}) = \langle \mathcal{I}^2 \rangle \qquad S_2(\Delta \vec{x} \to \infty) = \langle \mathcal{I} \rangle^2$$

where the brackets $\langle \cdot \rangle$ denote the spatial average.
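The two greyscale bounds can likewise be verified with a brute-force periodic auto-correlation in plain numpy (a sketch, not GooseEYE's implementation):

```python
import numpy as np

rng = np.random.default_rng(0)
img = rng.random((50, 50))  # an arbitrary greyscale image

def autocorr(img, dx, dy):
    """Periodic auto-correlation: spatial average of the local product
    of the image with a shifted copy of itself."""
    shifted = np.roll(np.roll(img, dx, axis=0), dy, axis=1)
    return np.mean(img * shifted)

# zero shift: exactly the mean of the intensity squared, <I^2>
# large shift (uncorrelated): approaches the mean intensity squared, <I>^2
```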

### Example¶

This result is based on a simple, periodic image comprising circular white inclusions embedded in a black background. The top row shows the image and the results for the binary image; from left to right: the image, the 2-point probability in two dimensions, and a cross-section of this result through the middle of the region-of-interest along the horizontal axis. The bottom row shows the same image and results for a greyscale image, for which noise is added and the background and the islands are made grey. This example is based on the following code (the code used for the plotting is included in the download; the matplotlib-style can be installed via goosempl, for example using pip install goosempl).


import GooseEYE as eye
import numpy    as np

# binary image + correlation
# --------------------------

# generate image, store 'volume-fraction'
I   = eye.dummy_circles((500,500))
phi = np.mean(I)

# 2-point probability
S2  = eye.S2((101,101),I,I)

# gray image + correlation
# ------------------------

# convert to gray-scale image and introduce noise
Igr  = np.array(I,copy=True).astype(float)
Igr += 0.1*(2.0*np.random.random(Igr.size)-1.0).reshape(Igr.shape)+0.1
Igr /= 1.2

# 2-point correlation ('auto-correlation')
S2gr = eye.S2((101,101),Igr,Igr)

# correlation bounds: mean intensity squared and mean of the intensity squared
Iav_sq = np.mean(Igr)**2.
Isq_av = np.mean(Igr**2.)



This function also has the possibility to mask certain pixels. The image’s mask is a binary matrix of exactly the same shape as the image. For each pixel in the mask with value 1, the corresponding pixel in the image is ignored. The normalisation is corrected for the reduced number of data points, which is then no longer constant over the region-of-interest.
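The shift-dependent normalisation can be made concrete with a plain-numpy sketch (a stand-in for GooseEYE's masked correlation, not its actual implementation):

```python
import numpy as np

img = (np.random.default_rng(0).random((40, 40)) < 0.3).astype(int)
mask = np.zeros_like(img)
mask[:10, :10] = 1  # pixels with value 1 are ignored

def S2_masked(img, mask, dx, dy):
    """Masked, periodic 2-point probability: masked pixels are dropped and
    the normalisation counts only pairs in which both pixels are unmasked,
    so the divisor varies with the shift."""
    keep = 1 - mask
    a = img * keep
    b = np.roll(np.roll(a, dx, axis=0), dy, axis=1)
    n = keep * np.roll(np.roll(keep, dx, axis=0), dy, axis=1)
    return np.sum(a * b) / np.sum(n)
```

At zero shift this reduces to the white fraction of the unmasked pixels only.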

Note

For non-periodic images, a mask in conjunction with padding of the image can be used to incorporate the full image in the statistic. Otherwise a boundary region is skipped, reducing the amount of information from the boundary region.

import GooseEYE as eye
import numpy    as np

# binary image + correlation
# --------------------------

# generate image, store 'volume-fraction'
I   = eye.dummy_circles((500,500))
phi = np.mean(I)

# 2-point probability
S2  = eye.S2((101,101),I,I)

# artefact + (masked) correlation
# -------------------------------

# define image with artefact and the corresponding mask
Ierr            = np.array(I,copy=True)
Ierr[:150,:150] = 1
mask            = np.zeros(I.shape,dtype='bool')
mask[:150,:150] = 1

# 2-point correlation on image with artefact (no mask)
S2err  = eye.S2((101,101),Ierr,Ierr)

# 2-point correlation on image with artefact, with artefact masked
# (the 'fmask'/'gmask' options mask pixels in the first/second image)
S2mask = eye.S2((101,101),Ierr,Ierr,fmask=mask,gmask=mask)


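The padding-plus-mask recipe from the note above can be sketched with plain numpy (GooseEYE is not needed to build the padded image and its mask):

```python
import numpy as np

img = (np.random.default_rng(0).random((100, 100)) < 0.3).astype(int)

# pad the (non-periodic) image by half the region-of-interest on all
# sides, and mask the padded frame so that it does not contribute
pad = 50
padded = np.pad(img, pad, mode='constant', constant_values=0)
mask = np.ones_like(padded)
mask[pad:-pad, pad:-pad] = 0  # 0 = keep (original pixels), 1 = ignore
```

The padded image and mask could then be passed to the masked variant of the correlation, so that no boundary region has to be skipped.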

## 2-point cluster function¶

### Theory¶

If an image consists of isolated clusters, the 2-point cluster function can be used to quantify the probability that two points are in the same cluster. It is defined as follows:

$$C_2(\Delta \vec{x}) = P\big\{ \mathcal{C}(\vec{x}) = \mathcal{C}(\vec{x} + \Delta \vec{x}) \neq 0 \big\}$$

whereby $\mathcal{C}$ is an indicator with a unique non-zero index for each cluster.
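The definition can be tested directly by comparing labels in a shifted copy of a labelled image; the following is a brute-force numpy sketch, not GooseEYE's implementation:

```python
import numpy as np

# labelled "clusters": two separated squares with unique non-zero labels
C = np.zeros((20, 20), dtype=int)
C[2:6, 2:6] = 1
C[10:16, 10:16] = 2

def C2(C, dx, dy):
    """Periodic 2-point cluster function: probability that two points at
    the given shift carry the same non-zero cluster label."""
    shifted = np.roll(np.roll(C, dx, axis=0), dy, axis=1)
    return np.mean((C == shifted) & (C != 0))

print(C2(C, 0, 0))  # zero shift: volume fraction of the cluster phase (0.13)
```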

### Example¶

The 2-point cluster function can be computed with the same machinery as the 2-point probability. The former uses an integer image, with a unique integer number labelling each cluster; the latter simply uses binary values.


import GooseEYE as eye
import numpy    as np

# generate image, store 'volume-fraction'
I   = eye.dummy_circles((500,500))
phi = np.mean(I)

# 2-point probability
S2  = eye.S2((101,101),I,I)

# determine clusters, based on the binary image
C   = eye.clusters(I)

# 2-point cluster function
C2  = eye.S2((101,101),C,C)



## Lineal path function¶

### Theory¶

The 2-point cluster function has only a first-order notion of connectedness. To quantify the true connectedness along a path, the lineal path function is used. The lineal path function quantifies the probability that an entire path of pixels connecting $\vec{x}$ and $\vec{x} + \Delta \vec{x}$ is in the same phase:

$$L(\Delta \vec{x}) = P\big\{ \mathcal{I}(\vec{x} + \delta \vec{x}) = 1 \quad \forall \; \delta \vec{x} \in [\vec{0}, \Delta \vec{x}] \big\}$$

In practice the probability is constructed by starting from each pixel $\vec{x}$ for which $\mathcal{I}(\vec{x}) = 1$, and ‘walking’ along a pixel path until the edge of the inclusion is reached at $\vec{x} + \Delta \vec{x}$. The value of $L$ is increased for all the relative positions that have been passed along the path connecting $\vec{x}$ and $\vec{x} + \Delta \vec{x}$. This is then repeated for all possible directions (each with its own path).
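For paths along a single axis the definition reduces to requiring a run of consecutive white pixels, which can be sketched in a few lines of plain numpy (a one-directional toy version, not GooseEYE's path construction):

```python
import numpy as np

img = np.zeros((1, 20), dtype=int)
img[0, 3:9] = 1  # a single inclusion of 6 pixels

def L_horizontal(img, n):
    """Probability that a horizontal path of n+1 consecutive pixels
    (periodic) lies entirely in the white phase."""
    out = np.ones_like(img)
    for shift in range(n + 1):
        out = out * np.roll(img, -shift, axis=1)
    return np.mean(out)

print(L_horizontal(img, 0))  # zero distance: the volume fraction (0.3)
print(L_horizontal(img, 6))  # longer than the inclusion: 0.0
```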

Two limit values can again be identified. At zero distance, the volume fraction is again found:

$$L(\Delta \vec{x} = \vec{0}) = \varphi$$

Furthermore, it is highly unlikely that a path can be found through the inclusion phase to a relative position very far away. I.e.

$$L(\Delta \vec{x} \to \infty) = 0$$

An important ingredient of the computation of $L$ is thus the choice of the pixel paths. In GooseEYE the paths are constructed between the centre of the region-of-interest and each of the points on the edge of the region-of-interest. The paths can be computed using different algorithms, illustrated below:

### Example¶

import GooseEYE as eye
import numpy    as np

# generate image, store 'volume-fraction'
I   = eye.dummy_circles((500,500))
phi = np.mean(I)

# lineal path function
L   = eye.L((101,101),I)



## Weighted correlation¶

### Theory¶

The weighted correlation characterises the average indicator $\mathcal{I}$ around regions with high weight factor $\mathcal{W}$.

Mathematically the weighted correlation reads

$$\mathcal{P}(\Delta \vec{x}) = \frac{\sum_i \mathcal{W}(\vec{x}_i) \, \mathcal{I}(\vec{x}_i + \Delta \vec{x})}{\sum_i \mathcal{W}(\vec{x}_i)}$$

Additionally, pixels can be masked, for instance to ignore the indicator wherever the weight factor is non-zero. The masked correlation reads

$$\mathcal{P}(\Delta \vec{x}) = \frac{\sum_i \mathcal{W}(\vec{x}_i) \, (\mathcal{I} \mathcal{M})(\vec{x}_i + \Delta \vec{x})}{\sum_i \mathcal{W}(\vec{x}_i) \, \mathcal{M}(\vec{x}_i + \Delta \vec{x})}$$

where all pixels for which $\mathcal{M}(\vec{x}_i) = 0$ are ignored; all pixels for which $\mathcal{M}(\vec{x}_i) = 1$ are considered as normal.
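The unmasked formula is a weighted average and can be sketched directly in numpy (a brute-force stand-in for GooseEYE's weighted correlation):

```python
import numpy as np

rng = np.random.default_rng(0)
I = (rng.random((30, 30)) < 0.3).astype(float)   # indicator
W = (rng.random((30, 30)) < 0.05).astype(float)  # weight sites

def W2(W, I, dx, dy):
    """Periodic weighted correlation: the weight-averaged indicator at
    the given shift, normalised by the total weight."""
    shifted = np.roll(np.roll(I, dx, axis=0), dy, axis=1)
    return np.sum(W * shifted) / np.sum(W)
```

At zero shift this is simply the mean of the indicator over the weight sites.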

### Example¶

import GooseEYE as eye
import numpy    as np

# image + "damage" + correlation
# ------------------------------

# square grid of circles
N         = 15
M         = 500
row       = np.linspace(0,M,N)
col       = np.linspace(0,M,N)
(row,col) = np.meshgrid(row,col)
row       = row.reshape(-1)
col       = col.reshape(-1)
r         = float(M)/float(N)/4.*np.ones((N*N))
# random perturbation
row      += np.random.normal(0.0,float(M)/float(N),N*N)
col      += np.random.normal(0.0,float(M)/float(N),N*N)
r        *= np.random.random(N*N)*2.+0.1
# generate image, store 'volume-fraction'
I         = eye.dummy_circles((M,M),row.astype(int),col.astype(int),r.astype(int))
phi       = np.mean(I)

# create 'damage' -> right of inclusion
col      += 1.1*r
r        *= 0.4
W         = eye.dummy_circles((M,M),row.astype(np.int),col.astype(np.int),r.astype(np.int))
W[I==1]   = 0

# weighted correlation
WI        = eye.W2((101,101),W,I,mask=W)

# gray-scale image + correlation
# ------------------------------

# convert to gray-scale image and introduce noise
Igr       = np.array(I,copy=True).astype(float)
Igr      += 0.1*(2.0*np.random.random(Igr.size)-1.0).reshape(Igr.shape)+0.1
Igr      /= 1.2
# mean intensity (for bounds)
Iav       = np.mean(Igr)

# weighted correlation
WIgr      = eye.W2((101,101),W,Igr,mask=W)



### Collapse to single point¶

To calculate the probability of the inclusion directly next to a weight site (i.e. the red circles in the example above and below), the ‘collapsed correlation’ is calculated. The distance $\delta \vec{x}_i$ to the edge of the weight site is therefore corrected for as follows:

$$\mathcal{P}(\Delta \vec{x}) = \frac{\sum_i \mathcal{W}(\vec{x}_i) \, \mathcal{I}(\vec{x}_i + \Delta \vec{x} + \delta \vec{x}_i)}{\sum_i \mathcal{W}(\vec{x}_i)}$$

Similarly to the above, a mask may be introduced as follows:

$$\mathcal{P}(\Delta \vec{x}) = \frac{\sum_i \mathcal{W}(\vec{x}_i) \, (\mathcal{I} \mathcal{M})(\vec{x}_i + \Delta \vec{x} + \delta \vec{x}_i)}{\sum_i \mathcal{W}(\vec{x}_i) \, \mathcal{M}(\vec{x}_i + \Delta \vec{x} + \delta \vec{x}_i)}$$

#### Example¶

import GooseEYE as eye
import numpy    as np

# square grid of circles
N         = 15
M         = 500
row       = np.linspace(0,M,N)
col       = np.linspace(0,M,N)
(row,col) = np.meshgrid(row,col)
row       = row.reshape(-1)
col       = col.reshape(-1)
r         = float(M)/float(N)/4.*np.ones((N*N))
# random perturbation
row      += np.random.normal(0.0,float(M)/float(N),N*N)
col      += np.random.normal(0.0,float(M)/float(N),N*N)
r        *= np.random.random(N*N)*2.+0.1
# generate image, store 'volume-fraction'
I         = eye.dummy_circles((M,M),row.astype(int),col.astype(int),r.astype(int))
phi       = np.mean(I)

# create 'damage' -> right of inclusion
col      += 1.1*r
r        *= 0.4
W         = eye.dummy_circles((M,M),row.astype(int),col.astype(int),r.astype(int))
W[I==1]   = 0
# identify 'damage' clusters, and identify their centers
clus,cntr = eye.clusterCenters(W)

# weighted correlation: global, and collapsed to cluster centers
WI        = eye.W2 ((101,101),W        ,I,mask=W)
WIc       = eye.W2c((101,101),clus,cntr,I,mask=W)


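The cluster centres that the collapsed correlation relies on can be illustrated with scipy.ndimage as a non-periodic stand-in for eye.clusterCenters (an assumed equivalence for small, interior clusters only):

```python
import numpy as np
from scipy import ndimage

# binary image with two isolated "damage" sites
W = np.zeros((20, 20), dtype=int)
W[2:5, 2:5] = 1
W[10:14, 12:16] = 1

# label the clusters and compute the centre of each one (non-periodic)
labels, n = ndimage.label(W)
centres = ndimage.center_of_mass(W, labels, index=range(1, n + 1))
print(n, centres)  # 2 clusters, centred at (3.0, 3.0) and (11.5, 13.5)
```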

## Obtain clusters¶

### Calculate clusters¶

import GooseEYE as eye
import numpy    as np

# generate image
I = eye.dummy_circles((500,500))

# clusters, centers of gravity
# (both accounting for the periodicity of the image and ignoring it)
C ,ctr  = eye.clusterCenters(I,periodic=False)
CP,ctrP = eye.clusterCenters(I,periodic=True )
size    = np.bincount(C .ravel()).astype(np.float64)/float(C .size)
sizeP   = np.bincount(CP.ravel()).astype(np.float64)/float(CP.size)


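The bincount normalisation used above can be checked on a tiny hand-built labelled image (plain numpy, independent of GooseEYE):

```python
import numpy as np

# labelled image: background 0, two clusters labelled 1 and 2
C = np.zeros((10, 10), dtype=int)
C[1:3, 1:3] = 1   # 4 pixels
C[5:8, 5:8] = 2   # 9 pixels

# per-label pixel counts normalised by the image size;
# entry 0 is the background fraction
size = np.bincount(C.ravel()) / C.size
print(size)  # [0.87 0.04 0.09]
```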

### Dilate clusters (differently)¶

import GooseEYE as eye
import numpy    as np

# generate image
I        = np.zeros((21,21),dtype='bool')
I[ 4, 4] = True
I[14,15] = True
I[15,15] = True
I[16,15] = True
I[15,14] = True
I[15,16] = True

# clusters
C   = eye.clusters(I,periodic=True)

# dilation settings:
# cluster 1 -> 1 iteration
# cluster 2 -> 2 iterations
itr = np.arange(np.max(C)+1,dtype='int32')

# dilate
CD  = eye.dilate(C,iterations=itr,periodic=True)
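The per-cluster iteration counts can be mimicked with scipy.ndimage as a non-periodic sketch of what eye.dilate does (cluster i grown by i iterations; not GooseEYE's implementation, and without its periodicity or collision handling):

```python
import numpy as np
from scipy import ndimage

# labelled image with two single-pixel clusters
C = np.zeros((15, 15), dtype=int)
C[3, 3] = 1
C[10, 10] = 2

# dilate each cluster by its own number of iterations, writing the grown
# pixels back only where the image is still background
CD = np.array(C, copy=True)
for label in range(1, C.max() + 1):
    grown = ndimage.binary_dilation(C == label, iterations=label)
    CD[(CD == 0) & grown] = label

print((CD == 1).sum(), (CD == 2).sum())  # 5 and 13 pixels
```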