Early warning signals, analyzed with Python

As described and analyzed in the chapter on Analyzing critical transitions with Python, Scheffer et al. (2001) propose the following model as a "minimal model of an ecosystem showing hysteresis":

$\frac{dx}{dt}=a-bx+rf(x)$ with the Hill function $f(x)=\frac{x^p}{x^p + h^p}$

They use this model to demonstrate a methodology they call Early Warning Signals for detecting critical transitions.
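
Before turning to the warning signals themselves, the equilibria of this model can be located numerically as roots of the right-hand side. The following quick sketch (an illustration, assuming the parameter values used in the cells below) shows that the model has a single equilibrium for small $r$ and three equilibria, two stable states with an unstable one in between, for larger $r$:

import numpy as np
from scipy.optimize import brentq

a, b, h, p = 0.7, 1.0, 1.0, 12

def rhs(x, r):
    # dx/dt = a - b*x + r * Hill(x)
    return a - b * x + r * (x**p / (x**p + h**p))

def equilibria(r, grid=np.linspace(0.01, 2.5, 500)):
    # bracket sign changes of rhs on a grid and refine each root with brentq
    return [brentq(rhs, lo, hi, args=(r,))
            for lo, hi in zip(grid[:-1], grid[1:])
            if rhs(lo, r) * rhs(hi, r) < 0]

print(equilibria(0.2))   # one equilibrium (low nutrient load)
print(equilibria(0.65))  # three equilibria: two stable, one unstable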

The potential of dynamical systems to shift abruptly from one equilibrium to another has triggered great interest in attempts to prevent system collapses such as market crashes, abrupt climate changes, or catastrophic shifts in fish or wildlife populations (Scheffer et al. 2009). The biggest challenge is to predict the "tipping points" at which these sudden changes may happen (Scheffer et al. 2009). Several indicators have been proposed: critical slowing down, so-called flickering, increasing autocorrelation, increasing variance, and increasing skewness.

  • Scheffer, M., Carpenter, S., Foley, J.A., Folke, C. & Walker, B. (2001). Catastrophic shifts in ecosystems. Nature 413, p. 591-596.
  • Scheffer, M. et al. (2009). Early-warning signals for critical transitions. Nature 461, p. 53-59.

To start with, the following Python code generates a plot showing that this model, too, has alternative stable states, depending on the parameter $r$ (the nutrient recycling rate). Unlike in the analyzing chapter, we add a small amount of noise to make the behavior a bit more realistic. The amplitude of this noise is set by the parameter $\sigma$.

Furthermore, in order to demonstrate different possibilities, the system is simulated here as a differential equation, using the numerical integration routine odeint from the Python module SciPy.

In [35]:
# import libraries
import numpy as np
from scipy import integrate
import matplotlib.pyplot as plt
# show plots in notebook
%matplotlib inline

#parameters
a = 0.7
b = 1.
r = 0.2
h = 1.  # half-saturation value
# take a high p for a steep Hill function
p = 12

# determine strength of noise
sigma = 0.1

# define the system in a function
def Sys(X, t=0):
    # the right-hand side dx/dt = a - b*x + r*f(x), with X[0] = x
    return np.array([a - b * X[0] + r * (X[0]**p / (X[0]**p + h**p))])

# generate 1000 linearly spaced time points
t = np.linspace(1, 10,  1000)
# initial value for X: 1
Sys0 = np.array([1])

# prepare plots
fig = plt.figure(figsize=(15,5))
ax1 = fig.add_subplot(1,1,1)

# loop through various r values
while r <= 1:
    # integrate system
    X, infodict = integrate.odeint(Sys, Sys0, t, full_output=True)
    # generate multiplicative noise, sigma * x * dW
    noise = sigma * np.random.normal(0, 0.1, 1000)
    # transpose the output and add the noise
    x = X.T + noise * X.T
    # plot
    ax1.plot(x[0], label = 'r = %0.1f' % r)
    # increment r
    r += 0.2

# determine plotting specifics
ax1.grid()
ax1.legend(loc='best')
ax1.set_xlabel('time')
ax1.set_ylabel('Nutrient load')
ax1.set_title('Dynamics in time (parameter r)')

The tipping point at which the system changes its state (at which the lake switches from clear to turbid water) lies at $r = 0.6$. This value can be checked directly: at the initial value $x = h = 1$ the Hill function equals $1/2$, so the drift there is $a - b + r/2 = -0.3 + r/2$, which changes sign exactly at $r = 0.6$.

Critical slowing down

The most prominent indicator in the conception of EWS is critical slowing down: a decreasing rate of recovery from small perturbations signals the approach of a tipping point (i.e. a critical transition). The next piece of Python code generates a plot showing the above development with a small perturbation applied after equilibrium is reached (at $time$ = 500). Testing different $r$-values, the plot shows that the recovery time increases as the distance to the tipping point decreases. Here the system is modeled as a difference equation, so that we can intervene in its state during the run.

In [5]:
#critical slowing down (parameter r)
fig = plt.figure(figsize=(15,5))
ax2 = fig.add_subplot(1,1,1)

# reduce strength of noise to make differences more distinct
sigma = 0.01

x = np.linspace(400, 2000, 1600)
dt = 0.01
r = 0.2
# loop through various r values
while r < 0.6:
    P = []
    P.append(1)
    i = 0
    while i < 2000:
        # perturb only when equilibrium is reached, here after 500 timesteps
        if i == 500:
            P[i] = P[i - 1] + 0.5
        # the difference equation with noise added
        P.append(P[i] + (a - b * P[i] + r * (P[i]**p / (P[i]**p + h**p))) * dt + sigma * P[i] * np.random.normal(0, 0.1))
        i += 1
    ax2.plot(x, P[400:2000], label = 'r = ' + str(r) + ', distance to tipping point = %0.1f' % (0.6 - r))
    r += 0.1

ax2.grid()
ax2.legend(loc='best')
ax2.set_xlabel('time')
ax2.set_title('Critical slowing down (parameter r)')
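
The recovery time can also be measured directly. The following sketch (a helper written for illustration, reusing the parameters defined above) perturbs the equilibrium by 0.5 and counts the time until the state is back within a small tolerance:

def recovery_time(r, perturbation=0.5, tol=0.01, dt=0.01, max_steps=20000):
    # settle into equilibrium first
    x = 1.0
    for _ in range(2000):
        x += (a - b * x + r * (x**p / (x**p + h**p))) * dt
    x_eq = x
    # apply the perturbation and count the steps until the state is back within tol
    x += perturbation
    for step in range(max_steps):
        x += (a - b * x + r * (x**p / (x**p + h**p))) * dt
        if abs(x - x_eq) < tol:
            return step * dt
    return np.inf

for r in [0.2, 0.3, 0.4, 0.5]:
    print('r = %.1f, recovery time = %.1f' % (r, recovery_time(r)))

The printed recovery times grow as $r$ moves toward the tipping point, which is exactly the critical-slowing-down signal.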

Flickering

Another noticeable Early Warning Signal is a system's oscillation back and forth between two stable states close to a critical transition. This oscillation has been called flickering and has been observed, among other systems, in a model of lake eutrophication (Wang et al. 2012). In the following calculation we increase $r$ slowly and plot the system state at $time$ = 800. In order to show that flickering is noise-induced, we add a second curve (red) to the plot showing the same dynamics without added noise.

  • Wang, R. et al. (2012). Flickering gives early warning signals of a critical transition to a eutrophic lake state. Nature 492, p. 419-422.

In [12]:
fig = plt.figure(figsize=(15,5))
ax3 = fig.add_subplot(1,1,1)

# noise
sigma = 0.01

Y = []   # system state with noise at timestep 800
Z = []   # system state without noise at timestep 800
R = []   # r-values

dt = 0.01

# slowly increase r from 0.5 to 0.7
for r in np.linspace(0.5, 0.7, 200):
    P = [1]
    PP = [1]
    i = 0
    while i < 2000:
        # the difference equation with noise added
        P.append(P[i] + (a - b * P[i] + r * (P[i]**p / (P[i]**p + h**p))) * dt + sigma * P[i] * np.random.normal(0, 0.1))
        # same without noise
        PP.append(PP[i] + (a - b * PP[i] + r * (PP[i]**p / (PP[i]**p + h**p))) * dt)
        i += 1
    # take only the value at timestep 800
    Y.append(P[800])
    Z.append(PP[800])
    R.append(r)

# plot against the actual r-values; no manual relabeling of the x-ticks needed
ax3.plot(R, Y, label = 'Nutrient load')
ax3.plot(R, Z, 'r-', label = 'Nutrient load without noise')
ax3.grid()
ax3.legend(loc='best')
ax3.set_xlabel('r')
ax3.set_title('Flickering')

The next piece of code generates two noisy trajectories with different $r$-values, one far from the tipping point ($r$ = 0.05) and one just past it ($r$ = 0.601). The green curve (for $r$ = 0.05) oscillates but remains close to the lower equilibrium of the nutrient load at around 0.7, while the blue curve (for $r$ = 0.601) jumps repeatedly up and down between the two equilibria.

In [29]:
fig = plt.figure(figsize = (15, 5))
ax4 = plt.subplot(1,1,1)

# slightly more noise
sigma = 0.3

# two very different r-values, one far before (0.05) and one just after (0.601) the tipping point at 0.6
R = [0.05, 0.601]

color_style = ['g-','b-']
labels = ['r = 0.05','r = 0.601']

dt = 0.1

for ri, r in enumerate(R):
    X = []
    T = []
    x = 1
    for t in np.arange(0, 400, dt):
        # Euler step; note that the noise term here is additive, scaled by sigma * p,
        # which makes it strong enough to kick the system across the basin boundary
        delta_x = a - b * x + r * (x**p / (x**p + h**p)) + sigma * p * np.random.normal(0, 0.1)
        x += delta_x * dt
        X.append(x)
        T.append(t)

    ax4.plot(T, X, color_style[ri], label = labels[ri])
ax4.grid()
ax4.legend(loc='best')
ax4.set_xlabel('time')
ax4.set_title('Flickering at two distinct r-values')
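
A crude way to quantify flickering in such a run is to count how often the trajectory crosses the unstable threshold near $x = h = 1$. The helper below is a small sketch added for illustration:

def count_switches(X, threshold=1.0):
    # count sign changes of (x - threshold) along the trajectory
    signs = np.sign(np.asarray(X) - threshold)
    return int(np.sum(signs[:-1] != signs[1:]))

# applied to the last trajectory (r = 0.601) from the cell above
print('state switches:', count_switches(X))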

Increasing variance

Increasing variance (or, equivalently, standard deviation) and increasing autocorrelation of the noise-induced fluctuations have also been reported as Early Warning Signals. The following piece of code generates a plot that shows how the standard deviation increases as $r$ approaches the tipping point.

In [33]:
# using pandas for rolling window
import pandas as pd

fig = plt.figure(figsize=(15,5))
ax5 = fig.add_subplot(1,1,1)

r = 0.4

# very small noise
sigma = 0.0001

ws = 20  # size of running window
V = []   # variance
S = []   # std
R = []   # r-values

dt = 0.01

while r < 0.59:
    P = [1]
    M = []
    i = 0
    while i < 2000:
        P.append(P[i] + (a - b * P[i] + r * (P[i]**p / (P[i]**p + h**p))) * dt + sigma * P[i] * np.random.normal(0, 0.1))
        i += 1
    # transform the output into a pandas Series in order to apply a rolling window
    ppp = pd.Series(P[1000:2000])
    M.append(ppp.rolling(ws, min_periods=1).mean().values)
    V.append(np.var(M[0]))
    S.append(np.std(M[0]))
    R.append(r)
    r += 0.002

# regression-lines
#x = range(len(V))
#fitv = np.polyfit(x,V,1)
#fitv_fn = np.poly1d(fitv) # takes x and returns an estimate for y
xx = range(len(S))
fits = np.polyfit(xx,S,1)
fits_fn = np.poly1d(fits) # takes x and returns an estimate for y
    
#ax5.plot(R, V, 'go', label='variance')
#ax5.plot(R, fitv_fn(x), 'g--', label='regression line for variance')
ax5.plot(R, S, 'ro', label='standard deviation')
ax5.plot(R, fits_fn(xx), '--b', label='regression line for std')
ax5.grid()
ax5.legend(loc='upper left')
ax5.set_xlabel('r')
ax5.set_title('Increasing standard deviation')
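
The same signal can also be read from a single long run in which $r$ drifts slowly upward. The following sketch (an illustration under the same parameters, using pandas' rolling standard deviation) tracks the width of the fluctuations as the run drifts toward the tipping point:

n = 20000
dt = 0.01
rs = np.linspace(0.4, 0.59, n)   # r drifts slowly toward the tipping point
sigma = 0.01
P = [1.0]
for i in range(n - 1):
    P.append(P[i] + (a - b * P[i] + rs[i] * (P[i]**p / (P[i]**p + h**p))) * dt
             + sigma * P[i] * np.random.normal(0, 0.1))

fig = plt.figure(figsize=(15,3))
ax = fig.add_subplot(1,1,1)
ax.plot(rs, pd.Series(P).rolling(500).std())
ax.grid()
ax.set_xlabel('r')
ax.set_title('Rolling standard deviation along a run with slowly drifting r')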

Increasing Autocorrelation

Autocorrelation is the cross-correlation of a signal with itself at different points in time; it measures the similarity between observations as a function of the time lag between them. The following two pieces of code consider a time lag of 1, showing an increasing self-similarity of system states when approaching the tipping point. While the first piece simply plots autocorrelation values against $r$, the second piece visualizes the autocorrelation of the data points for four distinct values of $r$: each plot positions the data points on the $x$-axis against their values one step later on the $y$-axis.
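
Written out as a small helper (for illustration only; the cells below compute the same quantity inline with np.corrcoef), the lag-1 autocorrelation is simply the Pearson correlation of a series with itself shifted by one step:

def lag1_autocorr(x):
    # Pearson correlation of the series with itself shifted by one time step
    x = np.asarray(x)
    return np.corrcoef(x[:-1], x[1:])[0, 1]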

In [38]:
fig = plt.figure(figsize=(15,5))
ax6 = fig.add_subplot(1,1,1)

ws = 5  # size of running window
A = []   # autocorrelation
R = []
One = []
r = 0.4

# noise
sigma = 0.01

# loop through various r values
while r <= 0.6:
    M = []    
    # integrate system (refers to first piece of code)
    X, infodict = integrate.odeint(Sys, Sys0, t, full_output=True)
    # generate multiplicative noise, sigma * x * dW
    noise = sigma * np.random.normal(0, 0.1, 1000)
    # transpose the output and add the noise
    x = X.T + noise * X.T
    # consider data only after reaching equilibrium
    ppp = pd.Series(x[0][800:1000])
    M.append(ppp.rolling(ws, min_periods=1).mean().values)
    # use numpy's corrcoef-function 
    A.append(np.corrcoef(M[0][:-1],M[0][1:])[1,0])
    R.append(r)
    One.append(1)
    # increment r
    r += 0.002
    
# regression-line
x = range(len(A))
fit = np.polyfit(x,A,1)
fit_fn = np.poly1d(fit) # takes x and returns an estimate for y
    
ax6.plot(R, A, 'ro', label='autocorrelation') 
ax6.plot(R, fit_fn(x), '--b', label='regression line for autocorr')
ax6.plot(R, One, 'g-')
ax6.grid()
ax6.legend(loc='upper left')
ax6.set_ylim(0.8, 1.05)
ax6.set_xlabel('r')
ax6.set_title('Auto-correlation at lag-1')
In [39]:
import matplotlib.gridspec as gridspec

fig = plt.figure(figsize = (15, 15))
gs = gridspec.GridSpec(3, 2)
plot_coor = [[1,0],[1,1],[2,0],[2,1]]

#parameters
R = [0.4,0.45,0.5,0.59]
color_style = ['bo','go','ro','co']
ws = 10  # size of running window

for ri, r in enumerate(R):
    M = []    
    # integrate system
    X, infodict = integrate.odeint(Sys, Sys0, t, full_output=True)
    # generate multiplicative noise, sigma * x * dW
    noise = sigma * np.random.normal(0, 0.1, 1000)
    # transpose the output and add the noise
    x = X.T + noise * X.T

    ppp = pd.Series(x[0][800:1000])
    M.append(ppp.rolling(ws, min_periods=1).mean().values)
    autocorr = np.corrcoef(M[0][:-1],M[0][1:])[1,0]
    
    ax7 = plt.subplot(gs[plot_coor[ri][0],plot_coor[ri][1]])
    ax7.plot(M[0][0:-1], M[0][1:], color_style[ri], label = 'r = %s, autocorr. = %.3f' % (r, autocorr))
    ax7.set_xlabel('x(t)', fontsize = 13)
    ax7.set_ylabel('x(t+1)', fontsize = 13)
    ax7.grid()
    ax7.legend(loc=2)

Skewness and Kurtosis

Two more phenomena observed when approaching a critical transition, and thus suggested as Early Warning Signals, are changes in the skewness and kurtosis of the distribution of the noisy system states. Skewness indicates asymmetry in the distribution, with a negative skew indicating a right-sided concentration (a longer tail on the left side) and a positive skew the opposite. Kurtosis is a measure of the "peakedness" of the distribution, with positive kurtosis indicating a peak higher than that of a normal distribution and negative kurtosis a lower peak. The following code generates plots showing the development of skewness and kurtosis as $r$ approaches the tipping point, plus two histograms showing the distributions of system states at different values of $r$.
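
As a quick check of the sign conventions (scipy uses the Fisher definition, under which a normal distribution has skewness 0 and kurtosis 0), consider a right-skewed example:

from scipy import stats
sample = np.random.exponential(size=10000)  # long tail on the right
print(stats.skew(sample))      # positive, close to the theoretical value of 2
print(stats.kurtosis(sample))  # positive excess kurtosis, close to 6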

In [40]:
import scipy as sp
from scipy import stats

# in a normal distribution, skewness is 0 and (excess) kurtosis is 0

# prepare plots
fig = plt.figure(figsize=(15,10))
ax8 = fig.add_subplot(2,2,1)
ax9 = fig.add_subplot(2,2,2)
ax10 = fig.add_subplot(2,2,3)
ax11 = fig.add_subplot(2,2,4)

r = 0.4
R = []
S = []
K = []

# determine strength of noise
sigma = 0.01

# loop through various r values
while r <= 0.6:
    M = []
    # integrate system
    X, infodict = integrate.odeint(Sys, Sys0, t, full_output=True)
    # generate multiplicative noise, sigma * x * dW
    noise = sigma * np.random.normal(0, 0.1, 1000)
    # transpose the output and add the noise
    x = X.T + noise * X.T
    # keep the raw series at r = 0.4 and at r = 0.599 for the histograms below
    if r >= 0.4 and r <= 0.401:
        x1 = x[0][800:1000]
    if r >= 0.598 and r <= 0.599:
        x2 = x[0][800:1000]
    # doesn't need rolling window
    # use scipy to compute skewness and kurtosis
    S.append(sp.stats.skew(x[0][800:1000]))
    K.append(sp.stats.kurtosis(x[0][800:1000]))
    R.append(r)
    # increment r
    r += 0.001

# regression-lines
x = range(len(S))
fits = np.polyfit(x,S,1)
fit_fns = np.poly1d(fits)
fitk = np.polyfit(x,K,1)
fit_fnk = np.poly1d(fitk)

# plot timelines 
ax8.plot(R, S, 'ro') 
ax8.plot(R, fit_fns(x), '--b')
ax8.grid()
ax8.set_xlabel('r')
ax8.set_title('Development of skewness in relation to r')

ax9.plot(R, K, 'ro') 
ax9.plot(R, fit_fnk(x), '--b')
ax9.grid()
ax9.set_xlabel('r')
ax9.set_title('Development of kurtosis in relation to r')

# show distributions at different r-values 
# histogram
n1, bins1, patches1 = ax10.hist(x1, 50, density=True)
# add a normal-distribution line with the same mean and std
y1 = stats.norm.pdf(bins1, np.mean(x1), np.std(x1))
ax10.plot(bins1, y1, 'r--', linewidth=3, label = 'Normal distribution')
# add a best fit line - polynomial of degree 4
p1 = np.polyfit(bins1[:-1], n1, 4)
fitted1 = np.polyval(p1, bins1[:-1])
ax10.plot(bins1[:-1], fitted1, 'c--', linewidth=3, label='Fitted distribution')
ax10.set_title('Distribution at r = 0.4')
# use scipy.stats.describe to calculate (length, (min, max), mean, var, skewness, kurtosis)
d1 = sp.stats.describe(x1)
ax10.plot(d1[2], 0, 'w.', label = 'Skewness = ' + str(d1[4]))
ax10.plot(d1[2], 0, 'w.', label = 'Kurtosis = ' + str(d1[5]))
# use scipy.stats.kurtosistest: the second returned value is the p-value of a test of
# the null hypothesis that the sample's kurtosis is that of a normal distribution
pn1 = sp.stats.kurtosistest(x1)
ax10.plot(d1[2], 0, 'w.', label = 'P-value = ' + str(pn1[1]))
ax10.legend(loc='lower center', bbox_to_anchor=(0.5, -0.5))

# histogram
n2, bins2, patches2 = ax11.hist(x2, 50, density=True)
# add a normal-distribution line with the same mean and std
y2 = stats.norm.pdf(bins2, np.mean(x2), np.std(x2))
ax11.plot(bins2, y2, 'r--', linewidth=3, label = 'Normal distribution')
# add a best fit line - polynomial of degree 4
p2 = np.polyfit(bins2[:-1], n2, 4)
fitted2 = np.polyval(p2, bins2[:-1])
ax11.plot(bins2[:-1], fitted2, 'c--', linewidth=3, label='Fitted distribution')
ax11.set_title('Distribution at r = 0.599')
# use scipy.stats.describe to calculate (length, (min, max), mean, var, skewness, kurtosis)
d2 = sp.stats.describe(x2)
ax11.plot(d2[2], 0, 'w.', label = 'Skewness = ' + str(d2[4]))
ax11.plot(d2[2], 0, 'w.', label = 'Kurtosis = ' + str(d2[5]))
# use scipy.stats.kurtosistest: the second returned value is the p-value of a test of
# the null hypothesis that the sample's kurtosis is that of a normal distribution
pn2 = sp.stats.kurtosistest(x2)
ax11.plot(d2[2], 0, 'w.', label = 'P-value = ' + str(pn2[1]))
ax11.legend(loc='lower center', bbox_to_anchor=(0.5, -0.5))