
Linear Multistep methods

A linear multistep method computes the next solution value from the values at several previous steps:

$$ \alpha_k y_{n+k} + \alpha_{k-1} y_{n+k-1} + \cdots + \alpha_0 y_n = h ( \beta_k f_{n+k} + \cdots + \beta_0 f_n ) $$

Note that different conventions for numbering the coefficients exist; the above form is used in NodePy. Methods are automatically normalized so that \(\alpha_k=1\).
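
For example, the explicit two-step Adams-Bashforth method corresponds to \(k=2\), \(\alpha = (0, -1, 1)\), \(\beta = (-1/2, 3/2, 0)\):

$$ y_{n+2} - y_{n+1} = h \left( \tfrac{3}{2} f_{n+1} - \tfrac{1}{2} f_n \right) $$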

Examples:

>>> import nodepy.linear_multistep_method as lm
>>> ab3=lm.Adams_Bashforth(3)
>>> ab3.order()
3
>>> bdf2=lm.backward_difference_formula(2)
>>> bdf2.order()
2
>>> bdf2.is_zero_stable()
True
>>> bdf7=lm.backward_difference_formula(7)
>>> bdf7.is_zero_stable()
False
>>> bdf3=lm.backward_difference_formula(3)
>>> bdf3.A_alpha_stability()
86
>>> ssp32=lm.elm_ssp2(3)
>>> ssp32.order()
2
>>> ssp32.ssp_coefficient()
1/2
>>> ssp32.plot_stability_region() 
<Figure...

Instantiation

The following functions return linear multistep methods of some common types:

  • Adams-Bashforth methods: Adams_Bashforth(k)

  • Adams-Moulton methods: Adams_Moulton(k)

  • Backward difference formulas: backward_difference_formula(k)

  • Optimal explicit SSP methods of order 2: elm_ssp2(k)

In each case, the argument \(k\) specifies the number of steps in the method. Methods can be generated for arbitrary \(k\), but currently roundoff errors cause significant inaccuracy in the coefficients for large \(k\); this becomes noticeable at about 7 steps. However, members of these families with many steps do not have good properties in any case.
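
As a quick illustration of the constructors (a sketch; the printed orders follow from the order-\(k\) property of the Adams-Bashforth family described below):

>>> import nodepy.linear_multistep_method as lm
>>> for k in range(1, 6):
...     print(k, lm.Adams_Bashforth(k).order())
1 1
2 2
3 3
4 4
5 5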

More generally, a linear multistep method can be instantiated by specifying its coefficients \(\alpha, \beta\):

>>> from nodepy import linear_multistep_method as lmm
>>> alpha = [-1, 1]   # coefficients of y_n, y_{n+1} (here: backward Euler)
>>> beta = [0, 1]     # coefficients of f_n, f_{n+1}
>>> my_lmm = lmm.LinearMultistepMethod(alpha, beta)

Adams-Bashforth Methods

linear_multistep_method.Adams_Bashforth()

Construct the \(k\)-step Adams-Bashforth method. The methods are explicit and have order \(k\). They have the form:

$$ y_{n+1} = y_n + h \sum_{j=0}^{k-1} \beta_j f(y_{n-k+j+1}) $$

They are generated using equations (1.5) and (1.7) from [HNorW93] III.1, along with the binomial expansion.

Examples:

>>> import nodepy.linear_multistep_method as lm
>>> ab3=lm.Adams_Bashforth(3)
>>> print(ab3)
3-step Adams-Bashforth
Explicit 3-step method of order 3
alpha = [ 0      0      -1     1     ]
beta  = [ 5/12   -4/3   23/12  0     ]

>>> ab3.order()
3
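
Reading off the printed coefficients, the three-step method is

$$ y_{n+3} = y_{n+2} + h \left( \tfrac{23}{12} f_{n+2} - \tfrac{4}{3} f_{n+1} + \tfrac{5}{12} f_n \right) $$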

Reference: [HNorW93]

Adams-Moulton Methods

linear_multistep_method.Adams_Moulton()

Construct the \(k\)-step Adams-Moulton method. The methods are implicit and have order \(k+1\). They have the form:

$$ y_{n+1} = y_n + h \sum_{j=0}^{k} \beta_j f(y_{n-k+j+1}) $$

They are generated using equation (1.9) and the equation in Exercise 3 from [HNorW93] III.1, along with the binomial expansion.
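
For example, the three-step method (order 4) is

$$ y_{n+3} = y_{n+2} + h \left( \tfrac{3}{8} f_{n+3} + \tfrac{19}{24} f_{n+2} - \tfrac{5}{24} f_{n+1} + \tfrac{1}{24} f_n \right) $$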

Examples:

>>> import nodepy.linear_multistep_method as lm
>>> am3=lm.Adams_Moulton(3)
>>> am3.order()
4

Reference: [HNorW93]

Backward-difference formulas

linear_multistep_method.backward_difference_formula()

Construct the \(k\)-step backward differentiation method. The methods are implicit and have order \(k\).

They have the form:

$$ \sum_{j=0}^{k} \alpha_j y_{n+j} = h \beta_k f(y_{n+k}) $$

They are generated using equation (1.22’) from [HNorW93] III.1, along with the binomial expansion.
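
For example, the two-step method (BDF2), normalized so that \(\alpha_k = 1\), is

$$ y_{n+2} - \tfrac{4}{3} y_{n+1} + \tfrac{1}{3} y_n = \tfrac{2}{3} h f(y_{n+2}) $$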

Examples:

>>> import nodepy.linear_multistep_method as lm
>>> bdf4=lm.backward_difference_formula(4)
>>> bdf4.A_alpha_stability()
73

Reference: [HNorW93] pp. 364-365

Optimal Explicit SSP methods

linear_multistep_method.elm_ssp2()

Returns the optimal SSP \(k\)-step linear multistep method of order 2.
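
These methods have SSP coefficient \((k-2)/(k-1)\); for instance, \(1/2\) for \(k=3\) and \(8/9\) for \(k=10\), consistent with the examples above and below.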

Examples:

>>> import nodepy.linear_multistep_method as lm
>>> lm10=lm.elm_ssp2(10)
>>> lm10.ssp_coefficient()
8/9

Stability

Characteristic Polynomials

LinearMultistepMethod.characteristic_polynomials()[source]

Returns the characteristic polynomials (also known as generating polynomials) of a linear multistep method. They are:

$$ \rho(z) = \sum_{j=0}^k \alpha_j z^j $$

$$ \sigma(z) = \sum_{j=0}^k \beta_j z^j $$
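
For example, for the three-step Adams-Bashforth method shown above, \(\rho(z) = z^3 - z^2\) and \(\sigma(z) = \tfrac{23}{12} z^2 - \tfrac{4}{3} z + \tfrac{5}{12}\).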

Examples:

>>> from nodepy import lm
>>> ab5 = lm.Adams_Bashforth(5)
>>> rho,sigma = ab5.characteristic_polynomials()
>>> print(rho)
   5     4
1 x - 1 x

>>> print(sigma)
      4         3         2
2.64 x - 3.853 x + 3.633 x - 1.769 x + 0.3486

Reference: [HNorW93] p. 370, eq. 2.4

Plotting The Stability Region

LinearMultistepMethod.plot_stability_region(N=100, bounds=None, color='r', filled=True, alpha=1.0, to_file=False, longtitle=False)[source]

The region of absolute stability of a linear multistep method is the set

$$ \{ z \in \mathbb{C} \mid \rho(\zeta) - z \sigma(\zeta) \text{ satisfies the root condition} \} $$

where \(\rho(\zeta)\) and \(\sigma(\zeta)\) are the characteristic polynomials of the method.

Also plots the boundary locus, which is given by the set of points z:

$$ \{z \mid z = \rho(\exp(\imath \theta)) / \sigma(\exp(\imath\theta)), 0 \le \theta \le 2\pi \} $$

Here \(\rho\) and \(\sigma\) are the characteristic polynomials of the method.

Reference: [LeV07] section 7.6.1

Input: (all optional)
  • N – Number of gridpoints to use in each direction

  • bounds – limits of plotting region

  • color – color to use for this plot

  • filled – if true, stability region is filled in (solid); otherwise it is outlined
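
As an illustration of the boundary-locus formula above, the locus can also be traced directly from the characteristic polynomials. The following is a minimal sketch, not part of the NodePy API; it assumes matplotlib is available and that characteristic_polynomials() returns numpy poly1d objects (as the printed output above suggests):

>>> import numpy as np
>>> import matplotlib.pyplot as plt
>>> import nodepy.linear_multistep_method as lm
>>> ab3 = lm.Adams_Bashforth(3)
>>> rho, sigma = ab3.characteristic_polynomials()
>>> theta = np.linspace(0, 2*np.pi, 500)
>>> zeta = np.exp(1j*theta)            # points on the unit circle
>>> z = rho(zeta)/sigma(zeta)          # boundary locus z(theta)
>>> lines = plt.plot(z.real, z.imag)   # outline of the stability region boundary
>>> plt.show()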