Iterative rational Krylov algorithm

The iterative rational Krylov algorithm (IRKA) is an iterative algorithm useful for model order reduction (MOR) of single-input single-output (SISO) linear time-invariant dynamical systems. At each iteration, IRKA performs an Hermite-type interpolation of the original system transfer function. Each interpolation requires solving r shifted pairs of linear systems, each of size n \times n, where n is the original system order and r is the desired reduced model order (usually r \ll n).

The algorithm was first introduced by Gugercin, Antoulas and Beattie in 2008. It is based on a first-order necessary optimality condition, initially investigated by Meier and Luenberger in 1967. The first convergence proof of IRKA was given by Flagg, Beattie and Gugercin in 2012, for a particular kind of system.


MOR as an optimization problem

Consider a SISO linear time-invariant dynamical system, with input v(t) and output y(t):

: \begin{cases} \dot{x}(t) = A x(t) + b v(t) \\ y(t) = c^T x(t) \end{cases} \qquad A \in \mathbb{R}^{n \times n}, \, b, c \in \mathbb{R}^n, \, v(t), y(t) \in \mathbb{R}, \, x(t) \in \mathbb{R}^n.

Applying the Laplace transform, with zero initial conditions, we obtain the transfer function G, which is a fraction of polynomials:

: G(s) = c^T (sI - A)^{-1} b, \quad A \in \mathbb{R}^{n \times n}, \, b, c \in \mathbb{R}^n.

Assume G is stable. Given r < n, MOR tries to approximate the transfer function G by a stable rational transfer function G_r of order r:

: G_r(s) = c_r^T (sI_r - A_r)^{-1} b_r, \quad A_r \in \mathbb{R}^{r \times r}, \, b_r, c_r \in \mathbb{R}^r.

A possible approximation criterion is to minimize the absolute error in the H_2 norm:

: G_r \in \underset{\hat{G} \text{ stable, of order } r}{\operatorname{arg\,min}} \, \| G - \hat{G} \|_{H_2}, \quad \| G \|_{H_2}^2 := \frac{1}{2\pi} \int\limits_{-\infty}^{+\infty} | G(ja) |^2 \, da.

This is known as the H_2 optimization problem. This problem has been studied extensively, and it is known to be non-convex, which implies that it is usually difficult to find a global minimizer.
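For concreteness, the H_2 norm above can be evaluated numerically through the controllability Gramian: when A is stable, \| G \|_{H_2}^2 = c^T P c, where P solves the Lyapunov equation A P + P A^T + b b^T = 0. The following Python sketch is purely illustrative and is not taken from the sources cited here; the function names and the toy system are assumptions, and NumPy/SciPy are assumed to be available.

    import numpy as np
    from scipy.linalg import solve_continuous_lyapunov

    def transfer_function(A, b, c, s):
        """Evaluate G(s) = c^T (sI - A)^{-1} b for a scalar s."""
        n = A.shape[0]
        return c @ np.linalg.solve(s * np.eye(n) - A, b)

    def h2_norm(A, b, c):
        """||G||_{H_2} via the controllability Gramian; A must be stable."""
        P = solve_continuous_lyapunov(A, -np.outer(b, b))  # solves A P + P A^T = -b b^T
        return np.sqrt(c @ P @ c)

    # Toy stable SISO system (illustrative only).
    rng = np.random.default_rng(0)
    n = 100
    A = rng.standard_normal((n, n)) - n * np.eye(n)  # diagonal shift makes A stable
    b = rng.standard_normal(n)
    c = rng.standard_normal(n)
    print(transfer_function(A, b, c, 1j), h2_norm(A, b, c))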


Meier–Luenberger conditions

The following first-order necessary optimality condition for the H_2 problem is of great importance for the IRKA algorithm. If G_r is a local minimizer of \| G - G_r \|_{H_2} with simple poles \lambda_1(A_r), \ldots, \lambda_r(A_r), then G_r must be an Hermite interpolant of G at the mirror images of its own poles:

: G_r(-\lambda_i(A_r)) = G(-\lambda_i(A_r)), \quad G_r'(-\lambda_i(A_r)) = G'(-\lambda_i(A_r)), \quad \forall \, i = 1, \ldots, r.

Note that the poles \lambda_i(A_r) are the eigenvalues of the reduced r \times r matrix A_r.


Hermite interpolation

An Hermite interpolant G_r of the rational function G, through r distinct points \sigma_1, \ldots, \sigma_r \in \mathbb{C}, has components:

: A_r = W_r^* A V_r, \quad b_r = W_r^* b, \quad c_r = V_r^* c, \quad A_r \in \mathbb{C}^{r \times r}, \, b_r \in \mathbb{C}^r, \, c_r \in \mathbb{C}^r;

where the matrices V_r = ( v_1 \mid \ldots \mid v_r ) \in \mathbb{C}^{n \times r} and W_r = ( w_1 \mid \ldots \mid w_r ) \in \mathbb{C}^{n \times r} may be found by solving r dual pairs of linear systems, one for each shift [Theorem 1.1]:

: (\sigma_i I - A) v_i = b, \quad (\sigma_i I - A)^* w_i = c, \quad \forall \, i = 1, \ldots, r.
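As a concrete illustration of this construction, the following Python sketch (not from the cited references; the function name hermite_interpolant is an assumption) assembles V_r, W_r and the reduced matrices with dense solves. The factor (W_r^* V_r)^{-1} accounts for bases that are not bi-orthonormalized; when W_r^* V_r = I_r it reduces to the formulas above.

    import numpy as np

    def hermite_interpolant(A, b, c, shifts):
        """Build an order-r Hermite interpolant of G at the given shifts.

        Solves (sigma_i I - A) v_i = b and (sigma_i I - A)^* w_i = c for each shift,
        stacks the solutions into V_r and W_r, and projects the full-order system.
        """
        n = A.shape[0]
        I = np.eye(n)
        V = np.column_stack([np.linalg.solve(s * I - A, b) for s in shifts])
        W = np.column_stack([np.linalg.solve((s * I - A).conj().T, c) for s in shifts])
        E = W.conj().T @ V                            # W_r^* V_r
        A_r = np.linalg.solve(E, W.conj().T @ A @ V)  # (W_r^* V_r)^{-1} W_r^* A V_r
        b_r = np.linalg.solve(E, W.conj().T @ b)
        c_r = V.conj().T @ c
        return A_r, b_r, c_r, V, W

When the shifts are closed under conjugation, the complex columns of V_r and W_r can be combined into a real basis; this step is omitted here for brevity.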


IRKA algorithm

As can be seen from the previous section, finding an Hermite interpolant G_r of G through r given points is relatively easy. The difficult part is to find the correct interpolation points. IRKA tries to iteratively approximate these "optimal" interpolation points.

For this, it starts with r arbitrary interpolation points (closed under conjugation), and then, at each iteration m, it imposes the first-order necessary optimality condition of the H_2 problem:

1. find the Hermite interpolant G_r of G, through the current r shift points: \sigma_1^m, \ldots, \sigma_r^m.

2. update the shifts by using the poles of the new G_r: \sigma_i^{m+1} = -\lambda_i(A_r), \, \forall \, i = 1, \ldots, r.

The iteration is stopped when the relative change in the set of shifts of two successive iterations is less than a given tolerance. This condition may be stated as:

: \frac{ | \sigma_i^{m+1} - \sigma_i^m | }{ | \sigma_i^m | } < \text{tol}, \, \forall \, i = 1, \ldots, r.

As already mentioned, each Hermite interpolation requires solving r shifted pairs of linear systems, each of size n \times n:

: (\sigma_i^m I - A) v_i = b, \quad (\sigma_i^m I - A)^* w_i = c, \quad \forall \, i = 1, \ldots, r.

Also, updating the shifts requires finding the r poles of the new interpolant G_r, that is, finding the r eigenvalues of the reduced r \times r matrix A_r.


Pseudocode

The following is a pseudocode for the IRKA algorithm [Algorithm 4.1].

algorithm IRKA
    input: A, b, c, \text{tol} > 0, \sigma_1, \ldots, \sigma_r closed under conjugation
    (\sigma_i I - A) v_i = b, \, \forall \, i = 1, \ldots, r        % Solve primal systems
    (\sigma_i I - A)^* w_i = c, \, \forall \, i = 1, \ldots, r      % Solve dual systems
    while relative change in \{\sigma_i\} > \text{tol}
        A_r = W_r^* A V_r                                           % Reduced order matrix
        \sigma_i = -\lambda_i(A_r), \, \forall \, i = 1, \ldots, r  % Update shifts, using poles of G_r
        (\sigma_i I - A) v_i = b, \, \forall \, i = 1, \ldots, r    % Solve primal systems
        (\sigma_i I - A)^* w_i = c, \, \forall \, i = 1, \ldots, r  % Solve dual systems
    end while
    return A_r = W_r^* A V_r, \, b_r = W_r^* b, \, c_r^T = c^T V_r  % Reduced order model
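A hedged Python sketch of this loop is given below. It is not a reference implementation: dense NumPy solves stand in for the sparse or iterative solvers used in practice for the shifted systems, the starting shifts and function names are assumptions, and pairing shifts between iterations by sorting is only a simple heuristic.

    import numpy as np

    def irka(A, b, c, shifts, tol=1e-6, max_iter=100):
        """Reduce the SISO system (A, b, c) to order r = len(shifts)."""
        n = A.shape[0]
        I = np.eye(n)

        def solve_bases(sigmas):
            # Primal systems (sigma_i I - A) v_i = b and dual systems (sigma_i I - A)^* w_i = c.
            V = np.column_stack([np.linalg.solve(s * I - A, b) for s in sigmas])
            W = np.column_stack([np.linalg.solve((s * I - A).conj().T, c) for s in sigmas])
            return V, W

        shifts = np.sort_complex(np.asarray(shifts, dtype=complex))
        V, W = solve_bases(shifts)
        for _ in range(max_iter):
            E = W.conj().T @ V                            # W_r^* V_r (identity after bi-orthonormalization)
            A_r = np.linalg.solve(E, W.conj().T @ A @ V)  # reduced-order matrix
            new_shifts = np.sort_complex(-np.linalg.eigvals(A_r))  # mirrored reduced poles
            change = np.max(np.abs(new_shifts - shifts) / np.abs(shifts))
            shifts = new_shifts
            V, W = solve_bases(shifts)                    # re-solve primal and dual systems
            if change < tol:
                break
        E = W.conj().T @ V
        A_r = np.linalg.solve(E, W.conj().T @ A @ V)
        b_r = np.linalg.solve(E, W.conj().T @ b)
        c_r = V.conj().T @ c
        return A_r, b_r, c_r

At convergence, the returned reduced model satisfies, up to the chosen tolerance, the Meier–Luenberger interpolation conditions at the mirrored reduced poles.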


Convergence

A SISO linear system is said to have a symmetric state space (SSS) whenever A = A^T and b = c. This type of system appears in many important applications, such as in the analysis of RC circuits and in inverse problems involving 3D Maxwell's equations. For SSS systems with distinct poles, the following convergence result has been proven: "IRKA is a locally convergent fixed point iteration to a local minimizer of the H_2 optimization problem."

Although there is no convergence proof for the general case, numerous experiments have shown that IRKA often converges rapidly for different kinds of linear dynamical systems.


Extensions

The IRKA algorithm has been extended by the original authors to multiple-input multiple-output (MIMO) systems, and also to discrete-time and differential-algebraic systems [Remark 4.1].


See also

Model order reduction




External links


Model Order Reduction Wiki