
Decoupling coupled linear ODEs by constant linear combinations

Coupled systems of linear second-order ordinary differential equations appear everywhere in mathematical physics: coupled oscillators, multi-component wave equations, small perturbations in gravity, and so on. A very natural question is:

When can a coupled system be transformed into independent scalar equations by taking constant linear combinations of the original unknown functions?

Mathematically, the key idea is that decoupling is equivalent to simultaneous diagonalization of a matrix-valued coefficient function $Q(x)$ by a constant change of basis.


Consider $n$ scalar functions $y_i(x)$ and a coupled linear system of second-order ODEs of the form

$$y_i''(x) + \sum_{j=1}^n Q_{ij}(x)\,y_j(x) = 0, \qquad i = 1,\dots,n.$$

Introduce the vector

$$\mathbf y(x) = \begin{pmatrix} y_1(x)\\ \vdots\\ y_n(x) \end{pmatrix},$$

and the $n\times n$ matrix

$$Q(x) = \bigl(Q_{ij}(x)\bigr)_{1\le i,j \le n}.$$

Then the system can be written compactly as

$$\mathbf y''(x) + Q(x)\,\mathbf y(x) = 0.$$

We will look for a constant invertible matrix $S\in GL(n)$ and new unknowns

$$\mathbf z(x) = S^{-1}\mathbf y(x),$$

such that in the $\mathbf z$-variables the system is diagonal:

$$z_i''(x) + \lambda_i(x)\,z_i(x) = 0, \qquad i = 1,\dots,n.$$

Since $S$ is constant, $\mathbf y'' = S\mathbf z''$, and the equation becomes

$$S\mathbf z''(x) + Q(x)S\,\mathbf z(x) = 0,$$

or equivalently

$$\mathbf z''(x) + S^{-1}Q(x)S\,\mathbf z(x) = 0.$$

Thus:

The system is decoupled in the $\mathbf z$-variables if and only if the matrix $S^{-1}Q(x)S$ is diagonal for all $x$.

Equivalently, there exists a constant basis in which $Q(x)$ is always diagonal. This is what we mean by simultaneous diagonalization of the family of matrices $\{Q(x)\}$ by a single constant matrix $S$.
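This equivalence is easy to check numerically. The sketch below (with an arbitrary invertible $S$ and arbitrary eigenvalue functions, both illustrative choices, not data from the text) builds a family $Q(x) = S\,\Lambda(x)\,S^{-1}$ and confirms that the single constant matrix $S$ diagonalizes $Q(x)$ at every sample point:

```python
import numpy as np

# Arbitrary constant change of basis (any invertible S would do).
S = np.array([[1.0, 1.0],
              [1.0, -2.0]])
S_inv = np.linalg.inv(S)

# Arbitrary eigenvalue functions lambda_i(x), chosen for illustration.
def Lambda(x):
    return np.diag([np.cos(x), 1.0 + x**2])

# Family Q(x) = S Lambda(x) S^{-1}: diagonal in one fixed constant basis.
def Q(x):
    return S @ Lambda(x) @ S_inv

# Conjugating by the constant S diagonalizes Q(x) for every x.
for x in [0.0, 0.5, 1.3]:
    D = S_inv @ Q(x) @ S
    off_diag = D - np.diag(np.diag(D))
    assert np.allclose(off_diag, 0.0)
```

Of course, the interesting question runs the other way: given the coefficients $Q_{ij}(x)$, decide whether such an $S$ exists at all.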


Let us now specialize to $n=2$ and work everything out explicitly.

Consider

$$\begin{aligned} y_1''(x) + q_1(x)y_1(x) + r_1(x)y_2(x) &= 0,\\ y_2''(x) + q_2(x)y_1(x) + r_2(x)y_2(x) &= 0, \end{aligned}$$

where $q_1,q_2,r_1,r_2$ are given functions of $x$.

In matrix form,

$$\mathbf y''(x) + Q(x)\,\mathbf y(x) = 0,$$

with

$$\mathbf y(x) = \begin{pmatrix} y_1(x)\\ y_2(x) \end{pmatrix}, \qquad Q(x) = \begin{pmatrix} q_1(x) & r_1(x)\\ q_2(x) & r_2(x) \end{pmatrix}.$$

We want to find constant linear combinations of $y_1,y_2$ that decouple this system.

2.2 Decoupling via a constant change of basis


Suppose there is a constant invertible matrix $S$ such that

$$\mathbf z = S^{-1}\mathbf y$$

satisfies

$$z_1''(x) + \lambda_1(x)\,z_1(x) = 0,\qquad z_2''(x) + \lambda_2(x)\,z_2(x) = 0.$$

As explained above, this is equivalent to the existence of a constant $S$ such that

$$S^{-1}Q(x)S = \begin{pmatrix} \lambda_1(x) & 0\\ 0 & \lambda_2(x) \end{pmatrix} \quad\text{for all }x.$$

Thus $\{Q(x)\}$ must have an $x$-independent eigenbasis: there must exist two constant vectors $v_1,v_2$ and scalar functions $\lambda_1(x),\lambda_2(x)$ such that

$$Q(x)v_i = \lambda_i(x)\,v_i,\qquad i=1,2,\ \text{for all }x.$$

We now derive explicit conditions on the scalar functions $q_1,q_2,r_1,r_2$ that guarantee this.

2.3 Method of constant linear combinations


Instead of working directly with eigenvectors, let us take a more “ODE-centric” view. We search for combinations of the form

$$u(x) = y_1(x) + \kappa\,y_2(x),$$

with $\kappa$ constant, such that $u(x)$ satisfies a decoupled equation

$$u''(x) + \lambda(x)\,u(x) = 0,$$

for some coefficient function $\lambda(x)$.

From the original system we have

$$\begin{aligned} y_1'' &= -q_1 y_1 - r_1 y_2,\\ y_2'' &= -q_2 y_1 - r_2 y_2. \end{aligned}$$

Differentiate $u$ twice:

$$u'' = y_1'' + \kappa y_2'' = -(q_1+\kappa q_2)y_1 - (r_1+\kappa r_2)y_2.$$

We want $u'' + \lambda u = 0$, i.e.

$$-(q_1+\kappa q_2)y_1 - (r_1+\kappa r_2)y_2 + \lambda(y_1+\kappa y_2) = 0$$

for all solutions $y_1,y_2$. Since solutions can take arbitrary values at any given point, this must hold identically in $y_1,y_2$; equating coefficients of $y_1$ and $y_2$ gives

$$\begin{aligned} \lambda &= q_1 + \kappa q_2,\\ \lambda\kappa &= r_1 + \kappa r_2. \end{aligned}$$

Eliminating $\lambda$ yields

$$(q_1 + \kappa q_2)\,\kappa = r_1 + \kappa r_2,$$

or

$$q_2(x)\,\kappa^2 + \bigl(q_1(x)-r_2(x)\bigr)\,\kappa - r_1(x) = 0.$$

This is a quadratic equation in $\kappa$ whose coefficients are functions of $x$. For a given choice of $\kappa$, this identity must hold for all $x$. For decoupling, we need two distinct constant solutions $\kappa_1,\kappa_2$ (one for each independent scalar mode).

That is only possible if the three coefficient functions of the quadratic are proportional, with constant ratios, as functions of $x$. Assume $q_2(x)\not\equiv 0$ on the interval of interest. Then the condition is:

There exist constants $\alpha,\beta$ such that

$$\frac{r_1(x)}{q_2(x)} = \alpha,\qquad \frac{q_1(x)-r_2(x)}{q_2(x)} = \beta,$$

for all $x$ where $q_2(x)\neq 0$.

Equivalently,

$$r_1(x) = \alpha\,q_2(x),\qquad q_1(x) - r_2(x) = \beta\,q_2(x).$$

Under these conditions, the quadratic simplifies to

$$q_2(x)\left(\kappa^2 + \beta\kappa - \alpha\right) = 0.$$

The roots of $\kappa^2 + \beta\kappa - \alpha = 0$,

$$\kappa_{1,2} = \frac{-\beta\pm\sqrt{\beta^2+4\alpha}}{2},$$

are constants, independent of $x$. Thus we obtain two constant linear combinations

$$u_1 = y_1 + \kappa_1 y_2,\qquad u_2 = y_1 + \kappa_2 y_2,$$

each of which satisfies a decoupled second-order ODE

$$u_i''(x) + \lambda_i(x)\,u_i(x) = 0,$$

with $\lambda_i(x) = q_1(x) + \kappa_i q_2(x)$.

Special cases like $q_2\equiv 0$ can be treated similarly by swapping the roles of $y_1$ and $y_2$ (or equivalently interchanging $q_2\leftrightarrow r_1$).
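As a sanity check, the recipe can be run numerically. In the sketch below, $q_2$ and $r_2$ are arbitrary illustrative functions, and $q_1$, $r_1$ are then built to satisfy $r_1 = \alpha q_2$ and $q_1 - r_2 = \beta q_2$; the computed roots $\kappa_{1,2}$ come out constant, and the coefficient identity $\lambda\kappa = r_1 + \kappa r_2$ holds at every sample point:

```python
import math

alpha, beta = 2.0, 1.0  # illustrative constants

# q2, r2 arbitrary; q1, r1 constructed to satisfy the two ratio conditions.
def q2(x): return 1.0 + x**2
def r2(x): return math.sin(x)
def q1(x): return r2(x) + beta * q2(x)   # q1 - r2 = beta * q2
def r1(x): return alpha * q2(x)          # r1 = alpha * q2

# Roots of kappa^2 + beta*kappa - alpha = 0: constants, independent of x.
disc = math.sqrt(beta**2 + 4 * alpha)
kappas = [(-beta + disc) / 2, (-beta - disc) / 2]

# Each kappa decouples: with lambda = q1 + kappa*q2, the second
# coefficient equation lambda*kappa = r1 + kappa*r2 holds for all x.
for kappa in kappas:
    for x in [0.0, 0.7, 2.5]:
        lam = q1(x) + kappa * q2(x)
        assert math.isclose(lam * kappa, r1(x) + kappa * r2(x),
                            rel_tol=1e-12, abs_tol=1e-12)
```

With these constants the roots are $\kappa_1 = 1$ and $\kappa_2 = -2$, so the decoupled modes are $u_1 = y_1 + y_2$ and $u_2 = y_1 - 2y_2$.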

The conditions

$$r_1(x) = \alpha\,q_2(x),\qquad q_1(x) - r_2(x) = \beta\,q_2(x)$$

have a nice matrix interpretation. Using

$$Q(x) = \begin{pmatrix} q_1(x) & r_1(x)\\ q_2(x) & r_2(x) \end{pmatrix},$$

we can write

$$Q(x) = \begin{pmatrix} r_2(x) + \beta q_2(x) & \alpha q_2(x)\\ q_2(x) & r_2(x) \end{pmatrix} = r_2(x)\,I + q_2(x)\,M,$$

where

$$M = \begin{pmatrix} \beta & \alpha\\ 1 & 0 \end{pmatrix}$$

is a constant $2\times 2$ matrix.

Thus every $Q(x)$ in this family lies in the two-dimensional commutative algebra

$$\mathcal A = \mathrm{span}\{I, M\}.$$

The eigenvectors of $M$ are constant and provide the desired decoupling. Indeed, if $Mv_i = \mu_i v_i$ with $v_1,v_2$ linearly independent and constant, then

$$Q(x)v_i = \bigl(r_2(x) + \mu_i q_2(x)\bigr)\,v_i,$$

so $v_1,v_2$ form an eigenbasis of $Q(x)$ for all $x$.
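A short numerical check of this matrix picture (again with illustrative constants $\alpha,\beta$ and illustrative functions $q_2,r_2$, not data from the text): the constant eigenvectors of $M$ are eigenvectors of $Q(x) = r_2(x)I + q_2(x)M$ at every $x$, with eigenvalues $r_2(x) + \mu_i q_2(x)$:

```python
import numpy as np

alpha, beta = 2.0, 1.0  # illustrative constants
M = np.array([[beta, alpha],
              [1.0, 0.0]])

# Illustrative scalar coefficient functions.
def q2(x): return 1.0 + x**2
def r2(x): return np.sin(x)

# The family Q(x) = r2(x) I + q2(x) M.
def Q(x):
    return r2(x) * np.eye(2) + q2(x) * M

# Constant eigenpairs of M: columns of V, eigenvalues mus.
mus, V = np.linalg.eig(M)

# Each constant eigenvector of M is an eigenvector of Q(x) for every x.
for x in [0.0, 0.9, 2.0]:
    for i in range(2):
        v = V[:, i]
        assert np.allclose(Q(x) @ v, (r2(x) + mus[i] * q2(x)) * v)
```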

In summary:

Two-equation system (generic case).
On an interval where $q_2$ does not vanish identically and $Q(x)$ has distinct eigenvalues, the system can be decoupled by constant linear combinations if and only if there exist constants $\alpha,\beta$ such that

$$\frac{r_1(x)}{q_2(x)} = \alpha,\qquad \frac{q_1(x)-r_2(x)}{q_2(x)} = \beta$$

for all $x$ in the interval.

The two conditions above are precisely two independent functional constraints among the four coefficient functions $q_1,q_2,r_1,r_2$.


Let us return to the general case

$$\mathbf y''(x) + Q(x)\,\mathbf y(x) = 0, \qquad \mathbf y(x)\in\mathbb C^n,\quad Q(x)\in\mathrm{Mat}_{n\times n}(\mathbb C).$$

We ask: when can we find a constant invertible $S$ such that $\mathbf z = S^{-1}\mathbf y$ satisfies the decoupled system

$$z_i''(x) + \lambda_i(x)\,z_i(x) = 0,\qquad i=1,\dots,n?$$

As before, the system in $\mathbf z$ reads

$$\mathbf z''(x) + S^{-1}Q(x)S\,\mathbf z(x) = 0.$$

Thus we require that

$$S^{-1}Q(x)S = \mathrm{diag}\bigl(\lambda_1(x),\dots,\lambda_n(x)\bigr)$$

for all $x$. Equivalently:

There exist $n$ linearly independent constant vectors $v_1,\dots,v_n$ and scalar functions $\lambda_1(x),\dots,\lambda_n(x)$ such that

$$Q(x)\,v_i = \lambda_i(x)\,v_i,\qquad i=1,\dots,n,\ \text{for all }x.$$

The vectors $v_i$ form a common eigenbasis for the entire family $\{Q(x)\}$, and stacking them as columns of $S$ gives the desired transformation.

3.1 Projector / Cartan subalgebra viewpoint


A convenient way to package this condition is in terms of projectors.

Let $v_i$ be a common eigenbasis for all $Q(x)$, and let $w_i^{\mathsf T}$ be the corresponding dual basis, so that $w_i^{\mathsf T}v_j = \delta_{ij}$. Then the rank-one projectors

$$P_i = v_i w_i^{\mathsf T}$$

satisfy

$$P_i^2 = P_i,\qquad P_iP_j = 0\ (i\ne j),\qquad \sum_{i=1}^n P_i = I.$$

The matrix $Q(x)$ can then be written as

$$Q(x) = \sum_{i=1}^n \lambda_i(x)\,P_i.$$

If the system is decouplable by a constant change of basis, there exist such constant projectors $P_i$ and coefficient functions $\lambda_i(x)$.
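In code, the projector construction looks like this (a sketch with an illustrative constant eigenbasis $V$ and illustrative eigenvalue functions): build $P_i = v_i w_i^{\mathsf T}$ from the columns of $V$ and the rows of $V^{-1}$, verify the projector identities, and confirm that any $Q(x) = \sum_i \lambda_i(x)\,P_i$ has the $v_i$ as constant eigenvectors:

```python
import numpy as np

n = 2
# Constant eigenbasis v_i = columns of V; dual basis w_i^T = rows of V^{-1}.
V = np.array([[1.0, 1.0],
              [1.0, -1.0]])
W = np.linalg.inv(V)

# Rank-one projectors P_i = v_i w_i^T.
P = [np.outer(V[:, i], W[i, :]) for i in range(n)]

# Projector identities: idempotent, mutually annihilating, resolve identity.
assert np.allclose(P[0] @ P[0], P[0]) and np.allclose(P[1] @ P[1], P[1])
assert np.allclose(P[0] @ P[1], 0.0)
assert np.allclose(P[0] + P[1], np.eye(n))

# Any Q(x) = sum_i lambda_i(x) P_i has the v_i as a constant eigenbasis.
def Q(x):
    return np.cos(x) * P[0] + (1.0 + x**2) * P[1]

for x in [0.0, 1.1]:
    assert np.allclose(Q(x) @ V[:, 0], np.cos(x) * V[:, 0])
    assert np.allclose(Q(x) @ V[:, 1], (1.0 + x**2) * V[:, 1])
```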

In Lie-algebra language, the set

$$\mathfrak h = \mathrm{span}\{P_1,\dots,P_n\}$$

is a Cartan subalgebra of $\mathfrak{gl}_n$, conjugate to the subalgebra of diagonal matrices. The condition above says precisely that

$Q(x)$ lies in a fixed Cartan subalgebra for all $x$.

Choosing a basis that diagonalizes $\mathfrak h$ is exactly the constant change of basis $S$ that decouples the system.

3.2 Commuting families (a useful sufficient condition)


A standard sufficient condition, under mild diagonalizability assumptions, is:

  1. The matrices $Q(x)$ are diagonalizable (e.g. Hermitian/symmetric), and
  2. they commute pairwise: $[Q(x_1),Q(x_2)] = 0$ for all $x_1,x_2$.

Then there exists a constant basis of joint eigenvectors, and the system can be decoupled by a constant change of variables.

This is often how decoupling arises in physics: the coupling matrices are all functions of $x$ built from a fixed set of commuting operators.
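A minimal numerical illustration of this sufficient condition (the matrices $A$, $B$ and the scalar functions below are illustrative choices): since $B$ is a polynomial in the symmetric matrix $A$, every member of the family $Q(x) = \cos(x)\,A + x\,B$ commutes with every other, and a single constant orthogonal eigenbasis of $A$ diagonalizes the whole family:

```python
import numpy as np

# A commuting symmetric family: Q(x) = cos(x) A + x B with [A, B] = 0.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
B = A @ A  # any polynomial in A commutes with A

def Q(x):
    return np.cos(x) * A + x * B

# Pairwise commutativity of the family.
for x1, x2 in [(0.0, 1.0), (0.5, 2.0)]:
    commutator = Q(x1) @ Q(x2) - Q(x2) @ Q(x1)
    assert np.allclose(commutator, 0.0)

# One constant orthogonal eigenbasis of A diagonalizes every Q(x).
_, S = np.linalg.eigh(A)
for x in [0.0, 0.8, 2.0]:
    D = S.T @ Q(x) @ S
    assert np.allclose(D, np.diag(np.diag(D)))
```

Here the constant matrix $S$ of eigenvectors of $A$ plays exactly the role of the decoupling transformation from the previous sections.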


4. How many independent conditions for $n$ equations?


In the two-equation case we found two independent relations among $q_1,q_2,r_1,r_2$. For general $n$, a simple dimension count tells us how many independent functional constraints are needed.

  • A completely general $Q(x)$ is an $n\times n$ matrix of functions: there are $n^2$ independent scalar coefficient functions $Q_{ij}(x)$.

  • If the system is decouplable by a constant $S$, then

    $$Q(x) = S\,\Lambda(x)\,S^{-1},\qquad \Lambda(x) = \mathrm{diag}(\lambda_1(x),\dots,\lambda_n(x)).$$

    The only $x$-dependent data are the $n$ scalar functions $\lambda_i(x)$; the matrix $S$ itself is constant.

Thus the space of decouplable families $Q(x)$ has $n$ functional degrees of freedom, while a general $Q(x)$ has $n^2$. Therefore, we need

$$n^2 - n$$

independent functional constraints on the coefficients $Q_{ij}(x)$ in order for the system to be decouplable into $n$ scalar second-order equations by constant linear combinations.

For example:

  • $n=2$: $n^2-n = 4-2 = 2$ constraints, matching the two ratio conditions in the two-equation case.
  • $n=3$: $n^2-n = 9-3 = 6$ constraints on the nine functions $(Q_{ij})$, expressing the fact that $Q(x)$ lies in a fixed three-dimensional Cartan subalgebra.

This count is generic: it assumes a full decoupling into $n$ one-dimensional modes (no persistent degeneracy of eigenvalues). If there are exact degeneracies, the effective conditions may be weaker, because one can decouple only into blocks corresponding to the degenerate eigenspaces.


  • A system of $n$ coupled linear second-order ODEs

    $$\mathbf y''(x) + Q(x)\,\mathbf y(x) = 0$$

    can be decoupled by constant linear combinations of the unknowns if and only if there exists a constant invertible matrix $S$ such that

    $$S^{-1}Q(x)S \quad\text{is diagonal for all }x.$$

  • Equivalently, there exists an $x$-independent eigenbasis $v_1,\dots,v_n$ and functions $\lambda_i(x)$ such that $Q(x)v_i = \lambda_i(x)v_i$ for all $x$.

  • In the two-equation case, this condition can be written explicitly as the two ratio conditions

    $$\frac{r_1(x)}{q_2(x)} = \text{const},\qquad \frac{q_1(x)-r_2(x)}{q_2(x)} = \text{const}$$

    (up to symmetric special cases). These ensure the existence of two constant linear combinations $u_1,u_2$ that obey decoupled scalar equations.

  • In the general $n$-equation case, the decoupling condition can be expressed invariantly as

    $$Q(x) = \sum_{i=1}^n \lambda_i(x)\,P_i,$$

    where the projectors $P_i$ are constant and sum to the identity. In Lie-algebra terms, $Q(x)$ must lie in a fixed Cartan subalgebra (conjugate to the diagonal matrices) for all $x$.

  • Generically, there are

    $$n^2 - n$$

    independent functional conditions on the entries $Q_{ij}(x)$ for full decoupling into $n$ scalar second-order equations.

From the point of view of applications, this is exactly the familiar problem of finding normal modes: decoupling is possible precisely when the “shape” of the coupling between components does not change with $x$; only the eigenvalues of the coupling matrix are allowed to vary, while its eigenvectors remain fixed.