Differential equations

29 October, 2021 (Friday)

 Last modified: 03:03 AM - 1 November, 2021


The Frenet-Serret equations express the derivatives of a certain set of functions in terms of those same functions. They are an example of a "system of linear differential equations". This shift in viewpoint, from the functions themselves to the kind of relationships their derivatives satisfy, proves surprisingly useful, thanks to a theorem from the theory of differential equations.

First let us clarify what we mean by "a system of first order linear differential equations". If we fix some functions \(a_{ij}(t)\) (called the coefficient functions), we may ask: do there exist functions \(y_1(t), y_2(t), \ldots, y_n(t)\) whose derivatives satisfy the following relationships?

\[\begin{align*} y_1'(t) &= a_{11}(t)y_1(t) + a_{12}(t) y_2(t) + \cdots + a_{1n}(t)y_n(t)\\ y_2'(t) &= a_{21}(t)y_1(t) + a_{22}(t) y_2(t) + \cdots + a_{2n}(t)y_n(t)\\ &\ \,\vdots \\ y_n'(t) &= a_{n1}(t)y_1(t) + a_{n2}(t) y_2(t) + \cdots + a_{nn}(t)y_n(t) \end{align*}\]

The above equations, involving the as yet unknown functions \(y_i(t)\) and their derivatives, are called a system of first order linear differential equations. Since we will only deal with this kind of system in this article, we will simply call it a system of differential equations. If we can find functions \(y_i(t)\) that do satisfy those equations, we will call them solutions of that system of differential equations.

Here is an example of a differential equation involving just one function, whose coefficient function is constant: \[y'(t) = 5y(t)\] It is easy to check that \(y(t) = e^{5t}\) is a solution.
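We can also check this numerically. Here is a minimal sketch, using forward Euler integration (the scheme, step count, and interval are choices made only for this illustration): starting from \(y(0) = 1\), the numerical solution of \(y' = 5y\) should land close to \(e^{5}\) at \(t = 1\).

```python
import math

# Numerically integrate y'(t) = 5*y(t) by forward Euler, starting from
# y(0) = 1, and compare the result at t = 1 against the claimed
# solution y(t) = e^{5t}.
def euler(f, y0, t0, t1, steps):
    """Forward-Euler integration of y' = f(t, y) from t0 to t1."""
    h = (t1 - t0) / steps
    t, y = t0, y0
    for _ in range(steps):
        y += h * f(t, y)
        t += h
    return y

approx = euler(lambda t, y: 5 * y, 1.0, 0.0, 1.0, 200_000)
exact = math.exp(5)
print(abs(approx - exact) / exact)  # small relative error
```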

Here is another example,

\[y_1'(t) = \cos(t) y_1(t) + \sin(t)y_2(t)\] \[y_2'(t) = \tan(t) y_1(t) + \sec(t)y_2(t)\] and we may ask whether there exist two functions \(y_1(t)\) and \(y_2(t)\) that satisfy both equations simultaneously.

The theorem we care about asserts that systems of this type always have a solution (provided the coefficient functions are reasonably well behaved, e.g. continuous).

There can be many solutions to the same system of differential equations. But if we fix the value of each \(y_i\) at some \(t_0\), i.e. \(y_i(t_0) = c_i\) for some constants \(c_i\), then there is a unique solution. The differential equations that the solutions \(y_i\) must satisfy, together with their specified values at a so-called initial point \(t_0\), constitute an initial value problem. So the theorem states that an initial value problem of the above kind always has a solution, and that solution is unique.
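To make this concrete, here is a sketch of numerically integrating the two-function example above once its initial values are fixed (forward Euler on \([0, 1]\), where \(\tan\) and \(\sec\) are defined; the scheme and step count are assumptions of this illustration). The theorem says each choice of initial values determines exactly one exact solution, and indeed different initial values produce different trajectories.

```python
import math

# Forward-Euler integration of the system
#   y1' = cos(t) y1 + sin(t) y2
#   y2' = tan(t) y1 + sec(t) y2
# on [0, 1].  Fixing (y1(0), y2(0)) pins down one trajectory.
def integrate(y1, y2, t0=0.0, t1=1.0, steps=100_000):
    """Approximate (y1(t1), y2(t1)) from the initial values (y1(t0), y2(t0))."""
    h = (t1 - t0) / steps
    t = t0
    for _ in range(steps):
        d1 = math.cos(t) * y1 + math.sin(t) * y2
        d2 = math.tan(t) * y1 + (1 / math.cos(t)) * y2  # sec = 1/cos
        y1, y2 = y1 + h * d1, y2 + h * d2
        t += h
    return y1, y2

# Different initial values give different solutions.
print(integrate(1.0, 0.0))
print(integrate(0.0, 1.0))
```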

Note: The above theorem also holds if \(y_i\) are vector valued functions.

Applications

Obtaining a curve with specified curvature and torsion functions

From this point of view, \(\mathbf{T}(t)\), \(\mathbf{N}(t)\) and \(\mathbf{B}(t)\) are solutions of this system of differential equations,

\[\begin{align*} y_1'(t) &= \kappa(t) y_2(t)\\ y_2'(t) &= -\kappa(t)y_1(t) + \tau(t)y_3(t)\\ y_3'(t) &= -\tau(t)y_2(t) \end{align*}\]

This will help us answer the following questions: given any two functions \(\kappa(t)\) and \(\tau(t)\), is there a curve with a unit speed parametrization \(\gamma(t)\) whose curvature function is \(\kappa(t)\) and whose torsion function is \(\tau(t)\)? And under what circumstances is it unique?

The general idea, which was worked out during the lecture, was this: if there is a \(\gamma(t)\) with curvature \(\kappa(t)\) and torsion \(\tau(t)\), then its \(\mathbf{T}(t)\), \(\mathbf{N}(t)\), and \(\mathbf{B}(t)\) would have to be solutions of those differential equations. So to find a curve with the given curvature and torsion functions, we work backward: first find a solution of the above system, in the hope that the three solutions \(y_1\), \(y_2\), and \(y_3\) are the \(\mathbf{T}(t)\), \(\mathbf{N}(t)\), and \(\mathbf{B}(t)\) of some \(\gamma\). If that hope is borne out, we have little choice in what \(\gamma\) should be: it has to be an antiderivative of \(\mathbf{T}(t)\). We then check that the plan has worked by confirming that the \(\mathbf{N}\) and \(\mathbf{B}\) of this \(\gamma\) are indeed \(y_2\) and \(y_3\). All this was done during the lecture, but the orthonormality of the solutions was left as an exercise. We will now use the uniqueness of solutions of a system of differential equations to tackle that.
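As a sanity check of this backward construction, here is a minimal numerical sketch. We take the constant choices \(\kappa \equiv 1\) and \(\tau \equiv 0\) (made only for this illustration, since then the exact answer is known: a unit circle), integrate the Frenet-Serret system by forward Euler from an orthonormal initial frame, and simultaneously integrate \(\gamma' = \mathbf{T}\) to recover the curve.

```python
import math

# Backward construction, sketched with forward Euler: kappa(t) = 1 and
# tau(t) = 0, so the recovered curve should be a circle of radius
# 1/kappa = 1 (here, centred at (0, 1, 0), starting at the origin).
kappa = lambda t: 1.0
tau = lambda t: 0.0

def add(u, v, s=1.0):
    """Componentwise u + s*v for 3-vectors stored as lists."""
    return [a + s * b for a, b in zip(u, v)]

T, N, B = [1.0, 0, 0], [0, 1.0, 0], [0, 0, 1.0]  # orthonormal frame at t = 0
gamma = [0.0, 0.0, 0.0]
t, h, steps = 0.0, 1e-4, 62_832                  # integrate to about 2*pi

for _ in range(steps):
    dT = [kappa(t) * n for n in N]               # T' =  kappa N
    dN = add([-kappa(t) * x for x in T], B, tau(t))  # N' = -kappa T + tau B
    dB = [-tau(t) * n for n in N]                # B' = -tau N
    gamma = add(gamma, T, h)                     # gamma' = T
    T, N, B = add(T, dT, h), add(N, dN, h), add(B, dB, h)
    t += h

radius = math.dist(gamma, [0.0, 1.0, 0.0])
print(radius)  # close to 1
```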

Checking orthonormality

We will now prove that if the solutions \(y_1(t)\), \(y_2(t)\), and \(y_3(t)\) of the following system of differential equations are orthonormal at some \(t_0\), then they are orthonormal for all \(t\).

\[\begin{align*} y_1'(t) &= \kappa(t) y_2(t)\\ y_2'(t) &= -\kappa(t)y_1(t) + \tau(t)y_3(t)\\ y_3'(t) &= -\tau(t)y_2(t) \end{align*}\]

The theory of differential equations makes this really easy. We are trying to show a set of equalities. The trick is to show that the left hand sides and the right hand sides are both solutions of some (not the above one!) initial value problem, i.e. a system of linear differential equations with specified initial values. What system is that? First, let us see what we are trying to show:

\[y_1 \cdot y_1 = 1 \qquad y_1 \cdot y_2 = 0 \qquad y_1 \cdot y_3 = 0\] \[y_2 \cdot y_1 = 0 \qquad y_2 \cdot y_2 = 1 \qquad y_2 \cdot y_3 = 0\] \[y_3 \cdot y_1 = 0 \qquad y_3 \cdot y_2 = 0 \qquad y_3 \cdot y_3 = 1\]

Let \(f_{ij} = y_i \cdot y_j\). The crucial observation is this: the \(f_{ij}\)'s satisfy a system of linear differential equations! This is because of the product rule and the differential equations that the \(y_i\)'s satisfy. For instance, the product rule tells us that

\[f_{12}'(t) = (y_1(t) \cdot y_2(t))' = y_1'(t) \cdot y_2(t) + y_1(t) \cdot y_2'(t) \]

and then the differential equations themselves let us rewrite those derivatives in terms of the \(y_i\)'s:

\[\begin{align*} f_{12}'(t) &= (\kappa(t)y_2(t)) \cdot y_2(t) + y_1(t) \cdot (-\kappa(t)y_1(t) + \tau(t)y_3(t))\\ &= \kappa(t)(y_2(t) \cdot y_2(t)) -\kappa(t)(y_1(t) \cdot y_1(t)) + \tau(t)(y_1(t) \cdot y_3(t))\\ &= \kappa(t)f_{22} -\kappa(t)f_{11} + \tau(t)f_{13} \end{align*}\] Observe that we have written the derivative of \(f_{12}\) in terms of the \(f_{ij}\)'s, and we can do the same for the derivatives of the other \(f_{ij}\)'s too. So the \(f_{ij}\)'s satisfy a system of linear differential equations. Also, since the \(y_i\)'s are given to be orthonormal at \(t_0\), we know that \(f_{ij}(t_0) = 0\) if \(i \neq j\) and \(1\) if \(i=j\). We have to show this holds for all \(t\). Let us define the constant functions \(\delta_{ij} = 0\) for \(i \neq j\) and \(\delta_{ii} = 1\). Notice that,

\[\begin{align*} \kappa(t)\delta_{22} -\kappa(t)\delta_{11} + \tau(t)\delta_{13} &= \kappa(t)\cdot 1 -\kappa(t)\cdot 1 + \tau(t)\cdot 0\\ &= 0 \end{align*}\] and since \(\delta_{12}\) is a constant function, \(\delta_{12}'(t)=0\), so \(\delta_{12}'(t) = 0 = \kappa(t)\delta_{22} -\kappa(t)\delta_{11} + \tau(t)\delta_{13}\). In other words, \(\delta_{12}\) satisfies the same differential equation as \(f_{12}\).

The same is true of the other \(\delta_{ij}\)'s, and so they satisfy the same system of differential equations as the \(f_{ij}\)'s. Moreover, the \(\delta_{ij}\)'s were chosen to satisfy the same initial condition, i.e. the same values at \(t_0\), as the \(f_{ij}\)'s. By the theory of differential equations, since the \(\delta_{ij}\)'s and the \(f_{ij}\)'s are solutions of the same initial value problem, they are equal, and so:

Whenever \(i\neq j\), \[y_i(t) \cdot y_j(t) = f_{ij}(t) = \delta_{ij}(t) = 0\] and \[y_i(t) \cdot y_i(t) = f_{ii}(t) = \delta_{ii}(t) = 1\]
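The claim can also be watched numerically. Here is a sketch (forward Euler, with the arbitrary constant choices \(\kappa = 1\) and \(\tau = 1/2\), both assumptions of this illustration): starting from an orthonormal frame, the nine products \(f_{ij} = y_i \cdot y_j\) stay approximately at \(\delta_{ij}\) along the flow, up to the small drift the crude integrator introduces.

```python
# Start the Frenet-Serret system from an orthonormal frame and check
# that the Gram matrix f_ij = y_i . y_j stays close to delta_ij.
kappa, tau = 1.0, 0.5  # arbitrary constant curvature and torsion

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

y = [[1.0, 0, 0], [0, 1.0, 0], [0, 0, 1.0]]  # y1, y2, y3: orthonormal at t0
h, steps = 1e-4, 50_000                       # integrate from t0 = 0 to t = 5

for _ in range(steps):
    d1 = [kappa * b for b in y[1]]                              # y1' =  k y2
    d2 = [-kappa * a + tau * c for a, c in zip(y[0], y[2])]     # y2' = -k y1 + t y3
    d3 = [-tau * b for b in y[1]]                               # y3' = -t y2
    y = [[a + h * da for a, da in zip(yi, di)]
         for yi, di in zip(y, (d1, d2, d3))]

gram = [[dot(yi, yj) for yj in y] for yi in y]
drift = max(abs(gram[i][j] - (1.0 if i == j else 0.0))
            for i in range(3) for j in range(3))
print(drift)  # stays small
```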
