Lyapunov Minimization Condition

The condition that the Lyapunov function $V(x)$ must be uniquely minimized at the fixed point $x_0$ is typically expressed as a requirement that $V(x)$ be positive away from the fixed point and zero at the fixed point. Mathematically, it is sufficient to require $V(x) > 0 \, \forall x \ne x_0$ and $V(x_0) = 0$. We call such functions positive definite (around the fixed point $x_0$). The weaker condition that $V(x) \ge 0 \, \forall x \ne x_0$ and $V(x_0) = 0$ is called positive semi-definiteness.
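For instance, a quadratic form centered at the fixed point is positive definite. A quick numerical check in plain Julia (names chosen purely for illustration):

```julia
x0 = [0.0, 0.0]               # fixed point
V(x) = sum(abs2, x .- x0)     # V(x) = ‖x - x0‖², positive definite about x0

@assert V(x0) == 0.0          # zero at the fixed point
@assert V([0.5, -0.3]) > 0.0  # strictly positive away from it
```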

NeuralLyapunov.jl provides the LyapunovMinimizationCondition struct for users to specify the form of the minimization condition to enforce through training.

NeuralLyapunov.LyapunovMinimizationCondition (Type)
LyapunovMinimizationCondition(check_nonnegativity, strength, rectifier, check_fixed_point)

Specifies the form of the Lyapunov minimization condition to be used.

Fields

  • check_nonnegativity::Bool: whether or not to train for positivity/nonnegativity of $V(x)$
  • strength::Function: specifies the level of strictness for positivity training; should be zero when the two inputs are equal and nonnegative otherwise; used when check_nonnegativity == true
  • rectifier::Function: positive when the input is positive and (approximately) zero when the input is negative
  • check_fixed_point::Bool: whether or not to train for $V(x_0) = 0$

Training conditions

If check_nonnegativity is true, training will attempt to enforce:

$V(x) ≥ \texttt{strength}(x, x_0).$

The inequality will be approximated by the equation:

$\texttt{rectifier}(\texttt{strength}(x, x_0) - V(x)) = 0.$

Note that the approximate equation and inequality are identical when $\texttt{rectifier}(t) = \max(0, t)$.

If check_fixed_point is true, then training will also attempt to enforce $V(x_0) = 0$.

Examples

When training for a strictly positive definite $V$, an example of an appropriate strength is $\texttt{strength}(x, x_0) = \lVert x - x_0 \rVert^2$. This form is used in StrictlyPositiveDefinite.

If $V$ were structured such that it is always nonnegative, then $V(x_0) = 0$ is all that must be enforced in training for the Lyapunov function to be uniquely minimized at $x_0$. In that case, we would use check_nonnegativity = false; check_fixed_point = true. This can also be accomplished with DontCheckNonnegativity(true).

In either case, the rectified linear unit rectifier(t) = max(zero(t), t) exactly represents the inequality, but differentiable approximations of this function may be employed.
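As a sketch (assuming NeuralLyapunov is loaded), the strictly positive definite case above could be specified by hand through the four-argument constructor, here with a softplus rectifier as a smooth surrogate for the rectified linear unit:

```julia
using NeuralLyapunov

minimization_condition = LyapunovMinimizationCondition(
    true,                           # check_nonnegativity: train V(x) ≥ strength(x, x0)
    (x, x0) -> sum(abs2, x .- x0),  # strength(x, x0) = ‖x - x0‖²
    t -> log(one(t) + exp(t)),      # softplus: smooth surrogate for max(0, t)
    true                            # check_fixed_point: also train V(x0) = 0
)
```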


Pre-defined minimization conditions

NeuralLyapunov.StrictlyPositiveDefinite (Function)
StrictlyPositiveDefinite(; C, check_fixed_point, rectifier)

Construct a LyapunovMinimizationCondition representing $V(x) ≥ C \lVert x - x_0 \rVert^2$. If check_fixed_point == true (as is the default), then training will also attempt to enforce $V(x_0) = 0$.

The inequality is approximated by $\texttt{rectifier}(C \lVert x - x_0 \rVert^2 - V(x)) = 0$, and the default rectifier is the rectified linear unit (t) -> max(zero(t), t), which exactly represents $V(x) ≥ C \lVert x - x_0 \rVert^2$. $C$ defaults to 1e-6.
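For example, to require a larger margin than the default (the value here is hypothetical):

```julia
using NeuralLyapunov

# Enforce V(x) ≥ 0.01 ‖x - x0‖², with V(x0) = 0 trained as well (the default).
minimization_condition = StrictlyPositiveDefinite(C = 1e-2)
```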

NeuralLyapunov.PositiveSemiDefinite (Function)
PositiveSemiDefinite(; check_fixed_point, rectifier)

Construct a LyapunovMinimizationCondition representing $V(x) ≥ 0$. If check_fixed_point == true (as is the default), then training will also attempt to enforce $V(x_0) = 0$.

The inequality is approximated by $\texttt{rectifier}( -V(x) ) = 0$ and the default rectifier is the rectified linear unit (t) -> max(zero(t), t), which exactly represents $V(x) ≥ 0$.
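For example, substituting a softplus rectifier for the default (a sketch, assuming NeuralLyapunov is loaded):

```julia
using NeuralLyapunov

# Enforce V(x) ≥ 0 via a smooth softplus surrogate for max(0, t).
minimization_condition = PositiveSemiDefinite(
    rectifier = t -> log(one(t) + exp(t))
)
```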

NeuralLyapunov.DontCheckNonnegativity (Function)
DontCheckNonnegativity(; check_fixed_point)

Construct a LyapunovMinimizationCondition which represents not checking for nonnegativity of the Lyapunov function. This is appropriate in cases where this condition has been structurally enforced.

Even in this case, it is still possible to check for $V(x_0) = 0$, for example if V is structured to be positive for $x ≠ x_0$ but does not guarantee $V(x_0) = 0$ (such as NonnegativeStructure). check_fixed_point defaults to true, since in cases where $V(x_0) = 0$ is enforced structurally, the equation will reduce to 0.0 ~ 0.0, which gets automatically removed in most cases.
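For example, when pairing a structurally nonnegative $V$ with training for the fixed-point condition:

```julia
using NeuralLyapunov

# V is nonnegative by construction, so only V(x0) = 0 is trained for.
minimization_condition = DontCheckNonnegativity(check_fixed_point = true)
```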


Defining your own minimization condition

If a user wishes to express the minimization condition in a form other than $V(x) ≥ \texttt{strength}(x, x_0)$, they must define their own subtype of AbstractLyapunovMinimizationCondition.

NeuralLyapunov.AbstractLyapunovMinimizationCondition (Type)
AbstractLyapunovMinimizationCondition

Represents the minimization condition in a neural Lyapunov problem.

All concrete AbstractLyapunovMinimizationCondition subtypes should define the check_nonnegativity, check_minimal_fixed_point, and get_minimization_condition functions.


When constructing the PDESystem, NeuralLyapunovPDESystem uses check_nonnegativity to determine if it should include an equation equating the result of get_minimization_condition to zero. It additionally uses check_minimal_fixed_point to determine if it should include the equation $V(x_0) = 0$.

NeuralLyapunov.check_nonnegativity (Function)
check_nonnegativity(cond::AbstractLyapunovMinimizationCondition)

Return true if cond specifies training to meet the Lyapunov minimization condition, and false if cond specifies no training to meet this condition.

NeuralLyapunov.check_minimal_fixed_point (Function)
check_minimal_fixed_point(cond::AbstractLyapunovMinimizationCondition)

Return true if cond specifies training for the Lyapunov function to equal zero at the fixed point, and false if cond specifies no training to meet this condition.

NeuralLyapunov.get_minimization_condition (Function)
get_minimization_condition(cond::AbstractLyapunovMinimizationCondition)

Return a function of $V$, $x$, and $x_0$ that equals zero when the Lyapunov minimization condition is met for the Lyapunov candidate function $V$ at the point $x$, and is greater than zero if it's violated.

Note that the first input, $V$, is a function, so the minimization condition can depend on the value of the candidate Lyapunov function at multiple points.

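As a sketch of the required interface, consider a hypothetical condition (not part of the package) that enforces $V(x) ≥ V(x_0)$, i.e., that $x_0$ is a global minimizer of $V$, without requiring $V(x_0) = 0$. It also illustrates the note above: since $V$ arrives as a function, the condition can evaluate it at both $x$ and $x_0$.

```julia
using NeuralLyapunov
import NeuralLyapunov: check_nonnegativity, check_minimal_fixed_point,
                       get_minimization_condition

# Hypothetical: train for V(x) ≥ V(x0) rather than V(x) ≥ strength(x, x0).
struct MinimumAtFixedPoint <: NeuralLyapunov.AbstractLyapunovMinimizationCondition
    rectifier::Function
end

check_nonnegativity(::MinimumAtFixedPoint) = true
check_minimal_fixed_point(::MinimumAtFixedPoint) = false

function get_minimization_condition(cond::MinimumAtFixedPoint)
    # Zero when V(x) ≥ V(x0), positive when the condition is violated.
    # V is passed as a function, so it can be evaluated at multiple points.
    return (V, x, x0) -> cond.rectifier(V(x0) - V(x))
end
```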