numopt-js

    Function adjointGradientDescent

• Performs adjoint (reduced-space) gradient descent to minimize a cost function f(p, x) subject to the equality constraint c(p, x) = 0.

      Algorithm:

      1. Start with initial parameters p0 and states x0 (satisfying c(p0, x0) = 0)
      2. Compute partial derivatives ∂f/∂p, ∂f/∂x, ∂c/∂p, ∂c/∂x
      3. Solve adjoint equation: (∂c/∂x)^T λ = (∂f/∂x)^T
      4. Compute gradient: df/dp = ∂f/∂p - λ^T ∂c/∂p
      5. Update parameters: p_new = p_old - stepSize * df/dp
      6. Update states: x_new = x_old - (∂c/∂x)^-1 ∂c/∂p · Δp (linear approximation)
      7. Repeat until convergence or max iterations
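  The steps above can be sketched for the scalar case, where the adjoint solve and the linearized state update reduce to divisions. This is a hypothetical illustration, not the library's actual implementation; the function name, signature, and test problem (minimize f = p² + x² subject to c = x − p − 1 = 0, optimum at p = −1/2, x = 1/2) are all invented for the example.

  ```javascript
  // Minimal scalar sketch of the adjoint gradient descent loop
  // (hypothetical helper, not numopt-js's real API).
  function adjointGradientDescentScalar({ f_p, f_x, c_p, c_x, p0, x0, stepSize, maxIter, tol }) {
    let p = p0, x = x0; // start from a point satisfying c(p0, x0) = 0
    for (let i = 0; i < maxIter; i++) {
      // Step 3 — adjoint equation, scalar case: (dc/dx) * lambda = df/dx
      const lambda = f_x(p, x) / c_x(p, x);
      // Step 4 — total gradient: df/dp = df/dp|partial - lambda * dc/dp
      const grad = f_p(p, x) - lambda * c_p(p, x);
      if (Math.abs(grad) < tol) break;
      // Step 5 — parameter update
      const dp = -stepSize * grad;
      p += dp;
      // Step 6 — state update from the linearized constraint:
      // dx = -(dc/dx)^-1 * (dc/dp) * dp
      x += -(c_p(p, x) / c_x(p, x)) * dp;
    }
    return { p, x };
  }

  // Illustrative problem: minimize p^2 + x^2 subject to x - p - 1 = 0.
  const result = adjointGradientDescentScalar({
    f_p: (p, x) => 2 * p,
    f_x: (p, x) => 2 * x,
    c_p: () => -1,
    c_x: () => 1,
    p0: 0, x0: 1, // c(0, 1) = 0
    stepSize: 0.1, maxIter: 1000, tol: 1e-10,
  });
  // result.p ≈ -0.5, result.x ≈ 0.5
  ```

  Because the example constraint is linear, the step-6 update keeps x = p + 1 exact; for a nonlinear constraint the linearized update is only approximate, which is why the result reports a final constraint norm.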

      Supports both cost functions f(p,x) and residual functions r(p,x) where f = 1/2 r^T r.
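  In the residual form, the cost derivatives follow from the chain rule: f = 1/2 r² gives ∂f/∂p = r ∂r/∂p and ∂f/∂x = r ∂r/∂x (scalar case; for vector residuals these become Jacobian-transpose products rᵀJ). A hypothetical helper sketching that conversion, with names invented for the example:

  ```javascript
  // Hypothetical adapter: build cost-function partials from a scalar
  // residual r(p, x) and its partial derivatives, using f = 1/2 r^2.
  function residualToCost(r, r_p, r_x) {
    return {
      f_p: (p, x) => r(p, x) * r_p(p, x), // df/dp = r * dr/dp
      f_x: (p, x) => r(p, x) * r_x(p, x), // df/dx = r * dr/dx
    };
  }

  // Example: r(p, x) = p + x, so f = 1/2 (p + x)^2.
  const cost = residualToCost(
    (p, x) => p + x,
    () => 1,
    () => 1,
  );
  // At (p, x) = (2, 3): r = 5, so df/dp = 5 and df/dx = 5.
  ```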

      Parameters

      Returns AdjointGradientDescentResult

      Optimization result with final parameters, states, and constraint norm