# API

As stated on the Home page, we consider the nonlinear optimization problem in the following form:

$$\begin{aligned} \min \quad & f(x) \\ \text{s.t.} \quad & c_L \leq c(x) \leq c_U \\ & \ell \leq x \leq u. \end{aligned}$$

To develop an optimization algorithm, we are usually concerned not only with $f(x)$ and $c(x)$, but also with their derivatives. Namely,

• $\nabla f(x)$, the gradient of $f$ at the point $x$;
• $\nabla^2 f(x)$, the Hessian of $f$ at the point $x$;
• $J(x) = \nabla c(x)^T$, the Jacobian of $c$ at the point $x$;
• $\nabla^2 f(x) + \sum_{i=1}^m \lambda_i \nabla^2 c_i(x)$, the Hessian of the Lagrangian function at the point $(x,\lambda)$.

There are many ways to access some of these values, so here is a little reference guide.
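As a quick illustration of these quantities, the sketch below builds a small model with ADNLPModels.jl (an assumption here for convenience; any `AbstractNLPModel` exposes the same API) and evaluates $f$, $\nabla f$, $c$ and $J$:

```julia
using NLPModels, ADNLPModels

# min (x₁ - 1)² + 4(x₂ - 2)²  s.t.  0 ≤ x₁ + x₂ ≤ 1
nlp = ADNLPModel(
  x -> (x[1] - 1)^2 + 4 * (x[2] - 2)^2,  # f
  [0.5; 0.5],                            # x0
  x -> [x[1] + x[2]],                    # c
  [0.0],                                 # c_L
  [1.0],                                 # c_U
)

x  = nlp.meta.x0
fx = obj(nlp, x)   # f(x)
gx = grad(nlp, x)  # ∇f(x)
cx = cons(nlp, x)  # c(x)
Jx = jac(nlp, x)   # J(x), as a sparse matrix
```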

## Reference guide

The following naming should be easy enough to follow. If not, click on the link and go to the description.

• ! means inplace;
• _coord means coordinate format;
• prod means matrix-vector product;
• _op means operator (as in LinearOperators.jl);
• _lin and _nln respectively refer to linear and nonlinear constraints.

Feel free to open an issue to suggest other methods that should apply to all NLPModels instances.

| Function | NLPModels function |
| --- | --- |
| $f(x)$ | obj, objgrad, objgrad!, objcons, objcons! |
| $\nabla f(x)$ | grad, grad!, objgrad, objgrad! |
| $\nabla^2 f(x)$ | hess, hess_op, hess_op!, hess_coord, hess_coord!, hess_structure, hess_structure!, hprod, hprod! |
| $c(x)$ | cons_lin, cons_lin!, cons_nln, cons_nln!, cons, cons!, objcons, objcons! |
| $J(x)$ | jac_lin, jac_nln, jac, jac_lin_op, jac_lin_op!, jac_nln_op, jac_nln_op!, jac_op, jac_op!, jac_lin_coord, jac_lin_coord!, jac_nln_coord, jac_nln_coord!, jac_coord, jac_coord!, jac_lin_structure, jac_lin_structure!, jac_nln_structure, jac_nln_structure!, jac_structure, jac_structure!, jprod_lin, jprod_lin!, jprod_nln, jprod_nln!, jprod, jprod!, jtprod_lin, jtprod_lin!, jtprod_nln, jtprod_nln!, jtprod, jtprod! |
| $\nabla^2 L(x,y)$ | hess, hess_op, hess_coord, hess_coord!, hess_structure, hess_structure!, hprod, hprod!, jth_hprod, jth_hprod!, jth_hess, jth_hess_coord, jth_hess_coord!, ghjvprod, ghjvprod! |
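The `!` variants reuse storage the caller preallocates, which is what an optimization loop typically wants. A minimal sketch (ADNLPModels.jl is assumed for model creation, and the fixed-step descent is purely illustrative):

```julia
using NLPModels, ADNLPModels

nlp = ADNLPModel(x -> sum(x .^ 2), ones(3))

x = copy(nlp.meta.x0)
g = similar(x)         # preallocated gradient storage
for iter in 1:10
  grad!(nlp, x, g)     # overwrites g with ∇f(x); no new vector allocated
  x .-= 0.1 .* g       # in-place update, illustrative step size
end
```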

## API for NLSModels

For nonlinear least-squares models, $f(x) = \tfrac{1}{2} \Vert F(x)\Vert^2$, and these models have additional functions to access the residual and its derivatives. Namely,

• $J_F(x) = \nabla F(x)^T$, the Jacobian of the residual $F$ at the point $x$;
• $\nabla^2 F_i(x)$, the Hessian of the $i$-th component of the residual at the point $x$.
| Function | function |
| --- | --- |
| $F(x)$ | residual, residual! |
| $J_F(x)$ | jac_residual, jac_coord_residual, jac_coord_residual!, jac_structure_residual, jac_structure_residual!, jprod_residual, jprod_residual!, jtprod_residual, jtprod_residual!, jac_op_residual, jac_op_residual! |
| $\nabla^2 F_i(x)$ | hess_residual, hess_coord_residual, hess_coord_residual!, hess_structure_residual, hess_structure_residual!, jth_hess_residual, hprod_residual, hprod_residual!, hess_op_residual, hess_op_residual! |
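For instance, the classic Rosenbrock problem can be stated through its residual. A sketch, assuming ADNLSModel from ADNLPModels.jl for model creation:

```julia
using NLPModels, ADNLPModels

# F(x) = [x₁ - 1; 10(x₂ - x₁²)], the Rosenbrock residual; 2 equations
nls = ADNLSModel(x -> [x[1] - 1; 10 * (x[2] - x[1]^2)], [-1.2; 1.0], 2)

x  = nls.meta.x0
Fx = residual(nls, x)      # F(x)
Jx = jac_residual(nls, x)  # J_F(x)
fx = obj(nls, x)           # f(x) = ½‖F(x)‖², computed from the residual
```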

## AbstractNLPModel functions

`NLPModels.obj` (Function)
f = obj(nlp, x)

Evaluate $f(x)$, the objective function of nlp at x.

source
`NLPModels.grad!` (Function)
g = grad!(nlp, x, g)

Evaluate $∇f(x)$, the gradient of the objective function at x in place.

source
`NLPModels.objgrad!` (Function)
f, g = objgrad!(nlp, x, g)

Evaluate $f(x)$ and $∇f(x)$ at x. g is overwritten with the value of $∇f(x)$.

source
f, g = objgrad!(nls, x, g)
f, g = objgrad!(nls, x, g, Fx)

Evaluate $f(x)$ and $∇f(x)$ of nls::AbstractNLSModel at x. Fx is overwritten with the value of the residual $F(x)$.

source
`NLPModels.objcons!` (Function)
f, c = objcons!(nlp, x, c)

Evaluate $f(x)$ and $c(x)$ at x. c is overwritten with the value of $c(x)$.

source
`NLPModels.jac_coord!` (Function)
vals = jac_coord!(nlp, x, vals)

Evaluate $J(x)$, the constraints Jacobian at x in sparse coordinate format, rewriting vals.

source
`NLPModels.jac_lin_coord!` (Function)
vals = jac_lin_coord!(nlp, x, vals)

Evaluate $J(x)$, the linear constraints Jacobian at x in sparse coordinate format, overwriting vals.

source
`NLPModels.jac_nln_coord!` (Function)
vals = jac_nln_coord!(nlp, x, vals)

Evaluate $J(x)$, the nonlinear constraints Jacobian at x in sparse coordinate format, overwriting vals.

source
`NLPModels.jac` (Function)
Jx = jac(nlp, x)

Evaluate $J(x)$, the constraints Jacobian at x as a sparse matrix.

source
`NLPModels.jac_lin` (Function)
Jx = jac_lin(nlp, x)

Evaluate $J(x)$, the linear constraints Jacobian at x as a sparse matrix.

source
`NLPModels.jac_nln` (Function)
Jx = jac_nln(nlp, x)

Evaluate $J(x)$, the nonlinear constraints Jacobian at x as a sparse matrix.

source
`NLPModels.jac_op` (Function)
J = jac_op(nlp, x)

Return the Jacobian at x as a linear operator. The resulting object may be used as if it were a matrix, e.g., J * v or J' * v.

source
`NLPModels.jac_op!` (Function)
J = jac_op!(nlp, x, Jv, Jtv)

Return the Jacobian at x as a linear operator. The resulting object may be used as if it were a matrix, e.g., J * v or J' * v. The values Jv and Jtv are used as preallocated storage for the operations.

source
J = jac_op!(nlp, rows, cols, vals, Jv, Jtv)

Return the Jacobian given by (rows, cols, vals) as a linear operator. The resulting object may be used as if it were a matrix, e.g., J * v or J' * v. The values Jv and Jtv are used as preallocated storage for the operations.

source
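Operators shine when the Jacobian is too large to store explicitly: only the products J * v and J' * w are ever computed. A sketch of the preallocated form, again assuming an ADNLPModels.jl model:

```julia
using NLPModels, ADNLPModels

nlp = ADNLPModel(x -> sum(x), ones(2), x -> [x[1]^2 + x[2]^2], [0.0], [1.0])

x   = nlp.meta.x0
Jv  = zeros(nlp.meta.ncon)  # storage reused by every J * v
Jtv = zeros(nlp.meta.nvar)  # storage reused by every J' * w
J   = jac_op!(nlp, x, Jv, Jtv)

v = ones(nlp.meta.nvar)
w = J * v                   # calls jprod! under the hood, reusing Jv
```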
`NLPModels.jac_lin_op` (Function)
J = jac_lin_op(nlp, x)

Return the linear Jacobian at x as a linear operator. The resulting object may be used as if it were a matrix, e.g., J * v or J' * v.

source
`NLPModels.jac_lin_op!` (Function)
J = jac_lin_op!(nlp, x, Jv, Jtv)

Return the linear Jacobian at x as a linear operator. The resulting object may be used as if it were a matrix, e.g., J * v or J' * v. The values Jv and Jtv are used as preallocated storage for the operations.

source
J = jac_lin_op!(nlp, rows, cols, vals, Jv, Jtv)

Return the linear Jacobian given by (rows, cols, vals) as a linear operator. The resulting object may be used as if it were a matrix, e.g., J * v or J' * v. The values Jv and Jtv are used as preallocated storage for the operations.

source
`NLPModels.jac_nln_op` (Function)
J = jac_nln_op(nlp, x)

Return the nonlinear Jacobian at x as a linear operator. The resulting object may be used as if it were a matrix, e.g., J * v or J' * v.

source
`NLPModels.jac_nln_op!` (Function)
J = jac_nln_op!(nlp, x, Jv, Jtv)

Return the nonlinear Jacobian at x as a linear operator. The resulting object may be used as if it were a matrix, e.g., J * v or J' * v. The values Jv and Jtv are used as preallocated storage for the operations.

source
J = jac_nln_op!(nlp, rows, cols, vals, Jv, Jtv)

Return the nonlinear Jacobian given by (rows, cols, vals) as a linear operator. The resulting object may be used as if it were a matrix, e.g., J * v or J' * v. The values Jv and Jtv are used as preallocated storage for the operations.

source
`NLPModels.jprod!` (Function)
Jv = jprod!(nlp, x, v, Jv)

Evaluate $J(x)v$, the Jacobian-vector product at x in place.

source
Jv = jprod!(nlp, rows, cols, vals, v, Jv)

Evaluate $J(x)v$, the Jacobian-vector product, where the Jacobian is given by (rows, cols, vals) in triplet format.

source
`NLPModels.jtprod` (Function)
Jtv = jtprod(nlp, x, v, Jtv)

Evaluate $J(x)^Tv$, the transposed-Jacobian-vector product at x.

source
`NLPModels.jtprod!` (Function)
Jtv = jtprod!(nlp, x, v, Jtv)

Evaluate $J(x)^Tv$, the transposed-Jacobian-vector product at x in place. If the problem has linear and nonlinear constraints, this function allocates.

source
Jtv = jtprod!(nlp, rows, cols, vals, v, Jtv)

Evaluate $J(x)^Tv$, the transposed-Jacobian-vector product, where the Jacobian is given by (rows, cols, vals) in triplet format.

source
`NLPModels.jtprod_lin!` (Function)
Jtv = jtprod_lin!(nlp, x, v, Jtv)

Evaluate $J(x)^Tv$, the linear transposed-Jacobian-vector product at x in place.

source
`NLPModels.jtprod_nln` (Function)
Jtv = jtprod_nln(nlp, x, v, Jtv)

Evaluate $J(x)^Tv$, the nonlinear transposed-Jacobian-vector product at x.

source
`NLPModels.jtprod_nln!` (Function)
Jtv = jtprod_nln!(nlp, x, v, Jtv)

Evaluate $J(x)^Tv$, the nonlinear transposed-Jacobian-vector product at x in place.

source
`NLPModels.jth_hprod` (Function)
Hv = jth_hprod(nlp, x, v, j)

Evaluate the product of the Hessian of the j-th constraint at x with the vector v.

source
`NLPModels.jth_hprod!` (Function)
Hv = jth_hprod!(nlp, x, v, j, Hv)

Evaluate the product of the Hessian of the j-th constraint at x with the vector v in place.

source
`NLPModels.jth_hess` (Function)

Hx = jth_hess(nlp, x, j)

Evaluate the Hessian of the j-th constraint at x as a sparse matrix with the same sparsity pattern as the Lagrangian Hessian. A Symmetric object wrapping the lower triangle is returned.

source
`NLPModels.jth_hess_coord` (Function)
vals = jth_hess_coord(nlp, x, j)

Evaluate the Hessian of the j-th constraint at x in sparse coordinate format. Only the lower triangle is returned.

source
`NLPModels.jth_hess_coord!` (Function)
vals = jth_hess_coord!(nlp, x, j, vals)

Evaluate the Hessian of the j-th constraint at x in sparse coordinate format, with vals of length nlp.meta.nnzh, in place. Only the lower triangle is returned.

source
`NLPModels.hess_coord` (Function)
vals = hess_coord(nlp, x; obj_weight=1.0)

Evaluate the objective Hessian at x in sparse coordinate format, with objective function scaled by obj_weight, i.e.,

$$σ ∇²f(x),$$

with σ = obj_weight. Only the lower triangle is returned.

source
vals = hess_coord(nlp, x, y; obj_weight=1.0)

Evaluate the Lagrangian Hessian at (x,y) in sparse coordinate format, with objective function scaled by obj_weight, i.e.,

$$∇²L(x,y) = σ ∇²f(x) + \sum_i yᵢ ∇²cᵢ(x),$$

with σ = obj_weight. Only the lower triangle is returned.

source
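In practice the coordinate functions pair with the structure functions: (rows, cols) is computed once, and only vals is refreshed at each new point. A sketch that assembles the sparse objective Hessian, assuming an ADNLPModels.jl model:

```julia
using NLPModels, ADNLPModels, LinearAlgebra, SparseArrays

nlp = ADNLPModel(x -> x[1]^4 + x[2]^2, ones(2))

rows, cols = hess_structure(nlp)          # sparsity pattern, computed once
vals = hess_coord(nlp, nlp.meta.x0)       # lower-triangle values at x0
n = nlp.meta.nvar
H = Symmetric(sparse(rows, cols, vals, n, n), :L)  # full symmetric view
```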
`NLPModels.hess_coord!` (Function)
vals = hess_coord!(nlp, x, y, vals; obj_weight=1.0)

Evaluate the Lagrangian Hessian at (x,y) in sparse coordinate format, with objective function scaled by obj_weight, i.e.,

$$∇²L(x,y) = σ ∇²f(x) + \sum_i yᵢ ∇²cᵢ(x),$$

with σ = obj_weight, overwriting vals. Only the lower triangle is returned.

source
`NLPModels.hess` (Function)
Hx = hess(nlp, x; obj_weight=1.0)

Evaluate the objective Hessian at x as a sparse matrix, with objective function scaled by obj_weight, i.e.,

$$σ ∇²f(x),$$

with σ = obj_weight. A Symmetric object wrapping the lower triangle is returned.

source
Hx = hess(nlp, x, y; obj_weight=1.0)

Evaluate the Lagrangian Hessian at (x,y) as a sparse matrix, with objective function scaled by obj_weight, i.e.,

$$∇²L(x,y) = σ ∇²f(x) + \sum_i yᵢ ∇²cᵢ(x),$$

with σ = obj_weight. A Symmetric object wrapping the lower triangle is returned.

source
`NLPModels.hess_op` (Function)
H = hess_op(nlp, x; obj_weight=1.0)

Return the objective Hessian at x with objective function scaled by obj_weight as a linear operator. The resulting object may be used as if it were a matrix, e.g., H * v. The linear operator H represents

$$σ ∇²f(x),$$

with σ = obj_weight.

source
H = hess_op(nlp, x, y; obj_weight=1.0)

Return the Lagrangian Hessian at (x,y) with objective function scaled by obj_weight as a linear operator. The resulting object may be used as if it were a matrix, e.g., H * v. The linear operator H represents

$$∇²L(x,y) = σ ∇²f(x) + \sum_i yᵢ ∇²cᵢ(x),$$

with σ = obj_weight.

source
`NLPModels.hess_op!` (Function)
H = hess_op!(nlp, x, Hv; obj_weight=1.0)

Return the objective Hessian at x with objective function scaled by obj_weight as a linear operator, and storing the result on Hv. The resulting object may be used as if it were a matrix, e.g., w = H * v. The vector Hv is used as preallocated storage for the operation. The linear operator H represents

$$σ ∇²f(x),$$

with σ = obj_weight.

source
H = hess_op!(nlp, rows, cols, vals, Hv)

Return the Hessian given by (rows, cols, vals) as a linear operator, and storing the result on Hv. The resulting object may be used as if it were a matrix, e.g., w = H * v. The vector Hv is used as preallocated storage for the operation. The linear operator H represents

$$σ ∇²f(x),$$

with σ = obj_weight.

source
H = hess_op!(nlp, x, y, Hv; obj_weight=1.0)

Return the Lagrangian Hessian at (x,y) with objective function scaled by obj_weight as a linear operator, and storing the result on Hv. The resulting object may be used as if it were a matrix, e.g., w = H * v. The vector Hv is used as preallocated storage for the operation. The linear operator H represents

$$∇²L(x,y) = σ ∇²f(x) + \sum_i yᵢ ∇²cᵢ(x),$$

with σ = obj_weight.

source
`NLPModels.hprod` (Function)
Hv = hprod(nlp, x, v; obj_weight=1.0)

Evaluate the product of the objective Hessian at x with the vector v, with objective function scaled by obj_weight, where the objective Hessian is

$$σ ∇²f(x),$$

with σ = obj_weight.

source
Hv = hprod(nlp, x, y, v; obj_weight=1.0)

Evaluate the product of the Lagrangian Hessian at (x,y) with the vector v, with objective function scaled by obj_weight, where the Lagrangian Hessian is

$$∇²L(x,y) = σ ∇²f(x) + \sum_i yᵢ ∇²cᵢ(x),$$

with σ = obj_weight.

source
`NLPModels.hprod!` (Function)
Hv = hprod!(nlp, x, y, v, Hv; obj_weight=1.0)

Evaluate the product of the Lagrangian Hessian at (x,y) with the vector v in place, with objective function scaled by obj_weight, where the Lagrangian Hessian is

$$∇²L(x,y) = σ ∇²f(x) + \sum_i yᵢ ∇²cᵢ(x),$$

with σ = obj_weight.

source
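hprod! is the building block for matrix-free methods (e.g., truncated conjugate gradient), where the Hessian is never formed. A sketch, assuming an ADNLPModels.jl model:

```julia
using NLPModels, ADNLPModels

nlp = ADNLPModel(x -> sum(x), ones(2), x -> [x[1] * x[2]], [0.0], [1.0])

x  = nlp.meta.x0
y  = ones(nlp.meta.ncon)      # Lagrange multipliers
v  = ones(nlp.meta.nvar)
Hv = similar(v)
hprod!(nlp, x, y, v, Hv; obj_weight = 1.0)  # Hv ← ∇²L(x,y) v, in place
```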
`NLPModels.reset_data!` (Function)
reset_data!(nlp)

Reset model data if appropriate. This method should be overloaded if a subtype of AbstractNLPModel contains data that should be reset, such as a quasi-Newton linear operator.

source

## AbstractNLSModel

`NLPModels.NLSCounters` (Type)
NLSCounters

Struct for storing the number of function evaluations for nonlinear least-squares models. NLSCounters also stores a Counters instance named counters.

NLSCounters()

Creates an empty NLSCounters struct.

source
`NLPModels.jac_coord_residual!` (Function)
vals = jac_coord_residual!(nls, x, vals)

Computes the Jacobian of the residual at x in sparse coordinate format, rewriting vals. rows and cols are not rewritten.

source
`NLPModels.jprod_residual!` (Function)
Jv = jprod_residual!(nls, x, v, Jv)

Computes the product of the Jacobian of the residual at x and a vector, i.e., $J(x)v$, storing it in Jv.

source
`NLPModels.jtprod_residual!` (Function)
Jtv = jtprod_residual!(nls, x, v, Jtv)

Computes the product of the transpose of the Jacobian of the residual at x and a vector, i.e., $J(x)^Tv$, storing it in Jtv.

source
`NLPModels.jac_op_residual!` (Function)
Jx = jac_op_residual!(nls, x, Jv, Jtv)

Computes $J(x)$, the Jacobian of the residual at x, in linear operator form. The vectors Jv and Jtv are used as preallocated storage for the operations.

source
Jx = jac_op_residual!(nls, rows, cols, vals, Jv, Jtv)

Computes $J(x)$, the Jacobian of the residual given by (rows, cols, vals), in linear operator form. The vectors Jv and Jtv are used as preallocated storage for the operations.

source
`NLPModels.hess_residual` (Function)
H = hess_residual(nls, x, v)

Computes the linear combination of the Hessians of the residuals at x with coefficients v. A Symmetric object wrapping the lower triangle is returned.

source
`NLPModels.hess_coord_residual` (Function)
vals = hess_coord_residual(nls, x, v)

Computes the linear combination of the Hessians of the residuals at x with coefficients v in sparse coordinate format.

source
`NLPModels.hess_coord_residual!` (Function)
vals = hess_coord_residual!(nls, x, v, vals)

Computes the linear combination of the Hessians of the residuals at x with coefficients v in sparse coordinate format, rewriting vals.

source
`NLPModels.hprod_residual!` (Function)
Hiv = hprod_residual!(nls, x, i, v, Hiv)

Computes the product of the Hessian of the i-th residual at x, times the vector v, and stores it in vector Hiv.

source
`NLPModels.hess_op_residual!` (Function)
Hop = hess_op_residual!(nls, x, i, Hiv)

Computes the Hessian of the i-th residual at x, in linear operator form. The vector Hiv is used as preallocated storage for the operation.

source