API
As stated on the Home page, we consider the nonlinear optimization problem in the following format:
\[\begin{aligned} \min \quad & f(x) \\ & c_L \leq c(x) \leq c_U \\ & \ell \leq x \leq u. \end{aligned}\]
To develop an optimization algorithm, we are usually concerned not only with $f(x)$ and $c(x)$, but also with their derivatives. Namely,
- $\nabla f(x)$, the gradient of $f$ at the point $x$;
- $\nabla^2 f(x)$, the Hessian of $f$ at the point $x$;
- $J(x) = \nabla c(x)^T$, the Jacobian of $c$ at the point $x$;
- $\nabla^2 f(x) + \sum_{i=1}^m \lambda_i \nabla^2 c_i(x)$, the Hessian of the Lagrangian function at the point $(x,\lambda)$.
There are many ways to access some of these values, so here is a little reference guide.
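To make these quantities concrete, here is a small hand-coded example in plain Julia (not the NLPModels API itself) for $f(x) = x_1^2 + x_2^2$ with the single constraint $c_1(x) = x_1 x_2$:

```julia
using LinearAlgebra

# Hand-coded problem data: f(x) = x1^2 + x2^2, c1(x) = x1 * x2
f(x) = x[1]^2 + x[2]^2
∇f(x) = [2x[1], 2x[2]]
∇²f(x) = [2.0 0.0; 0.0 2.0]
c(x) = [x[1] * x[2]]
J(x) = [x[2] x[1]]             # Jacobian of c: 1×2
∇²c1(x) = [0.0 1.0; 1.0 0.0]   # Hessian of the single constraint

x = [1.0, 2.0]
λ = [3.0]
# Hessian of the Lagrangian: ∇²f(x) + Σᵢ λᵢ ∇²cᵢ(x)
HL = ∇²f(x) + λ[1] * ∇²c1(x)
```

For this point, HL is [2.0 3.0; 3.0 2.0]: the objective Hessian plus the multiplier-weighted constraint Hessian.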
Reference guide
The following naming should be easy enough to follow. If not, click on the link and go to the description.
- ! means in place;
- _coord means coordinate format;
- prod means matrix-vector product;
- _op means operator (as in LinearOperators.jl);
- _lin and _nln refer to linear and nonlinear constraints, respectively.
Feel free to open an issue to suggest other methods that should apply to all NLPModels instances.
Function | NLPModels function |
---|---|
$f(x)$ | obj , objgrad , objgrad! , objcons , objcons! |
$\nabla f(x)$ | grad , grad! , objgrad , objgrad! |
$\nabla^2 f(x)$ | hess , hess_op , hess_op! , hess_coord , hess_coord! , hess_structure , hess_structure! , hprod , hprod! |
$c(x)$ | cons_lin , cons_lin! , cons_nln , cons_nln! , cons , cons! , objcons , objcons! |
$J(x)$ | jac_lin , jac_nln , jac , jac_lin_op , jac_lin_op! , jac_nln_op , jac_nln_op! , jac_op , jac_op! , jac_lin_coord , jac_lin_coord! , jac_nln_coord , jac_nln_coord! , jac_coord , jac_coord! , jac_lin_structure , jac_lin_structure! , jac_nln_structure , jac_nln_structure! , jac_structure , jac_structure! , jprod_lin , jprod_lin! , jprod_nln , jprod_nln! , jprod , jprod! , jtprod_lin , jtprod_lin! , jtprod_nln , jtprod_nln! , jtprod , jtprod! |
$\nabla^2 L(x,y)$ | hess , hess_op , hess_coord , hess_coord! , hess_structure , hess_structure! , hprod , hprod! , jth_hprod , jth_hprod! , jth_hess , jth_hess_coord , jth_hess_coord! , ghjvprod , ghjvprod! |
API for NLSModels
For Nonlinear Least Squares models, $f(x) = \tfrac{1}{2} \Vert F(x)\Vert^2$, and these models have additional functions to access the residual value and its derivatives. Namely,
- $J_F(x) = \nabla F(x)^T$
- $\nabla^2 F_i(x)$
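These functions exploit the chain-rule identity $\nabla f(x) = J_F(x)^T F(x)$. A quick hand-coded check in plain Julia (outside the API), for $F(x) = [x_1 - 1,\; x_2 - x_1^2]$:

```julia
using LinearAlgebra

F(x) = [x[1] - 1.0, x[2] - x[1]^2]
JF(x) = [1.0 0.0; -2x[1] 1.0]    # Jacobian of the residual

f(x) = 0.5 * dot(F(x), F(x))     # f(x) = ½‖F(x)‖²
∇f(x) = JF(x)' * F(x)            # gradient via the residual Jacobian

x = [2.0, 3.0]
g = ∇f(x)
```

At x = (2, 3) the residual is (1, -1) and the gradient works out to (5, -1), matching direct differentiation of f.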
AbstractNLPModel functions
NLPModels.obj — Function
f = obj(nlp, x)
Evaluate $f(x)$, the objective function of nlp at x.

NLPModels.grad — Function
g = grad(nlp, x)
Evaluate $∇f(x)$, the gradient of the objective function at x.

NLPModels.grad! — Function
g = grad!(nlp, x, g)
Evaluate $∇f(x)$, the gradient of the objective function at x, in place.

NLPModels.objgrad — Function
f, g = objgrad(nlp, x)
Evaluate $f(x)$ and $∇f(x)$ at x.

NLPModels.objgrad! — Function
f, g = objgrad!(nlp, x, g)
Evaluate $f(x)$ and $∇f(x)$ at x. g is overwritten with the value of $∇f(x)$.

f, g = objgrad!(nls, x, g)
f, g = objgrad!(nls, x, g, Fx)
Evaluate $f(x)$ and $∇f(x)$ of nls::AbstractNLSModel at x. Fx is overwritten with the value of the residual $F(x)$.
NLPModels.cons — Function
c = cons(nlp, x)
Evaluate $c(x)$, the constraints at x.

NLPModels.cons! — Function
c = cons!(nlp, x, c)
Evaluate $c(x)$, the constraints at x, in place.

NLPModels.cons_lin — Function
c = cons_lin(nlp, x)
Evaluate the linear constraints at x.

NLPModels.cons_lin! — Function
c = cons_lin!(nlp, x, c)
Evaluate the linear constraints at x, in place.

NLPModels.cons_nln — Function
c = cons_nln(nlp, x)
Evaluate the nonlinear constraints at x.

NLPModels.cons_nln! — Function
c = cons_nln!(nlp, x, c)
Evaluate the nonlinear constraints at x, in place.

NLPModels.objcons — Function
f, c = objcons(nlp, x)
Evaluate $f(x)$ and $c(x)$ at x.

NLPModels.objcons! — Function
f, c = objcons!(nlp, x, c)
Evaluate $f(x)$ and $c(x)$ at x. c is overwritten with the value of $c(x)$.
NLPModels.jac_coord — Function
vals = jac_coord(nlp, x)
Evaluate $J(x)$, the constraints Jacobian at x, in sparse coordinate format.

NLPModels.jac_coord! — Function
vals = jac_coord!(nlp, x, vals)
Evaluate $J(x)$, the constraints Jacobian at x, in sparse coordinate format, overwriting vals.

NLPModels.jac_lin_coord — Function
vals = jac_lin_coord(nlp, x)
Evaluate $J(x)$, the linear constraints Jacobian at x, in sparse coordinate format.

NLPModels.jac_lin_coord! — Function
vals = jac_lin_coord!(nlp, x, vals)
Evaluate $J(x)$, the linear constraints Jacobian at x, in sparse coordinate format, overwriting vals.

NLPModels.jac_nln_coord — Function
vals = jac_nln_coord(nlp, x)
Evaluate $J(x)$, the nonlinear constraints Jacobian at x, in sparse coordinate format.

NLPModels.jac_nln_coord! — Function
vals = jac_nln_coord!(nlp, x, vals)
Evaluate $J(x)$, the nonlinear constraints Jacobian at x, in sparse coordinate format, overwriting vals.
NLPModels.jac_structure — Function
(rows, cols) = jac_structure(nlp)
Return the structure of the constraints Jacobian in sparse coordinate format.

NLPModels.jac_structure! — Function
jac_structure!(nlp, rows, cols)
Return the structure of the constraints Jacobian in sparse coordinate format, in place.

NLPModels.jac_lin_structure — Function
(rows, cols) = jac_lin_structure(nlp)
Return the structure of the linear constraints Jacobian in sparse coordinate format.

NLPModels.jac_lin_structure! — Function
jac_lin_structure!(nlp, rows, cols)
Return the structure of the linear constraints Jacobian in sparse coordinate format, in place.

NLPModels.jac_nln_structure — Function
(rows, cols) = jac_nln_structure(nlp)
Return the structure of the nonlinear constraints Jacobian in sparse coordinate format.

NLPModels.jac_nln_structure! — Function
jac_nln_structure!(nlp, rows, cols)
Return the structure of the nonlinear constraints Jacobian in sparse coordinate format, in place.
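The structure and coordinate functions are designed to be combined: the sparsity pattern is fixed, so you query it once and then only refresh the values at each iterate. Assembly is just the SparseArrays triplet constructor. A minimal sketch with hand-made triplets (the rows/cols/vals here are placeholders for what jac_structure and jac_coord would return):

```julia
using SparseArrays

# Pretend these came from jac_structure(nlp) and jac_coord(nlp, x)
rows = [1, 1, 2]
cols = [1, 2, 2]
vals = [4.0, 1.0, -3.0]
m, n = 2, 2   # number of constraints and variables (nlp.meta.ncon, nlp.meta.nvar)

Jx = sparse(rows, cols, vals, m, n)   # same matrix jac(nlp, x) would build
```

In a solver loop, only vals changes between iterates, so jac_coord! can refill a preallocated buffer without reallocating the pattern.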
NLPModels.jac — Function
Jx = jac(nlp, x)
Evaluate $J(x)$, the constraints Jacobian at x, as a sparse matrix.

NLPModels.jac_lin — Function
Jx = jac_lin(nlp, x)
Evaluate $J(x)$, the linear constraints Jacobian at x, as a sparse matrix.

NLPModels.jac_nln — Function
Jx = jac_nln(nlp, x)
Evaluate $J(x)$, the nonlinear constraints Jacobian at x, as a sparse matrix.

NLPModels.jac_op — Function
J = jac_op(nlp, x)
Return the Jacobian at x as a linear operator. The resulting object may be used as if it were a matrix, e.g., J * v or J' * v.
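The operator returned by jac_op dispatches J * v to jprod and J' * v to jtprod, so the Jacobian matrix is never formed. The idea can be sketched with a toy closure-based type (illustrative only; the real implementation lives in LinearOperators.jl):

```julia
# A toy matrix-free operator: stores product functions, not a matrix
struct ToyOp
    prod::Function     # v -> J * v
    tprod::Function    # v -> J' * v
end
Base.:*(op::ToyOp, v::Vector) = op.prod(v)

# For J(x) = [x2 x1] at x = (1, 2): J*v = x2*v1 + x1*v2
x = [1.0, 2.0]
J = ToyOp(v -> [x[2] * v[1] + x[1] * v[2]],
          w -> [x[2] * w[1], x[1] * w[1]])

Jv  = J * [1.0, 1.0]    # forward product
Jtv = J.tprod([1.0])    # transposed product, without forming J
```

This matters when the Jacobian is large and sparse matrix assembly would dominate the cost, e.g., inside Krylov methods that only need products.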
NLPModels.jac_op! — Function
J = jac_op!(nlp, x, Jv, Jtv)
Return the Jacobian at x as a linear operator. The resulting object may be used as if it were a matrix, e.g., J * v or J' * v. The vectors Jv and Jtv are used as preallocated storage for the operations.

J = jac_op!(nlp, rows, cols, vals, Jv, Jtv)
Return the Jacobian given by (rows, cols, vals) as a linear operator. The resulting object may be used as if it were a matrix, e.g., J * v or J' * v. The vectors Jv and Jtv are used as preallocated storage for the operations.

NLPModels.jac_lin_op — Function
J = jac_lin_op(nlp, x)
Return the linear constraints Jacobian at x as a linear operator. The resulting object may be used as if it were a matrix, e.g., J * v or J' * v.

NLPModels.jac_lin_op! — Function
J = jac_lin_op!(nlp, x, Jv, Jtv)
Return the linear constraints Jacobian at x as a linear operator. The resulting object may be used as if it were a matrix, e.g., J * v or J' * v. The vectors Jv and Jtv are used as preallocated storage for the operations.

J = jac_lin_op!(nlp, rows, cols, vals, Jv, Jtv)
Return the linear constraints Jacobian given by (rows, cols, vals) as a linear operator. The resulting object may be used as if it were a matrix, e.g., J * v or J' * v. The vectors Jv and Jtv are used as preallocated storage for the operations.

NLPModels.jac_nln_op — Function
J = jac_nln_op(nlp, x)
Return the nonlinear constraints Jacobian at x as a linear operator. The resulting object may be used as if it were a matrix, e.g., J * v or J' * v.

NLPModels.jac_nln_op! — Function
J = jac_nln_op!(nlp, x, Jv, Jtv)
Return the nonlinear constraints Jacobian at x as a linear operator. The resulting object may be used as if it were a matrix, e.g., J * v or J' * v. The vectors Jv and Jtv are used as preallocated storage for the operations.

J = jac_nln_op!(nlp, rows, cols, vals, Jv, Jtv)
Return the nonlinear constraints Jacobian given by (rows, cols, vals) as a linear operator. The resulting object may be used as if it were a matrix, e.g., J * v or J' * v. The vectors Jv and Jtv are used as preallocated storage for the operations.
NLPModels.jprod — Function
Jv = jprod(nlp, x, v)
Evaluate $J(x)v$, the Jacobian-vector product at x.

NLPModels.jprod! — Function
Jv = jprod!(nlp, x, v, Jv)
Evaluate $J(x)v$, the Jacobian-vector product at x, in place.

Jv = jprod!(nlp, rows, cols, vals, v, Jv)
Evaluate $J(x)v$, the Jacobian-vector product, where the Jacobian is given by (rows, cols, vals) in triplet format.

NLPModels.jprod_lin — Function
Jv = jprod_lin(nlp, x, v)
Evaluate $J(x)v$, the linear Jacobian-vector product at x.

NLPModels.jprod_lin! — Function
Jv = jprod_lin!(nlp, x, v, Jv)
Evaluate $J(x)v$, the linear Jacobian-vector product at x, in place.

NLPModels.jprod_nln — Function
Jv = jprod_nln(nlp, x, v)
Evaluate $J(x)v$, the nonlinear Jacobian-vector product at x.

NLPModels.jprod_nln! — Function
Jv = jprod_nln!(nlp, x, v, Jv)
Evaluate $J(x)v$, the nonlinear Jacobian-vector product at x, in place.

NLPModels.jtprod — Function
Jtv = jtprod(nlp, x, v)
Evaluate $J(x)^Tv$, the transposed-Jacobian-vector product at x.

NLPModels.jtprod! — Function
Jtv = jtprod!(nlp, x, v, Jtv)
Evaluate $J(x)^Tv$, the transposed-Jacobian-vector product at x, in place. If the problem has both linear and nonlinear constraints, this function allocates.

Jtv = jtprod!(nlp, rows, cols, vals, v, Jtv)
Evaluate $J(x)^Tv$, the transposed-Jacobian-vector product, where the Jacobian is given by (rows, cols, vals) in triplet format.

NLPModels.jtprod_lin — Function
Jtv = jtprod_lin(nlp, x, v)
Evaluate $J(x)^Tv$, the linear transposed-Jacobian-vector product at x.

NLPModels.jtprod_lin! — Function
Jtv = jtprod_lin!(nlp, x, v, Jtv)
Evaluate $J(x)^Tv$, the linear transposed-Jacobian-vector product at x, in place.

NLPModels.jtprod_nln — Function
Jtv = jtprod_nln(nlp, x, v)
Evaluate $J(x)^Tv$, the nonlinear transposed-Jacobian-vector product at x.

NLPModels.jtprod_nln! — Function
Jtv = jtprod_nln!(nlp, x, v, Jtv)
Evaluate $J(x)^Tv$, the nonlinear transposed-Jacobian-vector product at x, in place.
NLPModels.jth_hprod — Function
Hv = jth_hprod(nlp, x, v, j)
Evaluate the product of the Hessian of the j-th constraint at x with the vector v.

NLPModels.jth_hprod! — Function
Hv = jth_hprod!(nlp, x, v, j, Hv)
Evaluate the product of the Hessian of the j-th constraint at x with the vector v, in place.

NLPModels.jth_hess — Function
Hx = jth_hess(nlp, x, j)
Evaluate the Hessian of the j-th constraint at x as a sparse matrix with the same sparsity pattern as the Lagrangian Hessian. A Symmetric object wrapping the lower triangle is returned.

NLPModels.jth_hess_coord — Function
vals = jth_hess_coord(nlp, x, j)
Evaluate the Hessian of the j-th constraint at x in sparse coordinate format. Only the lower triangle is returned.

NLPModels.jth_hess_coord! — Function
vals = jth_hess_coord!(nlp, x, j, vals)
Evaluate the Hessian of the j-th constraint at x in sparse coordinate format, with vals of length nlp.meta.nnzh, in place. Only the lower triangle is returned.

NLPModels.ghjvprod — Function
gHv = ghjvprod(nlp, x, g, v)
Return the vector whose i-th component is gᵀ ∇²cᵢ(x) v.

NLPModels.ghjvprod! — Function
ghjvprod!(nlp, x, g, v, gHv)
Compute the vector whose i-th component is gᵀ ∇²cᵢ(x) v, in place, overwriting gHv.
NLPModels.hess_coord — Function
vals = hess_coord(nlp, x; obj_weight=1.0)
Evaluate the objective Hessian at x in sparse coordinate format, with the objective function scaled by obj_weight, i.e.,
\[σ ∇²f(x),\]
with σ = obj_weight. Only the lower triangle is returned.

vals = hess_coord(nlp, x, y; obj_weight=1.0)
Evaluate the Lagrangian Hessian at (x,y) in sparse coordinate format, with the objective function scaled by obj_weight, i.e.,
\[∇²L(x,y) = σ ∇²f(x) + \sum_i yᵢ ∇²cᵢ(x),\]
with σ = obj_weight. Only the lower triangle is returned.

NLPModels.hess_coord! — Function
vals = hess_coord!(nlp, x, y, vals; obj_weight=1.0)
Evaluate the Lagrangian Hessian at (x,y) in sparse coordinate format, with the objective function scaled by obj_weight, i.e.,
\[∇²L(x,y) = σ ∇²f(x) + \sum_i yᵢ ∇²cᵢ(x),\]
with σ = obj_weight, overwriting vals. Only the lower triangle is returned.

NLPModels.hess_structure — Function
(rows, cols) = hess_structure(nlp)
Return the structure of the Lagrangian Hessian in sparse coordinate format.

NLPModels.hess_structure! — Function
hess_structure!(nlp, rows, cols)
Return the structure of the Lagrangian Hessian in sparse coordinate format, in place.
NLPModels.hess — Function
Hx = hess(nlp, x; obj_weight=1.0)
Evaluate the objective Hessian at x as a sparse matrix, with the objective function scaled by obj_weight, i.e.,
\[σ ∇²f(x),\]
with σ = obj_weight. A Symmetric object wrapping the lower triangle is returned.

Hx = hess(nlp, x, y; obj_weight=1.0)
Evaluate the Lagrangian Hessian at (x,y) as a sparse matrix, with the objective function scaled by obj_weight, i.e.,
\[∇²L(x,y) = σ ∇²f(x) + \sum_i yᵢ ∇²cᵢ(x),\]
with σ = obj_weight. A Symmetric object wrapping the lower triangle is returned.
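Since the Hessian is symmetric, only its lower triangle is stored; the Symmetric wrapper from LinearAlgebra makes it behave as the full matrix. The wrapping itself is plain Julia:

```julia
using LinearAlgebra, SparseArrays

# Lower triangle of the symmetric matrix [2 3; 3 2], in triplet form
L = sparse([1, 2, 2], [1, 1, 2], [2.0, 3.0, 2.0], 2, 2)

H = Symmetric(L, :L)   # reads the lower triangle and mirrors it
```

Indexing H above the diagonal (e.g., H[1, 2]) transparently returns the mirrored entry, so downstream code can ignore the storage convention.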
NLPModels.hess_op — Function
H = hess_op(nlp, x; obj_weight=1.0)
Return the objective Hessian at x, with the objective function scaled by obj_weight, as a linear operator. The resulting object may be used as if it were a matrix, e.g., H * v. The linear operator H represents
\[σ ∇²f(x),\]
with σ = obj_weight.

H = hess_op(nlp, x, y; obj_weight=1.0)
Return the Lagrangian Hessian at (x,y), with the objective function scaled by obj_weight, as a linear operator. The resulting object may be used as if it were a matrix, e.g., H * v. The linear operator H represents
\[∇²L(x,y) = σ ∇²f(x) + \sum_i yᵢ ∇²cᵢ(x),\]
with σ = obj_weight.
NLPModels.hess_op! — Function
H = hess_op!(nlp, x, Hv; obj_weight=1.0)
Return the objective Hessian at x, with the objective function scaled by obj_weight, as a linear operator, storing the result in Hv. The resulting object may be used as if it were a matrix, e.g., w = H * v. The vector Hv is used as preallocated storage for the operation. The linear operator H represents
\[σ ∇²f(x),\]
with σ = obj_weight.

H = hess_op!(nlp, rows, cols, vals, Hv)
Return the Hessian given by (rows, cols, vals) as a linear operator, storing the result in Hv. The resulting object may be used as if it were a matrix, e.g., w = H * v. The vector Hv is used as preallocated storage for the operation.

H = hess_op!(nlp, x, y, Hv; obj_weight=1.0)
Return the Lagrangian Hessian at (x,y), with the objective function scaled by obj_weight, as a linear operator, storing the result in Hv. The resulting object may be used as if it were a matrix, e.g., w = H * v. The vector Hv is used as preallocated storage for the operation. The linear operator H represents
\[∇²L(x,y) = σ ∇²f(x) + \sum_i yᵢ ∇²cᵢ(x),\]
with σ = obj_weight.
NLPModels.hprod — Function
Hv = hprod(nlp, x, v; obj_weight=1.0)
Evaluate the product of the objective Hessian at x with the vector v, with the objective function scaled by obj_weight, where the objective Hessian is
\[σ ∇²f(x),\]
with σ = obj_weight.

Hv = hprod(nlp, x, y, v; obj_weight=1.0)
Evaluate the product of the Lagrangian Hessian at (x,y) with the vector v, with the objective function scaled by obj_weight, where the Lagrangian Hessian is
\[∇²L(x,y) = σ ∇²f(x) + \sum_i yᵢ ∇²cᵢ(x),\]
with σ = obj_weight.

NLPModels.hprod! — Function
Hv = hprod!(nlp, x, y, v, Hv; obj_weight=1.0)
Evaluate the product of the Lagrangian Hessian at (x,y) with the vector v, in place, with the objective function scaled by obj_weight, where the Lagrangian Hessian is
\[∇²L(x,y) = σ ∇²f(x) + \sum_i yᵢ ∇²cᵢ(x),\]
with σ = obj_weight.
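The obj_weight convention is the same across hess, hess_coord, hess_op and hprod: the quantity involved is always σ∇²f(x) + Σᵢ yᵢ∇²cᵢ(x). Here is the product worked out by hand for the tiny problem f(x) = x₁² + x₂², c₁(x) = x₁x₂ (plain Julia, not the API):

```julia
using LinearAlgebra

∇²f  = [2.0 0.0; 0.0 2.0]   # Hessian of f(x) = x1^2 + x2^2 (constant)
∇²c1 = [0.0 1.0; 1.0 0.0]   # Hessian of c1(x) = x1 * x2

σ = 0.5                      # obj_weight
y = [3.0]                    # Lagrange multiplier
v = [1.0, 1.0]

# What hprod(nlp, x, y, v; obj_weight = σ) computes, written explicitly:
Hv = (σ * ∇²f + y[1] * ∇²c1) * v
```

Solvers often pass obj_weight = 0 to get the pure constraints part, or scale it during feasibility phases; the formula above makes both cases explicit.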
LinearOperators.reset! — Function
reset!(counters)
Reset evaluation counters.

reset!(nlp)
Reset the evaluation counters in nlp.

NLPModels.reset_data! — Function
reset_data!(nlp)
Reset model data if appropriate. This method should be overloaded if a subtype of AbstractNLPModel contains data that should be reset, such as a quasi-Newton linear operator.
AbstractNLSModel
NLPModels.NLSCounters — Type
NLSCounters
Struct for storing the number of function evaluations of nonlinear least-squares models. NLSCounters also stores a Counters instance named counters.

NLSCounters()
Creates an empty NLSCounters struct.
NLPModels.residual — Function
Fx = residual(nls, x)
Computes $F(x)$, the residual at x.

NLPModels.residual! — Function
Fx = residual!(nls, x, Fx)
Computes $F(x)$, the residual at x, in place.

NLPModels.jac_residual — Function
Jx = jac_residual(nls, x)
Computes $J(x)$, the Jacobian of the residual at x.

NLPModels.jac_coord_residual — Function
(rows, cols, vals) = jac_coord_residual(nls, x)
Computes the Jacobian of the residual at x in sparse coordinate format.

NLPModels.jac_coord_residual! — Function
vals = jac_coord_residual!(nls, x, vals)
Computes the Jacobian of the residual at x in sparse coordinate format, overwriting vals. rows and cols are not rewritten.
NLPModels.jac_structure_residual — Function
(rows, cols) = jac_structure_residual(nls)
Returns the structure of the residual Jacobian in sparse coordinate format.

NLPModels.jac_structure_residual! — Function
(rows, cols) = jac_structure_residual!(nls, rows, cols)
Returns the structure of the residual Jacobian in sparse coordinate format, in place.
NLPModels.jprod_residual — Function
Jv = jprod_residual(nls, x, v)
Computes the product of the Jacobian of the residual at x and a vector, i.e., $J(x)v$.

NLPModels.jprod_residual! — Function
Jv = jprod_residual!(nls, x, v, Jv)
Computes the product of the Jacobian of the residual at x and a vector, i.e., $J(x)v$, storing it in Jv.

NLPModels.jtprod_residual — Function
Jtv = jtprod_residual(nls, x, v)
Computes the product of the transpose of the Jacobian of the residual at x and a vector, i.e., $J(x)^Tv$.

NLPModels.jtprod_residual! — Function
Jtv = jtprod_residual!(nls, x, v, Jtv)
Computes the product of the transpose of the Jacobian of the residual at x and a vector, i.e., $J(x)^Tv$, storing it in Jtv.

NLPModels.jac_op_residual — Function
Jx = jac_op_residual(nls, x)
Computes $J(x)$, the Jacobian of the residual at x, in linear operator form.

NLPModels.jac_op_residual! — Function
Jx = jac_op_residual!(nls, x, Jv, Jtv)
Computes $J(x)$, the Jacobian of the residual at x, in linear operator form. The vectors Jv and Jtv are used as preallocated storage for the operations.

Jx = jac_op_residual!(nls, rows, cols, vals, Jv, Jtv)
Computes $J(x)$, the Jacobian of the residual given by (rows, cols, vals), in linear operator form. The vectors Jv and Jtv are used as preallocated storage for the operations.
NLPModels.hess_residual — Function
H = hess_residual(nls, x, v)
Computes the linear combination of the Hessians of the residuals at x with coefficients v. A Symmetric object wrapping the lower triangle is returned.

NLPModels.hess_coord_residual — Function
vals = hess_coord_residual(nls, x, v)
Computes the linear combination of the Hessians of the residuals at x with coefficients v, in sparse coordinate format.

NLPModels.hess_coord_residual! — Function
vals = hess_coord_residual!(nls, x, v, vals)
Computes the linear combination of the Hessians of the residuals at x with coefficients v, in sparse coordinate format, overwriting vals.

NLPModels.hess_structure_residual — Function
(rows, cols) = hess_structure_residual(nls)
Returns the structure of the residual Hessian.

NLPModels.hess_structure_residual! — Function
hess_structure_residual!(nls, rows, cols)
Returns the structure of the residual Hessian, in place.
NLPModels.jth_hess_residual — Function
Hj = jth_hess_residual(nls, x, j)
Computes the Hessian of the j-th residual at x.

NLPModels.hprod_residual — Function
Hiv = hprod_residual(nls, x, i, v)
Computes the product of the Hessian of the i-th residual at x, times the vector v.

NLPModels.hprod_residual! — Function
Hiv = hprod_residual!(nls, x, i, v, Hiv)
Computes the product of the Hessian of the i-th residual at x, times the vector v, and stores it in vector Hiv.

NLPModels.hess_op_residual — Function
Hop = hess_op_residual(nls, x, i)
Computes the Hessian of the i-th residual at x, in linear operator form.

NLPModels.hess_op_residual! — Function
Hop = hess_op_residual!(nls, x, i, Hiv)
Computes the Hessian of the i-th residual at x, in linear operator form. The vector Hiv is used as preallocated storage for the operation.