Nonlinear optimization with constraints
Description

Augmented Lagrangian Adaptive Barrier Minimization Algorithm for optimizing smooth nonlinear objective functions with constraints. Linear or nonlinear equality and inequality constraints are allowed.
Usage

constrOptim.nl(par, fn, gr = NULL, hin = NULL, hin.jac = NULL,
               heq = NULL, heq.jac = NULL, control.outer = list(),
               control.optim = list(), ...)
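As a quick orientation before the argument-by-argument description, here is a minimal sketch of a call. The objective fn0, the constraint function hin0, and the starting point are invented for illustration, and the sketch assumes the function is available as in the alabama package; only fn, hin, and a strictly feasible par are supplied, so no analytic gradient or Jacobian is required.

library(alabama)

# Toy problem: minimize a quadratic subject to x[j] > 0 for all j
fn0  <- function(x) sum((x - c(1, -2))^2)   # unconstrained minimum at (1, -2)
hin0 <- function(x) x                       # inequality constraints: each x[j] > 0

p0  <- c(0.5, 0.5)                          # strictly feasible starting point
res <- constrOptim.nl(par = p0, fn = fn0, hin = hin0)
res$par                                     # close to the constrained optimum (1, 0)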
Arguments

par: Starting vector of parameter values; the initial vector must be "feasible".

fn: Nonlinear objective function that is to be optimized. A scalar function that takes a real vector as argument and returns a scalar that is the value of the function at that point (see details).

gr: The gradient of the objective function.

hin: A vector function specifying inequality constraints such that hin[j] > 0 for all j.

hin.jac: Jacobian of hin.

heq: A vector function specifying equality constraints such that heq[j] = 0 for all j.

heq.jac: Jacobian of heq.

control.outer: A list of control parameters to be used by the outer loop in constrOptim.nl.

control.optim: A list of control parameters to be used by the unconstrained optimization algorithm in the inner loop. Identical to that used in optim.

...: Additional arguments passed to fn and the other user-supplied functions (see the sketch following this list).
Details

The argument control.outer is a list specifying any changes to default values of algorithm control parameters for the outer loop. Note that the names of these must be specified completely; partial matching will not work. The list items are as follows (a brief sketch of overriding them follows the list):
mu0: A scaling parameter for the barrier penalty for inequality constraints.

sig0: A scaling parameter for the augmented Lagrangian for equality constraints.

eps: Tolerance for convergence of the outer iterations of the barrier and/or augmented Lagrangian algorithm.

itmax: Maximum number of outer iterations.

trace: A logical variable indicating whether information on outer iterations should be printed out. If TRUE, at each outer iteration information is displayed on: (i) how well the inequality and equality constraints are satisfied, (ii) current parameter values, and (iii) the current objective function value.

method: Unconstrained optimization algorithm in optim() to be used; default is the "BFGS" variable metric method.

NMinit: A logical variable indicating whether the "Nelder-Mead" algorithm should be used in optim() for the first outer iteration.
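As referenced above, a small sketch of overriding some of these controls; the particular values are arbitrary and purely illustrative, the toy objective is repeated from the earlier sketch, and the control names are written out in full because partial matching is not supported.

library(alabama)

fn0  <- function(x) sum((x - c(1, -2))^2)   # toy objective from the earlier sketch
hin0 <- function(x) x                       # x[j] > 0

res <- constrOptim.nl(par = c(0.5, 0.5), fn = fn0, hin = hin0,
                      control.outer = list(itmax = 50, trace = FALSE,
                                           method = "BFGS", NMinit = TRUE),
                      control.optim = list(maxit = 200))   # passed on to optim()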
Value

A list with the following components (a short sketch of inspecting them follows the list):

par: Parameters that optimize the nonlinear objective function, satisfying constraints, if convergence is successful.

value: The value of the objective function at termination.

convergence: An integer code indicating the type of convergence.

message: Text message indicating the type of convergence or failure.

outer.iterations: Number of outer iterations.

lambda: Value of the augmented Lagrangian penalty parameter.

sigma: Value of the augmented Lagrangian penalty parameter for the quadratic term.

barrier.value: Reduction in the value of the function from its initial value. This is negative in maximization.

K: Residual norm of the equality constraints. Must be small at convergence.

counts: A vector of length 2 denoting the number of times the objective fn and the gradient gr were evaluated, respectively.
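A brief sketch of inspecting these components after a call; the toy problem is repeated from the first sketch, and treating a convergence code of 0 as success mirrors the convention of optim() and is an assumption here.

library(alabama)

fn0  <- function(x) sum((x - c(1, -2))^2)
hin0 <- function(x) x                       # x[j] > 0
res  <- constrOptim.nl(par = c(0.5, 0.5), fn = fn0, hin = hin0)

if (res$convergence == 0) {                 # assumed success code, as in optim()
  print(res$par)                            # constrained optimum
  print(res$value)                          # objective value at that point
  print(res$counts)                         # objective / gradient evaluation counts
  print(res$outer.iterations)               # number of outer iterations
} else {
  print(res$message)                        # reason the run stopped
}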
Author(s)

Ravi Varadhan, Center on Aging and Health, Johns Hopkins University.
References

Lange, K. (2004). Optimization. Springer.

Madsen, K., Nielsen, H. B., and Tingleff, O. (2004). Optimization With Constraints. IMM, Technical University of Denmark.
See Also

auglag, constrOptim.
Examples

# Objective with a linear equality, a nonlinear inequality, and nonnegativity constraints
fn <- function(x) (x[1] + 3*x[2] + x[3])^2 + 4 * (x[1] - x[2])^2

gr <- function(x) {
  g <- rep(NA, 3)
  g[1] <- 2*(x[1] + 3*x[2] + x[3]) + 8*(x[1] - x[2])
  g[2] <- 6*(x[1] + 3*x[2] + x[3]) - 8*(x[1] - x[2])
  g[3] <- 2*(x[1] + 3*x[2] + x[3])
  g
}

heq <- function(x) {
  h <- rep(NA, 1)
  h[1] <- x[1] + x[2] + x[3] - 1       # equality: x1 + x2 + x3 = 1
  h
}

heq.jac <- function(x) {
  j <- matrix(NA, 1, length(x))
  j[1, ] <- c(1, 1, 1)
  j
}

hin <- function(x) {
  h <- rep(NA, 4)
  h[1] <- 6*x[2] + 4*x[3] - x[1]^3 - 3  # nonlinear inequality, must stay > 0
  h[2] <- x[1]
  h[3] <- x[2]
  h[4] <- x[3]
  h
}

hin.jac <- function(x) {
  j <- matrix(NA, 4, length(x))
  j[1, ] <- c(-3*x[1]^2, 6, 4)
  j[2, ] <- c(1, 0, 0)
  j[3, ] <- c(0, 1, 0)
  j[4, ] <- c(0, 0, 1)
  j
}

set.seed(12)
p0 <- runif(3)
ans <- constrOptim.nl(par = p0, fn = fn, gr = gr,
                      heq = heq, heq.jac = heq.jac,
                      hin = hin, hin.jac = hin.jac)

# Not specifying the gradient and the Jacobians
set.seed(12)
p0 <- runif(3)
ans2 <- constrOptim.nl(par = p0, fn = fn, heq = heq, hin = hin)