
dfoptim-package

Derivative-Free Optimization


Description

Derivative-Free optimization algorithms. These algorithms do not require gradient information. More importantly, they can be used to solve non-smooth optimization problems. They can also handle box constraints on parameters.
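
As an illustration of the non-smooth case, the sketch below minimizes a sum of absolute deviations with the package's Nelder-Mead routine. This is a minimal sketch, not an example from this page; the function name nmk() and its par/fn arguments are assumed from the package's exported interface.

library(dfoptim)

## Non-smooth objective: sum of absolute deviations from a fixed target
fn <- function(x) sum(abs(x - c(1, -2, 3)))

ans <- nmk(par = c(0, 0, 0), fn = fn)   # unconstrained Nelder-Mead
ans$par    # approximate minimizer, close to c(1, -2, 3)
ans$value  # objective value at the solution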

Details

Package: dfoptim
Type: Package
Version: 2016.7-1
Date: 2016-07-08
License: GPL-2 or greater
LazyLoad: yes

Derivative-Free optimization algorithms. These algorithms do not require gradient information. More importantly, they can be used to solve non-smooth optimization problems. These algorithms were translated from the Matlab code of Prof. C.T. Kelley, given in his book "Iterative Methods for Optimization". However, there are some non-trivial modifications of the algorithms.

Currently, the Nelder-Mead and Hooke-Jeeves algorithms are implemented. In the future, more derivative-free algorithms may be added.
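
A hedged sketch of the two implemented methods under box constraints; the function names nmkb() and hjkb(), and their par/fn/lower/upper arguments, are assumed from the package's exported interface rather than stated on this page.

library(dfoptim)

## Rosenbrock function, restricted to the box [0, 2] x [0, 2]
rosen <- function(x) 100 * (x[2] - x[1]^2)^2 + (1 - x[1])^2
p0 <- c(0.5, 0.5)   # starting point strictly inside the box

## Box-constrained Nelder-Mead
res_nm <- nmkb(par = p0, fn = rosen, lower = c(0, 0), upper = c(2, 2))

## Box-constrained Hooke-Jeeves
res_hj <- hjkb(par = p0, fn = rosen, lower = c(0, 0), upper = c(2, 2))

res_nm$par  # both should approach the unconstrained optimum c(1, 1)
res_hj$par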

Author(s)

Ravi Varadhan, Johns Hopkins University
URL: http://www.jhsph.edu/agingandhealth/People/Faculty_personal_pages/Varadhan.html
Hans W. Borchers, ABB Corporate Research
Maintainer: Ravi Varadhan <ravi.varadhan@jhu.edu>

References

C.T. Kelley (1999), Iterative Methods for Optimization, SIAM.


