optimg: General-Purpose Gradient-Based Optimization

Provides general-purpose tools that help users implement gradient descent methods for function optimization; for details, see Ruder (2016) <doi:10.48550/arXiv.1609.04747>. The methods currently implemented are Steepest 2-Groups Gradient Descent and Adaptive Moment Estimation (Adam). Other methods will be added in the future.
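A minimal usage sketch in R, assuming the main entry point is `optimg()` with `optim`-style `par`/`fn` arguments and a `method` option selecting Adam; the exact argument names should be confirmed against the reference manual:

```r
## Hypothetical example: minimize a simple quadratic with the Adam method.
## Argument names (par, fn, method) are assumed to mirror stats::optim();
## see ?optimg in the package for the authoritative signature.
library(optimg)

## Objective with a known minimum at (1, 2)
fn <- function(par) sum((par - c(1, 2))^2)

fit <- optimg(par = c(0, 0), fn = fn, method = "ADAM")

fit$par    # estimated minimizer; expected to be near c(1, 2)
fit$value  # objective value at the estimate
```

Because the package supplies gradient-based optimizers, a closed-form gradient is not required here; smooth objectives like the one above are the intended use case.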

Version: 0.1.2
Imports: ucminf (≥ 1.1-4)
Published: 2021-10-07
Author: Vithor Rosa Franco
Maintainer: Vithor Rosa Franco <vithorfranco at gmail.com>
License: GPL-3
URL: https://github.com/vthorrf/optimg
NeedsCompilation: no
CRAN checks: optimg results

Documentation:

Reference manual: optimg.pdf

Downloads:

Package source: optimg_0.1.2.tar.gz
Windows binaries: r-prerel: optimg_0.1.2.zip, r-release: optimg_0.1.2.zip, r-oldrel: optimg_0.1.2.zip
macOS binaries: r-prerel (arm64): optimg_0.1.2.tgz, r-release (arm64): optimg_0.1.2.tgz, r-oldrel (arm64): optimg_0.1.2.tgz, r-prerel (x86_64): optimg_0.1.2.tgz, r-release (x86_64): optimg_0.1.2.tgz

Linking:

Please use the canonical form https://CRAN.R-project.org/package=optimg to link to this page.