https://github.com/vthorrf/optimg
General-purpose Gradient-based Optimization
- Host: GitHub
- URL: https://github.com/vthorrf/optimg
- Owner: vthorrf
- Created: 2021-10-07T00:57:22.000Z (over 3 years ago)
- Default Branch: main
- Last Pushed: 2023-02-02T12:53:44.000Z (about 2 years ago)
- Last Synced: 2024-11-15T18:39:40.060Z (5 months ago)
- Language: R
- Size: 514 KB
- Stars: 1
- Watchers: 1
- Forks: 1
- Open Issues: 2
Metadata Files:
- Readme: README.md
Awesome Lists containing this project
- jimsghstars - vthorrf/optimg - General-purpose Gradient-based Optimization (R)
README
optimg: General-purpose Gradient-based Optimization (version 0.1.2)
===================================================================

This package is a general-purpose tool that helps users apply gradient descent methods to function optimization. Two methods are currently implemented: Steepest 2-Groups Gradient Descent and Adaptive Moment Estimation (Adam). Other methods will be added in the future.

This package should be considered experimental at this point of development.
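To illustrate what the Adam method does, here is a minimal, self-contained sketch of the standard Adam update rule in base R. This is an illustration of the general algorithm only, not optimg's actual implementation; the function name `adam_sketch` and all defaults are assumptions for the example.

```r
# Minimal sketch of the Adam update rule, for illustration only;
# optimg's internals may differ. Minimizes fn starting from par.
adam_sketch <- function(fn, gr, par, alpha = 0.1, beta1 = 0.9, beta2 = 0.999,
                        eps = 1e-8, steps = 500) {
  m <- v <- numeric(length(par))               # first/second moment estimates
  for (t in seq_len(steps)) {
    g <- gr(par)                               # gradient at current point
    m <- beta1 * m + (1 - beta1) * g           # biased first-moment estimate
    v <- beta2 * v + (1 - beta2) * g^2         # biased second-moment estimate
    m_hat <- m / (1 - beta1^t)                 # bias-corrected moments
    v_hat <- v / (1 - beta2^t)
    par <- par - alpha * m_hat / (sqrt(v_hat) + eps)
  }
  list(par = par, value = fn(par))
}

# Example: minimize f(x) = sum((x - 3)^2), whose minimum is at (3, 3)
fn  <- function(x) sum((x - 3)^2)
gr  <- function(x) 2 * (x - 3)
res <- adam_sketch(fn, gr, par = c(0, 0))
res$par
```

The per-coordinate scaling by `sqrt(v_hat)` is what distinguishes Adam from plain steepest descent: step sizes adapt to the historical magnitude of each gradient component.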
# Installation #
Using the 'remotes' package:
install.packages("remotes")
remotes::install_github("vthorrf/optimg")