https://github.com/stdlib-js/ml-incr-sgd-regression
Online regression via stochastic gradient descent (SGD).
- Host: GitHub
- URL: https://github.com/stdlib-js/ml-incr-sgd-regression
- Owner: stdlib-js
- License: apache-2.0
- Created: 2021-06-15T17:47:36.000Z (over 4 years ago)
- Default Branch: main
- Last Pushed: 2026-02-08T05:37:45.000Z (about 1 month ago)
- Last Synced: 2026-02-08T12:24:35.994Z (about 1 month ago)
- Topics: algorithm, gradient-descent, incremental, javascript, machine-learning, math, mathematics, ml, node, node-js, nodejs, online, prediction, regression, statistics, stats, stdlib
- Language: JavaScript
- Homepage: https://github.com/stdlib-js/stdlib
- Size: 3.59 MB
- Stars: 6
- Watchers: 2
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- Changelog: CHANGELOG.md
- Contributing: CONTRIBUTING.md
- License: LICENSE
- Code of conduct: CODE_OF_CONDUCT.md
- Citation: CITATION.cff
- Security: SECURITY.md
- Notice: NOTICE
About stdlib...
We believe in a future in which the web is a preferred environment for numerical computation. To help realize this future, we've built stdlib. stdlib is a standard library, with an emphasis on numerical and scientific computation, written in JavaScript (and C) for execution in browsers and in Node.js.
The library is fully decomposable, being architected in such a way that you can swap out and mix and match APIs and functionality to cater to your exact preferences and use cases.
When you use stdlib, you can be absolutely certain that you are using the most thorough, rigorous, well-written, studied, documented, tested, measured, and high-quality code out there.
To join us in bringing numerical computing to the web, get started by checking us out on GitHub, and please consider financially supporting stdlib. We greatly appreciate your continued support!
# Online Regression
[![NPM version][npm-image]][npm-url] [![Build Status][test-image]][test-url] [![Coverage Status][coverage-image]][coverage-url]
> Online regression via [Stochastic Gradient Descent][stochastic-gradient-descent].
## Installation
```bash
npm install @stdlib/ml-incr-sgd-regression
```
Alternatively,
- To load the package in a website via a `script` tag without installation and bundlers, use the [ES Module][es-module] available on the [`esm`][esm-url] branch (see [README][esm-readme]).
- If you are using Deno, visit the [`deno`][deno-url] branch (see [README][deno-readme] for usage instructions).
- For use in Observable, or in browser/node environments, use the [Universal Module Definition (UMD)][umd] build available on the [`umd`][umd-url] branch (see [README][umd-readme]).
The [branches.md][branches-url] file summarizes the available branches and displays a diagram illustrating their relationships.
To view installation and usage instructions specific to each branch build, be sure to explicitly navigate to the respective README files on each branch, as linked to above.
## Usage
```javascript
var incrSGDRegression = require( '@stdlib/ml-incr-sgd-regression' );
```
#### incrSGDRegression( \[options] )
Creates an online linear regression model fitted via [stochastic gradient descent][stochastic-gradient-descent]. The module performs [L2 regularization][l2-regularization] of the model coefficients, shrinking them towards zero by penalizing the squared [Euclidean norm][euclidean-norm] of the coefficients.
```javascript
var randu = require( '@stdlib/random-base-randu' );
var normal = require( '@stdlib/random-base-normal' );
var accumulator = incrSGDRegression();
var x1;
var x2;
var i;
var y;
// Update model as data comes in...
for ( i = 0; i < 100000; i++ ) {
x1 = randu();
x2 = randu();
y = (3.0 * x1) + (-3.0 * x2) + 2.0 + normal( 0.0, 1.0 );
accumulator( [ x1, x2 ], y );
}
```
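Conceptually, each incoming observation triggers one gradient step that combines the loss gradient with the L2 penalty. The following is a minimal sketch of such a step for squared-error loss; `updateStep` is an illustrative helper, not the package's internal implementation, and the learning rate `eta` and regularization strength `lambda` are passed in directly for clarity:

```javascript
// Hypothetical single SGD update with L2 regularization (illustration only):
function updateStep( w, x, y, eta, lambda ) {
    var yhat;
    var err;
    var j;

    // Current prediction (dot product of weights and features):
    yhat = 0.0;
    for ( j = 0; j < x.length; j++ ) {
        yhat += w[ j ] * x[ j ];
    }
    err = yhat - y;

    // Gradient of 0.5*(yhat-y)^2 plus the L2 penalty term lambda*w[j]:
    for ( j = 0; j < x.length; j++ ) {
        w[ j ] -= eta * ( ( err * x[ j ] ) + ( lambda * w[ j ] ) );
    }
    return w;
}

var w = updateStep( [ 0.0, 0.0 ], [ 1.0, 2.0 ], 1.0, 0.1, 1e-3 );
// => [ 0.1, 0.2 ]
```

Note how the `lambda*w[j]` term shrinks each coefficient toward zero on every update, which is what the `lambda` option controls below.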
The function accepts the following `options`:
- **learningRate**: `string` denoting the learning rate to use. Can be `constant`, `pegasos` or `basic`. Default: `basic`.
- **loss**: `string` denoting the loss function to use. Can be `squaredError`, `epsilonInsensitive` or `huber`. Default: `squaredError`.
- **epsilon**: insensitivity parameter. Default: `0.1`.
- **lambda**: regularization parameter. Default: `1e-3`.
- **eta0**: constant learning rate. Default: `0.02`.
- **intercept**: `boolean` indicating whether to include an intercept. Default: `true`.
```javascript
var accumulator = incrSGDRegression({
'loss': 'squaredError',
'lambda': 1e-4
});
```
The `learningRate` option determines how quickly the weights are updated toward their optimal values. Let `i` denote the current iteration of the algorithm (i.e., the number of data points that have arrived). The possible learning rates are:
| Option | Definition |
| :-------------: | :---------------------: |
| basic (default) | 1000.0 / ( i + 1000.0 ) |
| constant | eta0 |
| pegasos | 1.0 / ( lambda \* i ) |
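The schedules in the table can be sketched as a plain function; `learningRate` below is illustrative only, with `eta0` and `lambda` mirroring the option defaults:

```javascript
// Illustrative evaluation of the learning-rate schedules above:
function learningRate( type, i, eta0, lambda ) {
    if ( type === 'constant' ) {
        return eta0;
    }
    if ( type === 'pegasos' ) {
        return 1.0 / ( lambda * i );
    }
    // 'basic' (default):
    return 1000.0 / ( i + 1000.0 );
}

var eta = learningRate( 'basic', 0, 0.02, 1e-3 );
// => 1.0 (decays toward zero as more observations arrive)

eta = learningRate( 'pegasos', 2000, 0.02, 1e-3 );
// => 0.5
```

The `basic` and `pegasos` schedules decay over time, so early observations move the weights much more than later ones, while `constant` weights every observation equally.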
The loss function is specified via the `loss` option. The available options are:
- **epsilonInsensitive**: Penalty is the absolute value of the error whenever the absolute error exceeds epsilon and zero otherwise.
- **huber**: Squared-error loss for observations with error smaller than epsilon in magnitude and linear loss otherwise. Use this loss to reduce the influence of outliers on the model fit.
- **squaredError**: Squared error loss, i.e. the squared difference of the observed and fitted values.
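Following the descriptions above, the three penalties can be sketched as follows; `lossValue` is an illustrative helper for intuition, not the package's internal code:

```javascript
// Illustrative loss values per the descriptions above:
function lossValue( type, err, epsilon ) {
    var a = Math.abs( err );
    if ( type === 'epsilonInsensitive' ) {
        // Zero penalty inside the epsilon band, absolute error outside:
        return ( a > epsilon ) ? a : 0.0;
    }
    if ( type === 'huber' ) {
        // Quadratic for small errors, linear for large ones:
        return ( a < epsilon ) ? err * err : a;
    }
    // 'squaredError' (default):
    return err * err;
}

var v = lossValue( 'squaredError', 2.0, 0.1 );
// => 4.0

v = lossValue( 'epsilonInsensitive', 0.05, 0.1 );
// => 0.0

v = lossValue( 'huber', 2.0, 0.1 );
// => 2.0
```

The comparison at `err = 2.0` shows why `huber` tames outliers: its penalty grows linearly rather than quadratically once the error exceeds `epsilon`.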
The `lambda` parameter determines the amount of shrinkage inflicted on the model coefficients:
```javascript
var createRandom = require( '@stdlib/random-base-randu' ).factory;
var accumulator;
var coefs;
var opts;
var rand;
var x1;
var x2;
var i;
var y;
opts = {
'seed': 23
};
rand = createRandom( opts );
accumulator = incrSGDRegression({
'lambda': 1e-5
});
for ( i = 0; i < 100; i++ ) {
x1 = rand();
x2 = rand();
y = (3.0 * x1) + (-3.0 * x2) + 2.0;
accumulator( [ x1, x2 ], y );
}
coefs = accumulator.coefs;
// returns [ ~3.007, ~-3.002, ~2 ]
rand = createRandom( opts );
accumulator = incrSGDRegression({
'lambda': 1e-2
});
for ( i = 0; i < 100; i++ ) {
x1 = rand();
x2 = rand();
y = (3.0 * x1) + (-3.0 * x2) + 2.0;
accumulator( [ x1, x2 ], y );
}
coefs = accumulator.coefs;
// returns [ ~2.893, ~-2.409, ~1.871 ]
```
Higher values of `lambda` reduce the variance of the model coefficient estimates at the expense of introducing bias.
By default, the model contains an `intercept` term. To omit the `intercept`, set the corresponding option to `false`:
```javascript
var accumulator = incrSGDRegression({
'intercept': false
});
accumulator( [ 1.4, 0.5 ], 2.0 );
var dim = accumulator.coefs.length;
// returns 2
accumulator = incrSGDRegression();
accumulator( [ 1.4, 0.5 ], 2.0 );
dim = accumulator.coefs.length;
// returns 3
```
If `intercept` is `true`, an element equal to one is implicitly added to each `x` vector. Hence, this module performs regularization of the intercept term.
#### accumulator( x, y )
Updates the model coefficients in light of incoming data. `y` must be a numeric response value and `x` a `numeric array` of predictors. The number of predictors is determined by the first invocation of this method, and all subsequent calls must supply `x` vectors of the same dimensionality.
```javascript
accumulator( [ 1.0, 0.0 ], 5.0 );
```
#### accumulator.predict( x )
Predicts the response for a new feature vector `x`, where `x` must be a `numeric array` of predictors. Given feature vector `x = [x_0, x_1, ...]` and model coefficients `c = [c_0, c_1, ...]`, the prediction is equal to `x_0*c_0 + x_1*c_1 + ... + c_intercept`.
```javascript
var yhat = accumulator.predict( [ 0.5, 2.0 ] );
// returns
```
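The prediction formula above is a plain dot product once the implicit intercept element is appended to `x`. The following sketch uses hypothetical coefficient values to illustrate the arithmetic; it does not call the package:

```javascript
// Dot product of two equal-length numeric arrays:
function dot( a, b ) {
    var s;
    var i;
    s = 0.0;
    for ( i = 0; i < a.length; i++ ) {
        s += a[ i ] * b[ i ];
    }
    return s;
}

// Hypothetical fitted coefficients, ordered [ c_0, c_1, c_intercept ]:
var coefs = [ 3.0, -3.0, 2.0 ];

// Augment x = [ 0.5, 2.0 ] with a trailing 1 for the intercept:
var yhat = dot( [ 0.5, 2.0, 1.0 ], coefs );
// => -2.5
```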
#### accumulator.coefs
Getter for the model coefficients / feature weights stored in an `array`. The coefficients are ordered as `[c_0, c_1,..., c_intercept]`, where `c_0` corresponds to the first feature in `x` and so on.
```javascript
var coefs = accumulator.coefs;
// returns
```
## Notes
- Stochastic gradient descent is sensitive to the scaling of the features. For best results, either scale each feature to `[0,1]` or `[-1,1]`, or transform the features into z-scores with zero mean and unit variance. Keep in mind that the same scaling must be applied to test vectors in order to obtain accurate predictions.
- Since this module performs regularization of the intercept term, scaling the response variable to an appropriate scale is also highly recommended.
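The z-score transform recommended above can be sketched as follows; `standardize` is an illustrative helper (in practice, the same `mu` and `sigma` computed on training data must be reused for test vectors):

```javascript
// Transform values to zero mean and unit variance (population std. dev.):
function standardize( values ) {
    var sigma;
    var mu;
    var s2;
    var i;

    mu = 0.0;
    for ( i = 0; i < values.length; i++ ) {
        mu += values[ i ];
    }
    mu /= values.length;

    s2 = 0.0;
    for ( i = 0; i < values.length; i++ ) {
        s2 += ( values[ i ] - mu ) * ( values[ i ] - mu );
    }
    sigma = Math.sqrt( s2 / values.length );

    return values.map( function map( v ) {
        return ( v - mu ) / sigma;
    });
}

var z = standardize( [ 2.0, 4.0, 6.0 ] );
// => [ ~-1.225, 0, ~1.225 ]
```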
## Examples
```javascript
var randu = require( '@stdlib/random-base-randu' );
var normal = require( '@stdlib/random-base-normal' );
var incrSGDRegression = require( '@stdlib/ml-incr-sgd-regression' );
var accumulator;
var rnorm;
var x1;
var x2;
var y;
var i;
rnorm = normal.factory( 0.0, 1.0 );
// Create model:
accumulator = incrSGDRegression({
'lambda': 1e-7,
'loss': 'squaredError',
'intercept': true
});
// Update model as data comes in...
for ( i = 0; i < 10000; i++ ) {
x1 = randu();
x2 = randu();
y = (3.0 * x1) + (-3.0 * x2) + 2.0 + rnorm();
accumulator( [ x1, x2 ], y );
}
// Extract model coefficients:
console.log( accumulator.coefs );
// Predict new observations:
console.log( 'y_hat = %d; x1 = %d; x2 = %d', accumulator.predict( [0.9, 0.1] ), 0.9, 0.1 );
console.log( 'y_hat = %d; x1 = %d; x2 = %d', accumulator.predict( [0.1, 0.9] ), 0.1, 0.9 );
console.log( 'y_hat = %d; x1 = %d; x2 = %d', accumulator.predict( [0.9, 0.9] ), 0.9, 0.9 );
```
* * *
## See Also
- [`@stdlib/ml/incr/binary-classification`][@stdlib/ml/incr/binary-classification]: incrementally perform binary classification using stochastic gradient descent (SGD).
* * *
## Notice
This package is part of [stdlib][stdlib], a standard library for JavaScript and Node.js, with an emphasis on numerical and scientific computing. The library provides a collection of robust, high performance libraries for mathematics, statistics, streams, utilities, and more.
For more information on the project, filing bug reports and feature requests, and guidance on how to develop [stdlib][stdlib], see the main project [repository][stdlib].
#### Community
[![Chat][chat-image]][chat-url]
---
## License
See [LICENSE][stdlib-license].
## Copyright
Copyright © 2016-2026. The Stdlib [Authors][stdlib-authors].
[npm-image]: http://img.shields.io/npm/v/@stdlib/ml-incr-sgd-regression.svg
[npm-url]: https://npmjs.org/package/@stdlib/ml-incr-sgd-regression
[test-image]: https://github.com/stdlib-js/ml-incr-sgd-regression/actions/workflows/test.yml/badge.svg?branch=v0.2.3
[test-url]: https://github.com/stdlib-js/ml-incr-sgd-regression/actions/workflows/test.yml?query=branch:v0.2.3
[coverage-image]: https://img.shields.io/codecov/c/github/stdlib-js/ml-incr-sgd-regression/main.svg
[coverage-url]: https://codecov.io/github/stdlib-js/ml-incr-sgd-regression?branch=main
[chat-image]: https://img.shields.io/badge/zulip-join_chat-brightgreen.svg
[chat-url]: https://stdlib.zulipchat.com
[stdlib]: https://github.com/stdlib-js/stdlib
[stdlib-authors]: https://github.com/stdlib-js/stdlib/graphs/contributors
[umd]: https://github.com/umdjs/umd
[es-module]: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Guide/Modules
[deno-url]: https://github.com/stdlib-js/ml-incr-sgd-regression/tree/deno
[deno-readme]: https://github.com/stdlib-js/ml-incr-sgd-regression/blob/deno/README.md
[umd-url]: https://github.com/stdlib-js/ml-incr-sgd-regression/tree/umd
[umd-readme]: https://github.com/stdlib-js/ml-incr-sgd-regression/blob/umd/README.md
[esm-url]: https://github.com/stdlib-js/ml-incr-sgd-regression/tree/esm
[esm-readme]: https://github.com/stdlib-js/ml-incr-sgd-regression/blob/esm/README.md
[branches-url]: https://github.com/stdlib-js/ml-incr-sgd-regression/blob/main/branches.md
[stdlib-license]: https://raw.githubusercontent.com/stdlib-js/ml-incr-sgd-regression/main/LICENSE
[euclidean-norm]: https://en.wikipedia.org/wiki/Norm_(mathematics)#Euclidean_norm
[l2-regularization]: https://en.wikipedia.org/wiki/Tikhonov_regularization
[stochastic-gradient-descent]: https://en.wikipedia.org/wiki/Stochastic_gradient_descent
[@stdlib/ml/incr/binary-classification]: https://github.com/stdlib-js/ml-incr-binary-classification