Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/tbeason/nonparametricregression.jl
Simple local constant and local linear regressions in Julia
- Host: GitHub
- URL: https://github.com/tbeason/nonparametricregression.jl
- Owner: tbeason
- License: other
- Created: 2021-12-21T21:55:42.000Z (about 3 years ago)
- Default Branch: main
- Last Pushed: 2023-04-28T19:07:46.000Z (over 1 year ago)
- Last Synced: 2024-10-09T17:24:02.426Z (3 months ago)
- Topics: julia, kernel-smoothing, nonparametric-regression
- Language: Julia
- Homepage:
- Size: 30.3 KB
- Stars: 7
- Watchers: 2
- Forks: 0
- Open Issues: 1
Metadata Files:
- Readme: README.md
- License: LICENSE.md
Awesome Lists containing this project
README
# NonparametricRegression.jl
![lifecycle](https://img.shields.io/badge/lifecycle-experimental-orange.svg)
[![build](https://github.com/tbeason/NonparametricRegression.jl/workflows/CI/badge.svg)](https://github.com/tbeason/NonparametricRegression.jl/actions?query=workflow%3ACI)
[![codecov.io](http://codecov.io/github/tbeason/NonparametricRegression.jl/coverage.svg?branch=main)](http://codecov.io/github/tbeason/NonparametricRegression.jl?branch=main)

This package implements nonparametric regression, also called local regression or kernel regression. Currently the functionality is limited to univariate regressions and to the local constant (`localconstant`) and local linear (`locallinear`, `llalphabeta`) estimators. Automatic bandwidth selection is done by leave-one-out cross-validation or by optimizing the bias-corrected AICc statistic.
The two important exported convenience methods are `npregress` and `optimalbandwidth`, which abstract away much of the implementation detail and let you easily switch estimators or bandwidth selection procedures.
## Examples
```julia
using NonparametricRegression
```
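The snippet above only loads the package. A fuller sketch of the workflow described in the README follows; the exact argument order and signatures of `npregress` and `optimalbandwidth` are assumptions here, so check the package's docstrings before relying on them:

```julia
using NonparametricRegression

# Simulated univariate data (illustrative only).
x = randn(500)
y = sin.(x) .+ 0.1 .* randn(500)
xgrid = range(-2, 2; length=50)

# Regress with automatic bandwidth selection (signature assumed).
yhat = npregress(x, y, xgrid)

# Or pick a bandwidth explicitly first, then reuse it (signature assumed).
h = optimalbandwidth(x, y)
```

Switching between the local constant and local linear estimators, or between LOOCV and AICc bandwidth selection, is done through these two entry points rather than by calling the low-level estimators directly.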
## Detail
- Scaled Gaussian kernel (`GaussianKernel`) by default (aliased by `NormalKernel(h)` where `h` is the bandwidth). Other available kernels are `UniformKernel` and `EpanechnikovKernel`. Adding a new kernel would be a relatively easy PR, see `src/kernels.jl`.
- For local linear estimation, two functions are provided. The first is `locallinear`, which explicitly computes a weighted average of `y`, as in `localconstant`. The second is `llalphabeta`, which computes (and returns) the intercept and slope of the local linear regression; the intercept is the expected `y`. Because `locallinear` needs two passes over the data to compute the weights while `llalphabeta` needs only one, `llalphabeta` is faster, but the results are identical up to small numerical differences.
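The distinction between the two estimator families can be seen in a from-scratch sketch of the underlying math (illustrative only; this is not the package's internal code, and the helper names are made up):

```julia
# Standard normal kernel used for weighting.
gauss(u) = exp(-u^2 / 2) / sqrt(2π)

# Local constant (Nadaraya-Watson): a kernel-weighted average of y at x0.
function localconstant_sketch(x, y, x0, h)
    w = gauss.((x .- x0) ./ h)
    return sum(w .* y) / sum(w)
end

# Local linear: weighted least squares of y on (1, x - x0). The intercept
# is the fitted value at x0; the slope estimates the derivative there.
function locallinear_sketch(x, y, x0, h)
    w = gauss.((x .- x0) ./ h)
    X = [ones(length(x)) (x .- x0)]
    beta = (X' * (w .* X)) \ (X' * (w .* y))
    return beta[1], beta[2]   # (intercept, slope)
end
```

Returning both coefficients in one weighted least-squares solve is what lets an `llalphabeta`-style function avoid the second pass that an explicit weighted-average formulation needs.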
- Care was taken to keep the implementation non-allocating and performant. The package does not use the "binning" technique that some other packages use (R's KernSmooth, for example), so on very large datasets it may be slower than those packages. It also does not use multithreading, so there is room for further performance gains if needed. PRs welcome.

## Related
KernelDensity.jl is a nice package for doing kernel density estimation.
KernelEstimators.jl is an outdated package that I found after implementing most of this one. Consider this an updated version.
LOESS.jl is a package implementing a similar but different type of local regression (loess, obviously).