{"id":34077748,"url":"https://github.com/sigvaldm/localreg","last_synced_at":"2026-03-17T16:08:57.233Z","repository":{"id":57438775,"uuid":"185620541","full_name":"sigvaldm/localreg","owner":"sigvaldm","description":"Multivariate Local Polynomial Regression and Radial Basis Function Regression","archived":false,"fork":false,"pushed_at":"2023-02-06T12:37:53.000Z","size":894,"stargazers_count":51,"open_issues_count":9,"forks_count":5,"subscribers_count":1,"default_branch":"master","last_synced_at":"2025-12-16T12:49:24.411Z","etag":null,"topics":["kernel-methods","loess","lowess","multivariate","non-parametric","radial-basis-function","regression"],"latest_commit_sha":null,"homepage":"","language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"lgpl-3.0","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/sigvaldm.png","metadata":{"files":{"readme":"README.rst","changelog":null,"contributing":null,"funding":null,"license":"COPYING","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null}},"created_at":"2019-05-08T14:17:48.000Z","updated_at":"2025-11-05T11:46:18.000Z","dependencies_parsed_at":"2023-02-19T06:16:18.190Z","dependency_job_id":null,"html_url":"https://github.com/sigvaldm/localreg","commit_stats":null,"previous_names":[],"tags_count":8,"template":false,"template_full_name":null,"purl":"pkg:github/sigvaldm/localreg","repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/sigvaldm%2Flocalreg","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/sigvaldm%2Flocalreg/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/sigvaldm%2Flocalreg/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/sigvaldm%2Flocalreg/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/sigvaldm","download_url"
:"https://codeload.github.com/sigvaldm/localreg/tar.gz/refs/heads/master","sbom_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/sigvaldm%2Flocalreg/sbom","scorecard":{"id":823280,"data":{"date":"2025-08-11","repo":{"name":"github.com/sigvaldm/localreg","commit":"16308847773659e77b0cf945513b8eee2f23c8c8"},"scorecard":{"version":"v5.2.1-40-gf6ed084d","commit":"f6ed084d17c9236477efd66e5b258b9d4cc7b389"},"score":3,"checks":[{"name":"SAST","score":0,"reason":"no SAST tool detected","details":["Warn: no pull requests merged into dev branch"],"documentation":{"short":"Determines if the project uses static code analysis.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#sast"}},{"name":"Token-Permissions","score":-1,"reason":"No tokens found","details":null,"documentation":{"short":"Determines if the project's workflows follow the principle of least privilege.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#token-permissions"}},{"name":"Packaging","score":-1,"reason":"packaging workflow not detected","details":["Warn: no GitHub/GitLab publishing workflow detected."],"documentation":{"short":"Determines if the project is published as a package that others can easily download, install, easily update, and uninstall.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#packaging"}},{"name":"Maintained","score":0,"reason":"0 commit(s) and 0 issue activity found in the last 90 days -- score normalized to 0","details":null,"documentation":{"short":"Determines if the project is \"actively maintained\".","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#maintained"}},{"name":"Dangerous-Workflow","score":-1,"reason":"no workflows found","details":null,"documentation":{"short":"Determines if the project's GitHub Action workflows avoid dangerous 
patterns.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#dangerous-workflow"}},{"name":"Code-Review","score":0,"reason":"Found 0/30 approved changesets -- score normalized to 0","details":null,"documentation":{"short":"Determines if the project requires human code review before pull requests (aka merge requests) are merged.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#code-review"}},{"name":"Binary-Artifacts","score":10,"reason":"no binaries found in the repo","details":null,"documentation":{"short":"Determines if the project has generated executable (binary) artifacts in the source repository.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#binary-artifacts"}},{"name":"Pinned-Dependencies","score":-1,"reason":"no dependencies found","details":null,"documentation":{"short":"Determines if the project has declared and pinned the dependencies of its build process.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#pinned-dependencies"}},{"name":"CII-Best-Practices","score":0,"reason":"no effort to earn an OpenSSF best practices badge detected","details":null,"documentation":{"short":"Determines if the project has an OpenSSF (formerly CII) Best Practices Badge.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#cii-best-practices"}},{"name":"Security-Policy","score":0,"reason":"security policy file not detected","details":["Warn: no security policy file detected","Warn: no security file to analyze","Warn: no security file to analyze","Warn: no security file to analyze"],"documentation":{"short":"Determines if the project has published a security 
policy.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#security-policy"}},{"name":"Vulnerabilities","score":10,"reason":"0 existing vulnerabilities detected","details":null,"documentation":{"short":"Determines if the project has open, known unfixed vulnerabilities.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#vulnerabilities"}},{"name":"Fuzzing","score":0,"reason":"project is not fuzzed","details":["Warn: no fuzzer integrations found"],"documentation":{"short":"Determines if the project uses fuzzing.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#fuzzing"}},{"name":"License","score":10,"reason":"license file detected","details":["Info: project has a license file: COPYING:0","Info: FSF or OSI recognized license: GNU Lesser General Public License v3.0: COPYING:0"],"documentation":{"short":"Determines if the project has defined a license.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#license"}},{"name":"Signed-Releases","score":-1,"reason":"no releases found","details":null,"documentation":{"short":"Determines if the project cryptographically signs release artifacts.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#signed-releases"}},{"name":"Branch-Protection","score":0,"reason":"branch protection not enabled on development/release branches","details":["Warn: branch protection not enabled for branch 'master'"],"documentation":{"short":"Determines if the default and release branches are protected with GitHub's branch protection 
settings.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#branch-protection"}}]},"last_synced_at":"2025-08-23T16:06:59.796Z","repository_id":57438775,"created_at":"2025-08-23T16:06:59.796Z","updated_at":"2025-08-23T16:06:59.796Z"},"host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":286080680,"owners_count":30626920,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2026-03-17T14:16:03.965Z","status":"ssl_error","status_checked_at":"2026-03-17T14:16:03.380Z","response_time":56,"last_error":"SSL_read: unexpected eof while reading","robots_txt_status":"success","robots_txt_updated_at":"2025-07-24T06:49:26.215Z","robots_txt_url":"https://github.com/robots.txt","online":false,"can_crawl_api":true,"host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["kernel-methods","loess","lowess","multivariate","non-parametric","radial-basis-function","regression"],"created_at":"2025-12-14T10:45:04.832Z","updated_at":"2026-03-17T16:08:57.228Z","avatar_url":"https://github.com/sigvaldm.png","language":"Python","readme":".. image:: logo.png\n\n.. image:: https://travis-ci.com/sigvaldm/localreg.svg?branch=master\n    :target: https://app.travis-ci.com/sigvaldm/localreg\n\n.. image:: https://coveralls.io/repos/github/sigvaldm/localreg/badge.svg?branch=master\n    :target: https://coveralls.io/github/sigvaldm/localreg?branch=master\n\n.. image:: https://img.shields.io/pypi/pyversions/localreg.svg\n    :target: https://pypi.org/project/localreg\n\n.. 
image:: https://zenodo.org/badge/185620541.svg\n    :target: https://zenodo.org/badge/latestdoi/185620541\n\nLocalreg is a collection of kernel-based statistical methods:\n\n- Smoothing of noisy data series through multivariate *local polynomial regression* (including LOESS/LOWESS).\n- Multivariate and complex-valued *radial basis function* (RBF) regression.\n\nInstallation\n------------\nInstall from PyPI using ``pip`` (preferred method)::\n\n    pip install localreg\n\nOr clone the GitHub repository https://github.com/sigvaldm/localreg.git and run::\n\n    python setup.py install\n\nLocal polynomial regression\n---------------------------\n\nIntroduction\n~~~~~~~~~~~~\nLocal polynomial regression is performed using the function::\n\n    localreg(x, y, x0=None, degree=2, kernel=rbf.epanechnikov, radius=1, frac=None)\n\nwhere ``x`` and ``y`` are the x and y-values of the data to smooth, respectively.\n``x0`` are the x-values at which to compute smoothed values. By default this is the same as ``x``, but beware that the run time is proportional to the size of ``x0``, so if you have many datapoints, it may be worthwhile to specify a smaller ``x0`` yourself.\n\nLocal polynomial regression works by fitting a polynomial of degree ``degree`` to the datapoints in the vicinity of where you wish to compute a smoothed value (``x0``), and then evaluating that polynomial at ``x0``. For ``degree=0`` it reduces to a weighted moving average. A weighting function or kernel ``kernel`` is used to assign a higher weight to datapoints near ``x0``. The argument to ``kernel`` is a pure function of one argument, so it is possible to define custom kernels. 
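As a minimal NumPy sketch of the kernel-weighting idea (the ``quartic`` kernel and ``moving_average`` helper below are illustrations written for this note, not part of localreg), a custom kernel is just a function of one argument, and ``degree=0`` smoothing amounts to a kernel-weighted moving average:

```python
import numpy as np

def quartic(t):
    # Hypothetical custom kernel: a biweight-like taper on |t| <= 1,
    # zero outside -- a pure function of one argument, as required.
    return np.where(np.abs(t) <= 1, (1 - t**2)**2, 0.0)

def moving_average(x, y, x0, kernel, radius):
    # Sketch of the degree=0 case: at each x0, average y weighted by the
    # kernel evaluated at the scaled distance from x0.
    out = np.empty_like(x0, dtype=float)
    for i, c in enumerate(x0):
        w = kernel((x - c) / radius)
        out[i] = np.sum(w * y) / np.sum(w)
    return out

x = np.linspace(0, 1, 11)
y = 2.0 * np.ones_like(x)  # constant data: the weighted average recovers it
res = moving_average(x, y, np.array([0.5]), quartic, 0.3)
print(res)  # -> [2.]
```

Higher degrees replace this average with a weighted polynomial fit at each ``x0``, which is what ``localreg`` does internally.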
The following kernels are already implemented:\n\n- ``rectangular``\n- ``triangular``\n- ``epanechnikov``\n- ``biweight``\n- ``triweight``\n- ``tricube``\n- ``cosine``\n- ``gaussian`` (non-compact)\n- ``logistic`` (non-compact)\n- ``sigmoid`` (non-compact)\n- ``silverman`` (non-compact)\n\nHaving a kernel which tapers off toward the edges, i.e., not a rectangular kernel, results in a smooth output.\n\nThe radius of the kernel can be scaled by the parameter ``radius``, which in 1D is half of the kernel-width for kernels with compact support. For kernels with non-compact support, like the Gaussian kernel, it is simply a scaling parameter, akin to the standard deviation. Having a wider kernel and including more datapoints lowers the noise (variance) but increases the bias, as the regression will not be able to capture variations on a scale much narrower than the kernel window.\n\nFor unevenly spaced datapoints, having a fixed radius means that a variable number of datapoints are included in the window, and hence the noise/variance is variable too. However, the bias is fixed. Using a radius that varies such that a fixed number of datapoints is included leads instead to constant noise/variance but variable bias. 
This can be achieved by specifying ``frac``, which overrides ``radius`` and specifies the fraction of all datapoints to be included in the radius of the kernel.\n\nExample 1\n~~~~~~~~~\nThe example below exhibits several interesting features::\n\n    import numpy as np\n    import matplotlib.pyplot as plt\n    from localreg import *\n\n    np.random.seed(1234)\n    x = np.linspace(1.5, 5, 2000)\n    yf = np.sin(x*x)\n    y = yf + 0.5*np.random.randn(*x.shape)\n\n    y0 = localreg(x, y, degree=0, kernel=rbf.tricube, radius=0.3)\n    y1 = localreg(x, y, degree=1, kernel=rbf.tricube, radius=0.3)\n    y2 = localreg(x, y, degree=2, kernel=rbf.tricube, radius=0.3)\n\n    plt.plot(x, y, '+', markersize=0.6, color='gray')\n    plt.plot(x, yf, label=r'Ground truth ($\\sin(x^2)$)')\n    plt.plot(x, y0, label='Moving average')\n    plt.plot(x, y1, label='Local linear regression')\n    plt.plot(x, y2, label='Local quadratic regression')\n    plt.legend()\n    plt.show()\n\n.. image:: examples/basic.png\n\nIf there's a slope in the data near an edge, a simple moving average will fail to take the slope into account, as seen in the figure, since most of the datapoints will be to the right (or left) of ``x0``. A local linear (or higher-order) regression is able to compensate for this. We also see that as the frequency of the oscillations increases, the local linear regression is not able to keep up, because the variations become too small compared to the window. A smaller window would help, at the cost of more noise in the regression. Another option is to increase the degree to 2. The quadratic regression is better at following the valleys and the hills. For too rapid changes compared to the kernel, however, quadratic polynomials will also start failing.\n\nIt is also worth noting that a higher degree comes with an increase in variance, which can show up as small spurious oscillations. 
It is therefore not very common to go higher than 2, although localreg supports arbitrary degree.\n\nExample 2\n~~~~~~~~~\nFor multivariate input, the coordinates of data point ``i`` are given by ``x[i,:]``. This example has 2 inputs::\n\n    from localreg import *\n    import matplotlib.pyplot as plt\n    from mpl_toolkits.mplot3d import Axes3D  # Importing Axes3D has side effects; it enables using projection='3d' in add_subplot\n    import numpy as np\n\n    N = 500\n    degree = 1\n\n    x = np.random.rand(N,2)\n    y = np.cos(2*np.pi*x[:,0])*(1-x[:,1])\n\n    fig = plt.figure()\n    ax = fig.add_subplot(111, projection='3d')\n\n    m = np.arange(0, 1.05, 0.05)\n    X, Y = np.meshgrid(m,m)\n    x0 = np.array([np.ravel(X), np.ravel(Y)]).T\n    z0 = localreg(x, y, x0, degree=degree, radius=0.2)\n    Z = z0.reshape(X.shape)\n\n    ax.plot_wireframe(X, Y, Z, rcount=10, ccount=10, color='green')\n    ax.plot3D(x[:,0], x[:,1], y, '.')\n\n    ax.set_xlabel('X')\n    ax.set_ylabel('Y')\n    ax.set_zlabel('Z')\n\n    plt.show()\n\n.. image:: examples/multivariate.png\n\n.. [Hastie] T. Hastie, R. Tibshirani and J. Friedman, *The Elements of Statistical Learning -- Data Mining, Inference, and Prediction*, Second Edition, Springer, 2017.\n.. [Cleveland] W. Cleveland, *Robust Locally Weighted Regression and Smoothing Scatterplots*, Journal of the American Statistical Association, 74, 1979.\n\nExample 3\n~~~~~~~~~\n``localreg()`` uses the function ``polyfit()`` internally to evaluate polynomial fits locally. 
It is also possible to use ``polyfit()`` directly, should a standard (non-local) polynomial fit be desired instead::\n\n    from localreg import *\n    import matplotlib.pyplot as plt\n    from mpl_toolkits.mplot3d import Axes3D  # Importing Axes3D has side effects; it enables using projection='3d' in add_subplot\n    import numpy as np\n\n    N = 50\n    degree = 2\n\n    x = np.random.rand(N,2)\n    y = x[:,0]*x[:,1] + 0.02*np.random.randn(N)\n\n    fig = plt.figure()\n    ax = fig.add_subplot(111, projection='3d')\n\n    m = np.arange(0, 1.05, 0.05)\n    X, Y = np.meshgrid(m,m)\n    x0 = np.array([np.ravel(X), np.ravel(Y)]).T\n    z0 = polyfit(x, y, x0, degree=degree)\n    Z = z0.reshape(X.shape)\n\n    ax.plot_wireframe(X, Y, Z, rcount=10, ccount=10, color='green')\n    ax.plot3D(x[:,0], x[:,1], y, 'o')\n\n    ax.set_xlabel('X')\n    ax.set_ylabel('Y')\n    ax.set_zlabel('Z')\n\n    plt.show()\n\n.. image:: examples/polyfit.png\n\nRadial basis function (RBF) network\n-----------------------------------\n\nIntroduction\n~~~~~~~~~~~~\nAn RBF network is a simple machine learning network suitable for mesh-free regression in multiple dimensions. It is robust, easy to understand, and although it is not a universal method, it works well for some problems.\n\nA radial basis function is a function ``g(t)``, possibly with a multidimensional domain, but which only depends on the radial distance ``t`` of the input with respect to the origin of the RBF. An RBF network is then a weighted sum of such functions, with displaced centers::\n\n    y_i = sum_j w_j g(||x_i-c_j||/r)\n\nThis sum is fitted to a set of data points ``(x,y)``. Typically, the RBF is a Gaussian function, although it can be any function of one argument (the radial distance), for instance any of the kernels listed above. In ``RBFnet``, the centers ``c_j`` are first determined to get a good coverage of the domain by means of K-means clustering. 
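The weighted sum can be sketched in plain NumPy (a standalone illustration, not ``RBFnet``'s implementation; for simplicity the centers are placed on a fixed grid here rather than by K-means):

```python
import numpy as np

# Sketch of the model y_i = sum_j w_j g(||x_i - c_j|| / r), with Gaussian g,
# fitted by linear least squares.
def gaussian(t):
    return np.exp(-t**2)

x = np.linspace(0, 1, 50)        # 1D inputs
y = np.sin(2 * np.pi * x)        # targets
c = np.linspace(0, 1, 10)        # centers: grid here, K-means in RBFnet
r = 0.2                          # common radius (hyperparameter)

# Design matrix G[i, j] = g(|x_i - c_j| / r), then solve for the weights.
G = gaussian(np.abs(x[:, None] - c[None, :]) / r)
w, *_ = np.linalg.lstsq(G, y, rcond=None)
y_hat = G @ w
print(np.max(np.abs(y_hat - y)))  # small residual: the sine is well reproduced
```

Ten Gaussians are plenty for one period of a sine, mirroring the README's first RBF example.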
The radius ``r``, here taken to be the same for all terms, is a hyperparameter to be tuned. With this, linear least squares is used to fit the weights ``w_j``. Both the input and output can be complex-valued.\n\nExample 1: Getting started\n~~~~~~~~~~~~~~~~~~~~~~~~~~\nThis example demonstrates how 10 radial basis functions can be used to fit a sine curve::\n\n    from localreg import RBFnet\n    import numpy as np\n    import matplotlib.pyplot as plt\n\n    x = np.linspace(0,1,100)\n    y = np.sin(2*np.pi*x)\n\n    net = RBFnet()\n    net.train(x, y, num=10, radius=0.3)\n\n    plt.plot(x, y, label='Ground truth')\n    net.plot_bases(plt.gca(), x, label='Prediction')\n    plt.legend()\n    plt.show()\n\n.. image:: examples/rbf1.png\n\nThe dashed lines plotted using the ``plot_bases`` method are the individual terms in the weighted sum after training. The learning capacity of an RBF network is primarily determined by the number of basis functions, decided by the ``num`` parameter. In this case 10 basis functions make for a good fit, but data with larger variability and more dimensions may require more basis functions. Other parameters that can be adjusted are the radius of the basis functions, as well as the analytical expression of the radial basis function itself. The radius is in terms of standard deviations of the input points, and is therefore always a number of order one. By default Gaussian basis functions are used, but any of the kernels mentioned for local polynomial regression can be specified using the ``rbf`` parameter, as well as custom functions of one argument. Normalization can be turned off using the ``normalize`` argument, in which case the radius has a magnitude similar to that of the input.\n\nExample 2: Multivariate input\n~~~~~~~~~~~~~~~~~~~~~~~~~~~~~\nThis example demonstrates multi-dimensional inputs. Due to the larger variability, more basis functions are needed than in example 1. 
We also do not specify the radius in this case, but allow ``RBFnet`` to use an internal algorithm for choosing the radius that minimizes the RMS error (other error measures may be specified using the ``measure`` parameter). While automatically tuning the radius works well in this example, it must be considered an experimental feature. It is also more time-consuming::\n\n    from localreg import RBFnet, plot_corr\n    import numpy as np\n    import matplotlib.pyplot as plt\n    from mpl_toolkits.mplot3d import Axes3D  # Enables 3d-projection\n\n    x = np.linspace(0,2,30)\n    X, Y = np.meshgrid(x, x)\n\n    inp = np.array([X.ravel(), Y.ravel()]).T  # named inp to avoid shadowing the built-in input\n    x, y = inp.T\n    z = y*np.sin(2*np.pi*x)\n\n    net = RBFnet()\n    net.train(inp, z, num=50)\n    z_hat = net.predict(inp)\n\n    fig = plt.figure()\n    ax = fig.add_subplot(111, projection='3d')\n    ax.plot_wireframe(X, Y, z.reshape(X.shape), rcount=20, ccount=20)\n    ax.plot_surface(X, Y, z_hat.reshape(X.shape), alpha=0.5, color='green')\n    plt.show()\n\n    fig, ax = plt.subplots()\n    plot_corr(ax, z, z_hat)\n    plt.show()\n\n.. image:: examples/rbf2a.png\n.. image:: examples/rbf2b.png\n\nThe figures show excellent agreement between the true and predicted data. In the first plot the wireframe is the true data, whereas the surface is the predicted data. The function ``plot_corr`` is handy for visualizing the agreement between true and predicted data.\n\nWhen using multi-dimensional data, normalization becomes more important. If the input variables have different standard deviations, e.g., if they are variables of entirely different physical dimensions, it will be difficult to adapt the network with few basis functions of radial shape, because it will be difficult to resolve the details in the \"small\" axes while spanning the data in the \"large\" axes. 
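A common form of such normalization (a generic sketch of the idea, not necessarily ``RBFnet``'s exact internals) is to standardize each input column to zero mean and unit standard deviation:

```python
import numpy as np

# Z-score standardization: after this, axes with wildly different physical
# scales have comparable spread, so radially shaped basis functions can
# resolve both. (Illustrative sketch only.)
rng = np.random.default_rng(0)
X = np.column_stack([rng.normal(0.0, 1e3, 200),    # a "large" axis
                     rng.normal(0.0, 1e-3, 200)])  # a "small" axis

Xn = (X - X.mean(axis=0)) / X.std(axis=0)
print(Xn.std(axis=0))  # -> [1. 1.]
```
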
Normalization makes the spread along the axes more comparable.\n\nExample 3: Error metrics and relative least squares\n~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~\nLocalreg comes with several error metrics for quantifying the error:\n\n- ``rms_error``\n- ``rms_rel_error``\n- ``max_abs_error``\n- ``max_rel_error``\n- ``mean_abs_error``\n- ``mean_rel_error``\n- ``error_bias``\n- ``rel_error_bias``\n- ``error_std``\n- ``rel_error_std``\n\nThis example demonstrates a couple of these, as well as a special modification to the least squares algorithm available in ``RBFnet``::\n\n    from localreg import RBFnet\n    from localreg.metrics import rms_error, rms_rel_error\n    from frmt import print_table\n    import matplotlib.pyplot as plt\n    import numpy as np\n\n    x = np.linspace(0,0.49,100)\n    y = np.tan(np.pi*x)+1\n\n    net = RBFnet()\n\n    net.train(x, y, radius=1)\n    y_hat0 = net.predict(x)\n\n    net.train(x, y, radius=1, relative=True)\n    y_hat1 = net.predict(x)\n\n    print_table(\n        [[''            , 'RMSE'              , 'RMSRE'                 ],\n         ['Normal LLS'  , rms_error(y, y_hat0), rms_rel_error(y, y_hat0)],\n         ['Relative LLS', rms_error(y, y_hat1), rms_rel_error(y, y_hat1)]]\n    )\n\n    plt.figure()\n    plt.plot(x, y, label='Ground truth')\n    plt.plot(x, y_hat0, label='Normal LLS')\n    plt.plot(x, y_hat1, label='Relative LLS')\n    plt.legend()\n    plt.show()\n\nOutput::\n\n                  RMSE  RMSRE \n    Normal LLS    0.65  0.17  \n    Relative LLS  1.14  0.0457\n\n.. image:: examples/rbf3.png\n\nThis example fits the data to a tangent function, which becomes very large towards the right edge. Linear least squares (LLS) algorithms solve the so-called normal equations, which is equivalent to minimizing the squared sum of residuals or the root-mean-square (RMS) of the error. 
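The weighting idea behind ``relative=True`` can be sketched generically (an illustration of row scaling in least squares inferred from the description here, not localreg's actual code): scaling each equation of the system by ``1/|y_i|`` before solving makes LLS minimize relative rather than absolute residuals:

```python
import numpy as np

# Generic sketch: solve A w = y in the least-squares sense, once plainly and
# once with each row scaled by 1/|y_i|, and compare RMS relative errors.
rng = np.random.default_rng(1)
x = np.linspace(1, 10, 50)
A = np.vander(x, 3)                       # simple polynomial design matrix
y = x**2 + rng.normal(0, 0.1, 50)         # data spanning a large range

s = 1.0 / np.abs(y)                       # per-row scaling
w_rel, *_ = np.linalg.lstsq(A * s[:, None], y * s, rcond=None)
w_abs, *_ = np.linalg.lstsq(A, y, rcond=None)

def rms_rel(y, y_hat):
    return np.sqrt(np.mean(((y_hat - y) / y)**2))

# The row-scaled solution minimizes the relative residuals by construction.
print(rms_rel(y, A @ w_rel) <= rms_rel(y, A @ w_abs) + 1e-12)  # -> True
```
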
When the data spans a large range, the relative error can quickly become very large for the smaller values, because the algorithm optimizes the errors in absolute terms. In this example, the linear least squares algorithm makes a poor (and oscillatory) prediction of smaller values, because the absolute error in the larger values is made smaller that way. However, when working on data spanning several orders of magnitude, the relative error is often more important. By training with ``relative=True``, the normal equations are preconditioned such that the root-mean-square of the relative errors (RMSRE) is minimized instead of the RMSE.\n\nExample 4: Multivariate output\n~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~\nBoth the input and the output may be multidimensional. In this example, the input is univariate, but the output is multivariate::\n\n    from localreg import RBFnet\n    import numpy as np\n    import matplotlib.pyplot as plt\n\n    x = np.linspace(0,1,100)\n    y = np.zeros((len(x), 2))\n    y[:,0] = np.sin(2*np.pi*x)\n    y[:,1] = np.cos(2*np.pi*x)\n\n    net = RBFnet()\n    net.train(x, y)\n    yhat = net.predict(x)\n\n    plt.plot(x, y[:,0], 'C0', label='Ground truth')\n    plt.plot(x, y[:,1], 'C1', label='Ground truth')\n    plt.plot(x, yhat[:,0], ':k', label='Prediction')\n    plt.plot(x, yhat[:,1], ':k', label='Prediction')\n    plt.legend()\n    plt.show()\n\n.. image:: examples/rbf4.png\n\nExample 5: Matrix- or tensor-valued input and output\n~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~\nIt is also possible to use ``RBFnet`` on matrix- or tensor-valued input and output, although this requires the user to reshape the input and output. A matrix or a tensor input is nothing more than a multivariate input, with the inputs arranged in a particular shape. Thus, to use ``RBFnet``, use NumPy's ``reshape`` method to make it conform to ``RBFnet``. 
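For example (the shapes and the toy relation below are hypothetical), a batch of 3x3 matrix-valued samples can be flattened to the ``(N, d)`` layout used in the multivariate examples above, and predictions shaped back afterwards:

```python
import numpy as np

# Hypothetical data: N samples of 3x3 matrix-valued input and output.
N = 100
X = np.random.rand(N, 3, 3)    # matrix-valued input
Y = 2 * X                      # matrix-valued output (toy relation)

X_flat = X.reshape(N, -1)      # (N, 9): one multivariate sample per row
Y_flat = Y.reshape(N, -1)      # (N, 9): same flattening for the output

# ... net.train(X_flat, Y_flat); Y_hat_flat = net.predict(X_flat) ...

Y_hat = Y_flat.reshape(N, 3, 3)  # shape (stand-in) predictions back to matrices
print(Y_hat.shape)  # -> (100, 3, 3)
```
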
The output can likewise be reshaped prior to training, and be shaped back to a matrix or tensor after prediction.\n","funding_links":[],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fsigvaldm%2Flocalreg","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fsigvaldm%2Flocalreg","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fsigvaldm%2Flocalreg/lists"}