Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/mattnedrich/GradientDescentExample
Example demonstrating how gradient descent may be used to solve a linear regression problem
- Host: GitHub
- URL: https://github.com/mattnedrich/GradientDescentExample
- Owner: mattnedrich
- License: mit
- Created: 2015-03-10T03:10:52.000Z (over 9 years ago)
- Default Branch: master
- Last Pushed: 2023-02-24T13:53:02.000Z (over 1 year ago)
- Last Synced: 2024-08-01T16:37:14.859Z (3 months ago)
- Language: Python
- Size: 9.77 MB
- Stars: 536
- Watchers: 23
- Forks: 299
- Open Issues: 2
Metadata Files:
- Readme: readme.md
- License: LICENSE
README
## Gradient Descent Example for Linear Regression
This example project demonstrates how the [gradient descent](http://en.wikipedia.org/wiki/Gradient_descent) algorithm may be used to solve a [linear regression](http://en.wikipedia.org/wiki/Linear_regression) problem. A more detailed description of this example can be found [here](https://spin.atomicobject.com/2014/06/24/gradient-descent-linear-regression/).

### Code Requirements
The example code is in Python ([version 2.6](https://www.python.org/doc/versions/) or higher will work). The only other requirement is [NumPy](http://www.numpy.org/).

### Description
This code demonstrates how a gradient descent search may be used to solve the linear regression problem of fitting a line to a set of points. In this problem, we wish to model a set of points using a line. The line model is defined by two parameters: the line's slope `m` and its y-intercept `b`. Gradient descent attempts to find the best values for these parameters with respect to an error function.

The code contains a main function called `run`. This function defines a set of parameters used in the gradient descent algorithm, including an initial guess of the line slope and y-intercept, the learning rate to use, and the number of iterations to run gradient descent for.
```python
initial_b = 0 # initial y-intercept guess
initial_m = 0 # initial slope guess
num_iterations = 1000
```

Using these parameters, a gradient descent search is executed on a sample data set of 100 points. Here is a visualization of the search running for 200 iterations using an initial guess of `m = 0`, `b = 0`, and a learning rate of `0.000005`.
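The error function and the per-iteration update described above can be sketched as follows. The function names and the mean-squared-error formulation here are illustrative assumptions, not necessarily the exact code in the repository:

```python
import numpy as np

def compute_error(b, m, points):
    """Mean squared error of the line y = m*x + b over an (N, 2) array of points."""
    x, y = points[:, 0], points[:, 1]
    return np.mean((y - (m * x + b)) ** 2)

def step_gradient(b, m, points, learning_rate):
    """One gradient descent step: move b and m opposite the error gradient."""
    x, y = points[:, 0], points[:, 1]
    n = len(points)
    residual = y - (m * x + b)
    # Partial derivatives of the mean squared error with respect to b and m
    b_gradient = -(2.0 / n) * np.sum(residual)
    m_gradient = -(2.0 / n) * np.sum(x * residual)
    return b - learning_rate * b_gradient, m - learning_rate * m_gradient
```

Each step moves `b` and `m` a small distance, scaled by the learning rate, in the direction that decreases the error.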
### Execution
To run the example, simply run the `gradient_descent_example.py` file using Python:

```
python gradient_descent_example.py
```

The output will look like this:
```
Starting gradient descent at b = 0, m = 0, error = 5565.10783448
Running...
After 1000 iterations b = 0.0889365199374, m = 1.47774408519, error = 112.614810116
```
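A self-contained version of this run loop, on synthetic data, might look like the sketch below. The repository's actual data file and parameter values may differ, so the numbers printed will not match the output above:

```python
import numpy as np

def run_gradient_descent(points, learning_rate=0.0001, num_iterations=1000):
    """Fit y = m*x + b to an (N, 2) array of points by gradient descent on MSE."""
    x, y = points[:, 0], points[:, 1]
    n = len(points)
    b, m = 0.0, 0.0  # initial y-intercept and slope guesses

    def error():
        return np.mean((y - (m * x + b)) ** 2)

    print("Starting gradient descent at b = {}, m = {}, error = {}".format(b, m, error()))
    print("Running...")
    for _ in range(num_iterations):
        residual = y - (m * x + b)
        b -= learning_rate * (-(2.0 / n) * np.sum(residual))
        m -= learning_rate * (-(2.0 / n) * np.sum(x * residual))
    print("After {} iterations b = {}, m = {}, error = {}".format(num_iterations, b, m, error()))
    return b, m

# Synthetic data roughly following y = 1.5x plus noise, standing in for the
# repository's sample data set
rng = np.random.default_rng(0)
xs = rng.uniform(20, 70, size=100)
ys = 1.5 * xs + rng.normal(0, 10, size=100)
run_gradient_descent(np.column_stack([xs, ys]))
```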