https://github.com/lukks/neural-go
Genetic Neural Networks
- Host: GitHub
- URL: https://github.com/lukks/neural-go
- Owner: LuKks
- License: mit
- Created: 2019-11-29T00:49:19.000Z (almost 6 years ago)
- Default Branch: master
- Last Pushed: 2020-03-24T02:36:29.000Z (over 5 years ago)
- Last Synced: 2024-10-23T23:29:57.117Z (12 months ago)
- Topics: genetic, go, golang, networks, neural
- Language: Go
- Homepage:
- Size: 72.3 KB
- Stars: 15
- Watchers: 3
- Forks: 1
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
README
# neural-go
Genetic Neural Networks
[Go Report Card](https://goreportcard.com/report/github.com/LuKks/neural-go) · [GoDoc](https://godoc.org/github.com/LuKks/neural-go)
```golang
package main

import (
	"fmt"

	"github.com/lukks/neural-go/v3"
)

func main() {
	xor := neural.NewNeural([]*neural.Layer{
		{Inputs: 2, Units: 16},
		{Units: 16},
		{Units: 1},
	})

	for i := 0; i <= 5000; i++ {
		loss := xor.Learns([][][]float64{
			{{0, 0}, {0}},
			{{1, 0}, {1}},
			{{0, 1}, {1}},
			{{1, 1}, {0}},
		})

		if i%1000 == 0 {
			fmt.Printf("iter %v, loss %f\n", i, loss)
		}
	}

	fmt.Printf("think some values:\n")
	fmt.Printf("0, 0 [0] -> %f\n", xor.Think([]float64{0, 0}))
	fmt.Printf("1, 0 [1] -> %f\n", xor.Think([]float64{1, 0}))
	fmt.Printf("0, 1 [1] -> %f\n", xor.Think([]float64{0, 1}))
	fmt.Printf("1, 1 [0] -> %f\n", xor.Think([]float64{1, 1}))
}
```

## Install latest version
```
go get github.com/lukks/neural-go/v3
```

Also find versions on [releases](https://github.com/LuKks/neural-go/releases).
The changes from v2 to v3 were only for Go module versioning.

## Features
#### Range
Set a range of values for every input and output.\
You keep working in your own units while the network internally operates on raw activations.\
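Conceptually, ranged I/O is a linear rescaling between user units and raw activations. This is a self-contained sketch of that idea, not the library's internal code:

```golang
package main

import "fmt"

// scale maps v from [lo, hi] to a raw activation in [0, 1];
// unscale inverts it. A ranged layer does this transparently.
func scale(v, lo, hi float64) float64   { return (v - lo) / (hi - lo) }
func unscale(a, lo, hi float64) float64 { return a*(hi-lo) + lo }

func main() {
	raw := scale(128, 0, 255) // an RGB channel mapped into [0, 1]
	fmt.Printf("raw %.4f, back %.1f\n", raw, unscale(raw, 0, 255))
}
```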
Check [examples/rgb.go](https://github.com/LuKks/neural-go/blob/master/examples/rgb.go) for a usage example.

#### Customizable
Set different activations, learning rates, momentums, etc. at the layer level.
- Activation: `linear`, `sigmoid` (default), `tanh` and `relu`
- Learning Rate
- Optimizer by Momentum
- Loss: for output layer, only `mse` for now
- Range: for input and output layers

Check [examples/layers.go](https://github.com/LuKks/neural-go/blob/master/examples/layers.go) for a complete example.
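For reference, the four activations listed above as plain functions (mirroring their standard definitions, not code from the library):

```golang
package main

import (
	"fmt"
	"math"
)

// Standard definitions of the supported activations.
func linear(x float64) float64  { return x }
func sigmoid(x float64) float64 { return 1 / (1 + math.Exp(-x)) }
func tanh(x float64) float64    { return math.Tanh(x) }
func relu(x float64) float64    { return math.Max(0, x) }

func main() {
	for _, x := range []float64{-1, 0, 1} {
		fmt.Printf("x=%4.1f linear=%5.2f sigmoid=%.2f tanh=%5.2f relu=%.2f\n",
			x, linear(x), sigmoid(x), tanh(x), relu(x))
	}
}
```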
#### Genetics
Clone, mutate, and cross over neurons, layers, and whole networks.\
The `Evolve` method uses these operations internally to make this easy.\
Check [examples/evolve.go](https://github.com/LuKks/neural-go/blob/master/examples/evolve.go); genetics is optional and not always needed.

#### Utils
There are several useful methods: Export, Import, Reset, ToFile, FromFile, etc.\
Check the [documentation here](https://godoc.org/github.com/LuKks/neural-go).

#### Description
From my previous [neural-amxx](https://github.com/LuKks/neural-amxx).

## Examples
Basic XOR [examples/xor.go](https://github.com/LuKks/neural-go/blob/master/examples/xor.go)\
RGB brightness [examples/rgb.go](https://github.com/LuKks/neural-go/blob/master/examples/rgb.go)\
Genetics [examples/evolve.go](https://github.com/LuKks/neural-go/blob/master/examples/evolve.go)\
Layer configs [examples/layers.go](https://github.com/LuKks/neural-go/blob/master/examples/layers.go)\
Persist [examples/persist.go](https://github.com/LuKks/neural-go/blob/master/examples/persist.go)

```
go run examples/rgb.go
```

## Tests
```
There are no tests yet
```

## Issues
Feedback, ideas, etc. are very welcome, so feel free to open an issue.

## License
Code released under the [MIT License](https://github.com/LuKks/neural-go/blob/master/LICENSE).