https://github.com/fluxml/flux.jl
Relax! Flux is the ML library that doesn't make you tensor
- Host: GitHub
- URL: https://github.com/fluxml/flux.jl
- Owner: FluxML
- License: other
- Created: 2016-04-01T21:11:05.000Z (over 9 years ago)
- Default Branch: master
- Last Pushed: 2025-04-15T01:40:24.000Z (3 months ago)
- Last Synced: 2025-05-14T22:05:04.911Z (2 months ago)
- Topics: data-science, deep-learning, flux, machine-learning, neural-networks, the-human-brain
- Language: Julia
- Homepage: https://fluxml.ai/
- Size: 12.3 MB
- Stars: 4,629
- Watchers: 92
- Forks: 611
- Open Issues: 271
- Metadata Files:
- Readme: README.md
- Changelog: NEWS.md
- Contributing: CONTRIBUTING.md
- Funding: .github/FUNDING.yml
- License: LICENSE.md
- Citation: CITATION.bib
README
[Documentation (stable)](https://fluxml.github.io/Flux.jl/stable/)
[Documentation (dev)](https://fluxml.github.io/Flux.jl/dev/)
[DOI](https://doi.org/10.21105/joss.00602) [Downloads](http://juliapkgstats.com/pkg/Flux)
[![][action-img]][action-url] [![][codecov-img]][codecov-url] [ColPrac](https://github.com/SciML/ColPrac)

[action-img]: https://github.com/FluxML/Flux.jl/workflows/CI/badge.svg
[action-url]: https://github.com/FluxML/Flux.jl/actions
[codecov-img]: https://codecov.io/gh/FluxML/Flux.jl/branch/master/graph/badge.svg
[codecov-url]: https://codecov.io/gh/FluxML/Flux.jl

Flux is an elegant approach to machine learning. It's a 100% pure-Julia stack, and provides lightweight abstractions on top of Julia's native GPU and AD support. Flux makes the easy things easy while remaining fully hackable.
Works best with [Julia 1.10](https://julialang.org/downloads/) or later. Here's a very short example to try it out:
```julia
using Flux
data = [(x, 2x-x^3) for x in -2:0.1f0:2]

model = let
    w, b, v = (randn(Float32, 23) for _ in 1:3)  # parameters
    x -> sum(v .* tanh.(w*x .+ b))               # callable
end
# model = Chain(vcat, Dense(1 => 23, tanh), Dense(23 => 1, bias=false), only)

opt_state = Flux.setup(Adam(), model)
for epoch in 1:100
    Flux.train!((m,x,y) -> (m(x) - y)^2, model, data, opt_state)
end

using Plots
plot(x -> 2x-x^3, -2, 2, label="truth")
scatter!(model, -2:0.1f0:2, label="learned")
```
In Flux 0.15, almost any parameterised function in Julia is a valid Flux model -- such as this closure over `w, b, v`. The same function can also be built from built-in layers, as in the commented-out `Chain` line above.

The [quickstart page](https://fluxml.ai/Flux.jl/stable/guide/models/quickstart/) has a longer example. See the [documentation](https://fluxml.github.io/Flux.jl/) for details, or the [model zoo](https://github.com/FluxML/model-zoo/) for examples. Ask questions on the [Julia discourse](https://discourse.julialang.org/) or [slack](https://discourse.julialang.org/t/announcing-a-julia-slack/4866).
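As a rough illustration of that equivalence, here is a minimal sketch of the built-in-layer version, trained with an explicit `Flux.gradient` / `Flux.update!` loop rather than `Flux.train!`. The `Chain` definition comes straight from the commented line in the example above; the explicit loop is an assumed, hand-written expansion of what `Flux.train!` does per data point, not part of the original snippet.

```julia
using Flux

# Same data as above: learn y = 2x - x^3 for scalar x in [-2, 2].
data = [(x, 2x - x^3) for x in -2:0.1f0:2]

# Built-in-layer version of the closure (the commented-out Chain above):
# `vcat` lifts the scalar input to a 1-element vector, and `only` turns the
# 1-element output back into a scalar.
model = Chain(vcat, Dense(1 => 23, tanh), Dense(23 => 1, bias=false), only)

opt_state = Flux.setup(Adam(), model)

for epoch in 1:100
    for (x, y) in data
        # Gradient of the squared error with respect to the model's parameters.
        grads = Flux.gradient(m -> (m(x) - y)^2, model)
        Flux.update!(opt_state, model, grads[1])
    end
end
```

Writing the loop out this way makes the role of Julia's AD explicit: `Flux.gradient` differentiates the loss through the whole `Chain`, and `Flux.update!` applies the optimiser state produced by `Flux.setup`.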
If you use Flux in your research, please [cite](CITATION.bib) our work.