Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/dmlc/mxnet
Lightweight, Portable, Flexible Distributed/Mobile Deep Learning with Dynamic, Mutation-aware Dataflow Dep Scheduler; for Python, R, Julia, Scala, Go, Javascript and more
mxnet
- Host: GitHub
- URL: https://github.com/dmlc/mxnet
- Owner: apache
- License: apache-2.0
- Archived: true
- Created: 2015-04-30T16:21:15.000Z (over 9 years ago)
- Default Branch: master
- Last Pushed: 2023-10-25T21:28:33.000Z (about 1 year ago)
- Last Synced: 2024-05-21T07:21:07.004Z (6 months ago)
- Topics: mxnet
- Language: C++
- Homepage: https://mxnet.apache.org
- Size: 96.6 MB
- Stars: 20,721
- Watchers: 1,069
- Forks: 6,813
- Open Issues: 2,010
Metadata Files:
- Readme: README.md
- License: LICENSE
- Code of conduct: CODE_OF_CONDUCT.md
- Codeowners: CODEOWNERS
- Security: SECURITY.md
Awesome Lists containing this project
- awesome-julia-datasciences - MXNet - Lightweight, Portable, Flexible Distributed/Mobile Deep Learning with Dynamic, Mutation-aware Dataflow Dep Scheduler; for Python, R, Julia, Go, Javascript and more. (APL / General-Purpose Machine Learning)
- Awesome-Distributed-Deep-Learning - MXNet - Lightweight, Portable, Flexible Distributed/Mobile Deep Learning with Dynamic, Mutation-aware Dataflow Dep Scheduler; for Python, R, Julia, Go, Javascript and more. (Frameworks)
README
[![banner](https://raw.githubusercontent.com/dmlc/web-data/master/mxnet/image/banner.png)](https://mxnet.apache.org)
Apache MXNet for Deep Learning
===========================================
[![GitHub release (latest SemVer)](https://img.shields.io/github/v/release/apache/mxnet)](https://github.com/apache/mxnet/releases) [![GitHub stars](https://img.shields.io/github/stars/apache/mxnet)](https://github.com/apache/mxnet/stargazers) [![GitHub forks](https://img.shields.io/github/forks/apache/mxnet)](https://github.com/apache/mxnet/network) [![GitHub contributors](https://img.shields.io/github/contributors-anon/apache/mxnet)](https://github.com/apache/mxnet/graphs/contributors) [![GitHub issues](https://img.shields.io/github/issues/apache/mxnet)](https://github.com/apache/mxnet/issues) [![good first issue](https://img.shields.io/github/issues/apache/mxnet/good%20first%20issue)](https://github.com/apache/mxnet/labels/good%20first%20issue) [![GitHub pull requests by-label](https://img.shields.io/github/issues-pr/apache/mxnet/pr-awaiting-review)](https://github.com/apache/mxnet/labels/pr-awaiting-review) [![GitHub license](https://img.shields.io/github/license/apache/mxnet)](https://github.com/apache/mxnet/blob/master/LICENSE) [![Twitter](https://img.shields.io/twitter/url?style=social&url=https%3A%2F%2Fgithub.com%2Fapache%2Fmxnet)](https://twitter.com/intent/tweet?text=Wow:%20https%3A%2F%2Fgithub.com%2Fapache%2Fmxnet%20@ApacheMXNet) [![Twitter Follow](https://img.shields.io/twitter/follow/ApacheMXNet?style=social)](https://twitter.com/ApacheMXNet)

Apache MXNet is a deep learning framework designed for both *efficiency* and *flexibility*.
It allows you to ***mix*** [symbolic and imperative programming](https://mxnet.apache.org/api/architecture/program_model)
to ***maximize*** efficiency and productivity.
At its core, MXNet contains a dynamic dependency scheduler that automatically parallelizes both symbolic and imperative operations on the fly.
A graph optimization layer on top of that makes symbolic execution fast and memory efficient.
MXNet is portable and lightweight, scalable to many GPUs and machines.

Apache MXNet is more than a deep learning project. It is a [community](https://mxnet.apache.org/versions/master/community)
on a mission of democratizing AI. It is a collection of [blue prints and guidelines](https://mxnet.apache.org/api/architecture/overview)
for building deep learning systems, and interesting insights into DL systems for hackers.

Licensed under an [Apache-2.0](https://github.com/apache/mxnet/blob/master/LICENSE) license.
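
For example, with the Python Gluon API a model can be written and debugged imperatively, then `hybridize()`d so the dependency scheduler and graph optimization layer described above can run it as a symbolic graph. A minimal sketch (layer widths and the input shape are arbitrary placeholders):

```python
# Minimal Gluon sketch: define and run a network imperatively, then hybridize it
# so MXNet traces it into a symbolic graph for faster, memory-efficient execution.
# Layer widths and the input shape below are arbitrary placeholders.
import mxnet as mx
from mxnet.gluon import nn

net = nn.HybridSequential()
net.add(nn.Dense(128, activation='relu'),
        nn.Dense(10))
net.initialize()

x = mx.nd.random.uniform(shape=(4, 784))
y_eager = net(x)       # imperative: executed line by line, easy to debug

net.hybridize()        # switch to symbolic execution on the next forward pass
y_graph = net(x)       # same computation, now run through the graph executor
```
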
| Branch | Build Status |
|:-------:|:-------------:|
| [master](https://github.com/apache/mxnet/tree/master) | [![CentOS CPU Build Status](http://jenkins.mxnet-ci.com/job/mxnet-validation/job/centos-cpu/job/master/badge/icon?subject=build%20centos%20cpu)](http://jenkins.mxnet-ci.com/job/mxnet-validation/job/centos-cpu/job/master/) [![CentOS GPU Build Status](http://jenkins.mxnet-ci.com/job/mxnet-validation/job/centos-gpu/job/master/badge/icon?subject=build%20centos%20gpu)](http://jenkins.mxnet-ci.com/job/mxnet-validation/job/centos-gpu/job/master/) [![Clang Build Status](http://jenkins.mxnet-ci.com/job/mxnet-validation/job/clang/job/master/badge/icon?subject=build%20clang)](http://jenkins.mxnet-ci.com/job/mxnet-validation/job/clang/job/master/)
[![Edge Build Status](http://jenkins.mxnet-ci.com/job/mxnet-validation/job/edge/job/master/badge/icon?subject=build%20edge)](http://jenkins.mxnet-ci.com/job/mxnet-validation/job/edge/job/master/) [![Miscellaneous Build Status](http://jenkins.mxnet-ci.com/job/mxnet-validation/job/miscellaneous/job/master/badge/icon?subject=build%20miscellaneous)](http://jenkins.mxnet-ci.com/job/mxnet-validation/job/miscellaneous/job/master/) [![Sanity Build Status](http://jenkins.mxnet-ci.com/job/mxnet-validation/job/sanity/job/master/badge/icon?subject=build%20sanity)](http://jenkins.mxnet-ci.com/job/mxnet-validation/job/sanity/job/master/)
[![Unix CPU Build Status](http://jenkins.mxnet-ci.com/job/mxnet-validation/job/unix-cpu/job/master/badge/icon?subject=build%20unix%20cpu)](http://jenkins.mxnet-ci.com/job/mxnet-validation/job/unix-cpu/job/master/) [![Unix GPU Build Status](http://jenkins.mxnet-ci.com/job/mxnet-validation/job/unix-gpu/job/master/badge/icon?subject=build%20unix%20gpu)](http://jenkins.mxnet-ci.com/job/mxnet-validation/job/unix-gpu/job/master/) [![Website Build Status](http://jenkins.mxnet-ci.com/job/mxnet-validation/job/website/job/master/badge/icon?subject=build%20website)](http://jenkins.mxnet-ci.com/job/mxnet-validation/job/website/job/master/)
[![Windows CPU Build Status](http://jenkins.mxnet-ci.com/job/mxnet-validation/job/windows-cpu/job/master/badge/icon?subject=build%20windows%20cpu)](http://jenkins.mxnet-ci.com/job/mxnet-validation/job/windows-cpu/job/master/) [![Windows GPU Build Status](http://jenkins.mxnet-ci.com/job/mxnet-validation/job/windows-gpu/job/master/badge/icon?subject=build%20windows%20gpu)](http://jenkins.mxnet-ci.com/job/mxnet-validation/job/windows-gpu/job/master/) [![Documentation Status](http://jenkins.mxnet-ci.com/job/restricted-website-build/badge/icon)](https://mxnet.apache.org/) |
| [v1.x](https://github.com/apache/mxnet/tree/v1.x) | [![CentOS CPU Build Status](http://jenkins.mxnet-ci.com/job/mxnet-validation/job/centos-cpu/job/v1.x/badge/icon?subject=build%20centos%20cpu)](http://jenkins.mxnet-ci.com/job/mxnet-validation/job/centos-cpu/job/v1.x/) [![CentOS GPU Build Status](http://jenkins.mxnet-ci.com/job/mxnet-validation/job/centos-gpu/job/v1.x/badge/icon?subject=build%20centos%20gpu)](http://jenkins.mxnet-ci.com/job/mxnet-validation/job/centos-gpu/job/v1.x/) [![Clang Build Status](http://jenkins.mxnet-ci.com/job/mxnet-validation/job/clang/job/v1.x/badge/icon?subject=build%20clang)](http://jenkins.mxnet-ci.com/job/mxnet-validation/job/clang/job/v1.x/)
[![Edge Build Status](http://jenkins.mxnet-ci.com/job/mxnet-validation/job/edge/job/v1.x/badge/icon?subject=build%20edge)](http://jenkins.mxnet-ci.com/job/mxnet-validation/job/edge/job/v1.x/) [![Miscellaneous Build Status](http://jenkins.mxnet-ci.com/job/mxnet-validation/job/miscellaneous/job/v1.x/badge/icon?subject=build%20miscellaneous)](http://jenkins.mxnet-ci.com/job/mxnet-validation/job/miscellaneous/job/v1.x/) [![Sanity Build Status](http://jenkins.mxnet-ci.com/job/mxnet-validation/job/sanity/job/v1.x/badge/icon?subject=build%20sanity)](http://jenkins.mxnet-ci.com/job/mxnet-validation/job/sanity/job/v1.x/)
[![Unix CPU Build Status](http://jenkins.mxnet-ci.com/job/mxnet-validation/job/unix-cpu/job/v1.x/badge/icon?subject=build%20unix%20cpu)](http://jenkins.mxnet-ci.com/job/mxnet-validation/job/unix-cpu/job/v1.x/) [![Unix GPU Build Status](http://jenkins.mxnet-ci.com/job/mxnet-validation/job/unix-gpu/job/v1.x/badge/icon?subject=build%20unix%20gpu)](http://jenkins.mxnet-ci.com/job/mxnet-validation/job/unix-gpu/job/v1.x/) [![Website Build Status](http://jenkins.mxnet-ci.com/job/mxnet-validation/job/website/job/v1.x/badge/icon?subject=build%20website)](http://jenkins.mxnet-ci.com/job/mxnet-validation/job/website/job/v1.x/)
[![Windows CPU Build Status](http://jenkins.mxnet-ci.com/job/mxnet-validation/job/windows-cpu/job/v1.x/badge/icon?subject=build%20windows%20cpu)](http://jenkins.mxnet-ci.com/job/mxnet-validation/job/windows-cpu/job/v1.x/) [![Windows GPU Build Status](http://jenkins.mxnet-ci.com/job/mxnet-validation/job/windows-gpu/job/v1.x/badge/icon?subject=build%20windows%20gpu)](http://jenkins.mxnet-ci.com/job/mxnet-validation/job/windows-gpu/job/v1.x/) [![Documentation Status](http://jenkins.mxnet-ci.com/job/restricted-website-build/badge/icon)](https://mxnet.apache.org/) |

Features
--------
* NumPy-like programming interface, integrated with the new, easy-to-use Gluon 2.0 interface, so NumPy users can easily adopt MXNet and get started with deep learning (see the short sketch after this list).
* Automatic hybridization provides imperative programming with the performance of traditional symbolic programming.
* Lightweight, memory-efficient, and portable to smart devices through native cross-compilation support on ARM, and through ecosystem projects such as [TVM](https://tvm.ai), [TensorRT](https://docs.nvidia.com/deeplearning/tensorrt/developer-guide/index.html), and [OpenVINO](https://software.intel.com/content/www/us/en/develop/tools/openvino-toolkit.html).
* Scales to multiple GPUs and distributed settings with automatic parallelism through [ps-lite](https://github.com/dmlc/ps-lite), [Horovod](https://github.com/horovod/horovod), and [BytePS](https://github.com/bytedance/byteps).
* Extensible backend that supports full customization, allowing integration with custom accelerator libraries and in-house hardware without the need to maintain a fork.
* Support for [Python](https://mxnet.apache.org/api/python), [Java](https://mxnet.apache.org/api/java), [C++](https://mxnet.apache.org/api/cpp), [R](https://mxnet.apache.org/api/r), [Scala](https://mxnet.apache.org/api/scala), [Clojure](https://mxnet.apache.org/api/clojure), [Go](https://github.com/jdeng/gomxnet/), [Javascript](https://github.com/dmlc/mxnet.js/), [Perl](https://mxnet.apache.org/api/perl), and [Julia](https://mxnet.apache.org/api/julia).
* Cloud-friendly and directly compatible with AWS and Azure.
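
As a concrete illustration of the NumPy-like interface mentioned in the first bullet above, here is a minimal sketch (assuming MXNet 1.6 or newer, with the NumPy-compatible mode switched on via `npx.set_np()`; shapes and values are arbitrary placeholders):

```python
# Minimal sketch of MXNet's NumPy-compatible interface plus autograd.
# Assumes MXNet >= 1.6; shapes and values are arbitrary placeholders.
from mxnet import np, npx, autograd
npx.set_np()  # enable NumPy-compatible array semantics

x = np.arange(12).reshape(3, 4).astype('float32')
w = np.random.uniform(size=(4, 2))

w.attach_grad()                  # track gradients for w
with autograd.record():
    loss = np.dot(x, w).sum()    # ordinary NumPy-style expression
loss.backward()                  # gradients flow back into w.grad

print(w.grad)                    # same shape as w
```
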

Contents
--------
* [Installation](https://mxnet.apache.org/get_started)
* [Tutorials](https://mxnet.apache.org/api/python/docs/tutorials/)
* [Ecosystem](https://mxnet.apache.org/ecosystem)
* [API Documentation](https://mxnet.apache.org/api)
* [Examples](https://github.com/apache/mxnet-examples)
* [Stay Connected](#stay-connected)
* [Social Media](#social-media)

What's New
----------
* [1.9.1 Release](https://github.com/apache/mxnet/releases/tag/1.9.1) - MXNet 1.9.1 Release.
* [1.8.0 Release](https://github.com/apache/mxnet/releases/tag/1.8.0) - MXNet 1.8.0 Release.
* [1.7.0 Release](https://github.com/apache/mxnet/releases/tag/1.7.0) - MXNet 1.7.0 Release.
* [1.6.0 Release](https://github.com/apache/mxnet/releases/tag/1.6.0) - MXNet 1.6.0 Release.
* [1.5.1 Release](https://github.com/apache/mxnet/releases/tag/1.5.1) - MXNet 1.5.1 Patch Release.
* [1.5.0 Release](https://github.com/apache/mxnet/releases/tag/1.5.0) - MXNet 1.5.0 Release.
* [1.4.1 Release](https://github.com/apache/mxnet/releases/tag/1.4.1) - MXNet 1.4.1 Patch Release.
* [1.4.0 Release](https://github.com/apache/mxnet/releases/tag/1.4.0) - MXNet 1.4.0 Release.
* [1.3.1 Release](https://github.com/apache/mxnet/releases/tag/1.3.1) - MXNet 1.3.1 Patch Release.
* [1.3.0 Release](https://github.com/apache/mxnet/releases/tag/1.3.0) - MXNet 1.3.0 Release.
* [1.2.0 Release](https://github.com/apache/mxnet/releases/tag/1.2.0) - MXNet 1.2.0 Release.
* [1.1.0 Release](https://github.com/apache/mxnet/releases/tag/1.1.0) - MXNet 1.1.0 Release.
* [1.0.0 Release](https://github.com/apache/mxnet/releases/tag/1.0.0) - MXNet 1.0.0 Release.
* [0.12.1 Release](https://github.com/apache/mxnet/releases/tag/0.12.1) - MXNet 0.12.1 Patch Release.
* [0.12.0 Release](https://github.com/apache/mxnet/releases/tag/0.12.0) - MXNet 0.12.0 Release.
* [0.11.0 Release](https://github.com/apache/mxnet/releases/tag/0.11.0) - MXNet 0.11.0 Release.
* [Apache Incubator](http://incubator.apache.org/projects/mxnet.html) - We are now an Apache Incubator project.
* [0.10.0 Release](https://github.com/apache/mxnet/releases/tag/v0.10.0) - MXNet 0.10.0 Release.
* [0.9.3 Release](./docs/architecture/release_note_0_9.md) - First 0.9 official release.
* [0.9.1 Release (NNVM refactor)](./docs/architecture/release_note_0_9.md) - NNVM branch is merged into master now. An official release will be made soon.
* [0.8.0 Release](https://github.com/apache/mxnet/releases/tag/v0.8.0)

### Ecosystem News
* [oneDNN for Faster CPU Performance](docs/python_docs/python/tutorials/performance/backend/dnnl/dnnl_readme.md)
* [MXNet Memory Monger, Training Deeper Nets with Sublinear Memory Cost](https://github.com/dmlc/mxnet-memonger)
* [Tutorial for NVidia GTC 2016](https://github.com/dmlc/mxnet-gtc-tutorial)
* [MXNet.js: Javascript Package for Deep Learning in Browser (without server)](https://github.com/dmlc/mxnet.js/)
* [Guide to Creating New Operators (Layers)](https://mxnet.apache.org/api/faq/new_op)
* [Go binding for inference](https://github.com/songtianyi/go-mxnet-predictor)

Stay Connected
--------------

| Channel | Purpose |
|---|---|
| [Follow MXNet Development on Github](https://github.com/apache/mxnet/issues) | See what's going on in the MXNet project. |
| [MXNet Confluence Wiki for Developers](https://cwiki.apache.org/confluence/display/MXNET/Apache+MXNet+Home) | MXNet developer wiki with information related to project development, maintained by contributors and developers. To request write access, [send a request to the dev list](mailto:[email protected]?subject=Requesting%20CWiki%20write%20access). |
| [[email protected] mailing list](https://lists.apache.org/[email protected]) | The "dev list". Discussions about the development of MXNet. To subscribe, send an email to [[email protected]](mailto:[email protected]). |
| [discuss.mxnet.io](https://discuss.mxnet.io) | Asking & answering MXNet usage questions. |
| [Apache Slack #mxnet Channel](https://the-asf.slack.com/archives/C7FN4FCP9) | Connect with MXNet and other Apache developers. To join the MXNet Slack channel, [send a request to the dev list](mailto:[email protected]?subject=Requesting%20slack%20access). |
| [Follow MXNet on Social Media](#social-media) | Get updates about new features and events. |

### Social Media
Keep connected with the latest MXNet news and updates.
Contributor and user blogs about MXNet
History
-------
MXNet emerged from a collaboration by the authors of [cxxnet](https://github.com/dmlc/cxxnet), [minerva](https://github.com/dmlc/minerva), and [purine2](https://github.com/purine/purine2). The project reflects what we have learned from the past projects. MXNet combines aspects of each of these projects to achieve flexibility, speed, and memory efficiency.

Tianqi Chen, Mu Li, Yutian Li, Min Lin, Naiyan Wang, Minjie Wang, Tianjun Xiao,
Bing Xu, Chiyuan Zhang, and Zheng Zhang.
[MXNet: A Flexible and Efficient Machine Learning Library for Heterogeneous Distributed Systems](https://github.com/dmlc/web-data/raw/master/mxnet/paper/mxnet-learningsys.pdf).
In Neural Information Processing Systems, Workshop on Machine Learning Systems, 2015.