Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/openxla/xla
A machine learning compiler for GPUs, CPUs, and ML accelerators
Last synced: 3 days ago
- Host: GitHub
- URL: https://github.com/openxla/xla
- Owner: openxla
- License: apache-2.0
- Created: 2022-08-09T15:30:01.000Z (over 2 years ago)
- Default Branch: main
- Last Pushed: 2024-10-29T09:17:30.000Z (about 2 months ago)
- Last Synced: 2024-10-29T09:23:44.807Z (about 2 months ago)
- Language: C++
- Homepage:
- Size: 211 MB
- Stars: 2,668
- Watchers: 41
- Forks: 427
- Open Issues: 2,179
Metadata Files:
- Readme: README.md
- Contributing: CONTRIBUTING.md
- License: LICENSE
- Codeowners: .github/CODEOWNERS
Awesome Lists containing this project
- awesome-approximate-dnn - OpenXLA - XLA (Accelerated Linear Algebra) is an open-source machine learning (ML) compiler for GPUs, CPUs, and ML accelerators. The XLA compiler takes models from popular ML frameworks such as PyTorch, TensorFlow, and JAX, and optimizes them for high-performance execution across different hardware platforms including GPUs, CPUs, and ML accelerators. (Tools / Graph Compiler)
- StarryDivineSky - openxla/xla
README
# XLA
XLA (Accelerated Linear Algebra) is an open-source machine learning (ML)
compiler for GPUs, CPUs, and ML accelerators.
The XLA compiler takes models from popular ML frameworks such as PyTorch,
TensorFlow, and JAX, and optimizes them for high-performance execution across
different hardware platforms including GPUs, CPUs, and ML accelerators.

## Get started
If you want to use XLA to compile your ML project, refer to the corresponding
documentation for your ML framework:

* [PyTorch](https://pytorch.org/xla)
* [TensorFlow](https://www.tensorflow.org/xla)
* [JAX](https://jax.readthedocs.io/en/latest/notebooks/quickstart.html)
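As a concrete illustration (a minimal sketch, not taken from this repository), the snippet below shows how one of these frameworks, JAX, hands a Python function to XLA for compilation via `jax.jit`. It assumes only that the `jax` package is installed; the toy model itself is hypothetical.

```python
# Minimal sketch: JAX stages Python/NumPy-style code out to XLA,
# which compiles it for the available backend (CPU, GPU, or TPU).
import jax
import jax.numpy as jnp

def predict(params, x):
    # A toy linear layer; XLA fuses and optimizes these ops when compiling.
    w, b = params
    return jnp.tanh(x @ w + b)

# jax.jit traces `predict` and compiles the traced computation with XLA.
compiled_predict = jax.jit(predict)

params = (jnp.ones((4, 3)), jnp.zeros(3))
x = jnp.ones((2, 4))
print(compiled_predict(params, x))
```

To inspect what the compiler produces, the `XLA_FLAGS` environment variable (for example `--xla_dump_to=/some/dir`) can be set before running the program to dump XLA's intermediate (HLO) output; the exact flags available depend on your XLA build.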
If you're not contributing code to the XLA compiler, you don't need to clone and
build this repo. Everything here is intended for XLA contributors who want to
develop the compiler and XLA integrators who want to debug or add support for ML
frontends and hardware backends.

## Contribute
If you'd like to contribute to XLA, review
[How to Contribute](docs/contributing.md) and then see the
[developer guide](docs/developer_guide.md).

## Contacts
* For questions, contact the maintainers: maintainers at openxla.org
## Resources
* [Community Resources](https://github.com/openxla/community)
## Code of Conduct
While under TensorFlow governance, all community spaces for SIG OpenXLA are
subject to the
[TensorFlow Code of Conduct](https://github.com/tensorflow/tensorflow/blob/master/CODE_OF_CONDUCT.md).