{"id":14958164,"url":"https://github.com/tensorly/torch","last_synced_at":"2025-12-12T01:02:55.239Z","repository":{"id":37056162,"uuid":"320250939","full_name":"tensorly/torch","owner":"tensorly","description":"TensorLy-Torch: Deep Tensor Learning with TensorLy and PyTorch","archived":false,"fork":false,"pushed_at":"2024-06-09T17:34:47.000Z","size":4000,"stargazers_count":77,"open_issues_count":5,"forks_count":18,"subscribers_count":5,"default_branch":"main","last_synced_at":"2025-05-11T02:08:41.241Z","etag":null,"topics":["deep-learning","deep-neural-networks","deep-tensor-learning","factorized-cnns","factorized-convolution","factorized-networks","pytorch","tensor","tensor-contraction","tensor-convolution-networks","tensor-learning","tensor-networks","tensor-regression","tensor-regression-layers","tensorized-networks"],"latest_commit_sha":null,"homepage":"http://tensorly.org/torch/","language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"bsd-3-clause","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/tensorly.png","metadata":{"files":{"readme":"README.rst","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null}},"created_at":"2020-12-10T11:26:46.000Z","updated_at":"2025-04-22T08:30:53.000Z","dependencies_parsed_at":"2024-06-09T18:23:04.255Z","dependency_job_id":"d1e56a25-2755-49f5-af90-9d6867137479","html_url":"https://github.com/tensorly/torch","commit_stats":null,"previous_names":["tensorly/tensorly-torch"],"tags_count":4,"template":false,"template_full_name":null,"purl":"pkg:github/tensorly/torch","repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/tensorly%2Ftorch","tags_url":"https://repos.ecosyste.ms/api/v1/hos
ts/GitHub/repositories/tensorly%2Ftorch/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/tensorly%2Ftorch/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/tensorly%2Ftorch/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/tensorly","download_url":"https://codeload.github.com/tensorly/torch/tar.gz/refs/heads/main","sbom_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/tensorly%2Ftorch/sbom","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":261795334,"owners_count":23210620,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["deep-learning","deep-neural-networks","deep-tensor-learning","factorized-cnns","factorized-convolution","factorized-networks","pytorch","tensor","tensor-contraction","tensor-convolution-networks","tensor-learning","tensor-networks","tensor-regression","tensor-regression-layers","tensorized-networks"],"created_at":"2024-09-24T13:16:24.109Z","updated_at":"2025-12-12T01:02:50.220Z","avatar_url":"https://github.com/tensorly.png","language":"Python","readme":".. 
image:: https://badge.fury.io/py/tensorly-torch.svg\n    :target: https://badge.fury.io/py/tensorly-torch\n\n\n==============\nTensorLy-Torch\n==============\n\nTensorLy-Torch is a Python library for deep tensor networks that\nbuilds on top of `TensorLy \u003chttps://github.com/tensorly/tensorly/\u003e`_\nand `PyTorch \u003chttps://pytorch.org/\u003e`_.\nIt makes it easy to leverage tensor methods in a deep learning setting and comes with batteries included.\n\n- **Website:** http://tensorly.org/torch/\n- **Source code:** https://github.com/tensorly/torch\n\n\nWith TensorLy-Torch, you can easily:\n\n- **Work with tensor factorizations**: decomposing, manipulating and initializing tensor decompositions can be tricky. We take care of it all in a convenient, unified API.\n- **Leverage structure in your data**: with tensor layers such as Tensor Regression Layers and Factorized Convolutions, you can exploit the structure in your data.\n- **Use built-in tensor layers**: all you have to do is import TensorLy-Torch and include the layers we provide directly within your PyTorch models!\n- **Apply tensor hooks**: augment your architectures with our built-in Tensor Hooks. Robustify your network with Tensor Dropout and automatically select the rank end-to-end with L1 Regularization!\n- **Compare methods**: we are always adding more methods to make it easy to compare the performance of various deep tensor-based methods!\n\nDeep Tensorized Learning\n========================\n\nTensor methods generalize matrix algebraic operations to higher orders. Deep neural networks typically map between higher-order tensors.\nIn fact, it is the ability of deep convolutional neural networks to preserve and leverage local structure that, along with large datasets and efficient hardware, made the current levels of performance possible.\nTensor methods make it possible to further leverage and preserve that structure, for individual layers or whole networks.\n\n.. 
image:: ./doc/_static/tensorly-torch-pyramid.png\n\nTensorLy is a Python library that aims to make tensor learning simple and accessible.\nIt provides a high-level API for tensor methods, including core tensor operations, tensor decomposition and regression.\nIt has a flexible backend that allows running operations seamlessly using NumPy, PyTorch, TensorFlow, JAX, MXNet and CuPy.\n\n**TensorLy-Torch** is a PyTorch-only library that builds on top of TensorLy and provides out-of-the-box tensor layers.\n\nImprove your neural networks with tensor methods\n------------------------------------------------\n\nIn TensorLy-Torch, we provide convenient layers that do all the heavy lifting for you\nand offer the benefits of tensor-based layers wrapped in a clean, well-documented and tested API.\n\nFor instance, convolution layers of any order (2D, 3D or more) can be efficiently parametrized\nusing tensor decomposition. Using a CP decomposition results in a separable convolution,\nso you can replace your original convolution with a series of small, efficient ones:\n\n.. image:: ./doc/_static/cp-conv.png\n\nThese can be easily built with ``FactorizedConv`` in TensorLy-Torch.\nWe also have Tucker convolutions and new tensor-train convolutions,\nand we implement various other methods such as tensor regression and contraction layers,\ntensorized linear layers, tensor dropout and more!\n\n\nInstalling TensorLy-Torch\n=========================\n\nThrough pip\n-----------\n\n.. 
code::\n\n   pip install tensorly-torch\n\n\nFrom source\n-----------\n\n.. code::\n\n   git clone https://github.com/tensorly/torch\n   cd torch\n   pip install -e .\n","funding_links":[],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Ftensorly%2Ftorch","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Ftensorly%2Ftorch","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Ftensorly%2Ftorch/lists"}