Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/patrick-kidger/signatory
Differentiable computations of the signature and logsignature transforms, on both CPU and GPU. (ICLR 2021)
- Host: GitHub
- URL: https://github.com/patrick-kidger/signatory
- Owner: patrick-kidger
- License: apache-2.0
- Created: 2019-06-11T18:43:15.000Z (over 5 years ago)
- Default Branch: master
- Last Pushed: 2024-01-11T16:33:10.000Z (12 months ago)
- Last Synced: 2024-12-13T20:11:18.418Z (12 days ago)
- Topics: deep-learning, deep-neural-networks, logsignature, logsignatures, machine-learning, pytorch, rough-paths, signature, signatures
- Language: C++
- Homepage:
- Size: 1.4 MB
- Stars: 263
- Watchers: 12
- Forks: 35
- Open Issues: 20
Metadata Files:
- Readme: README.rst
- Changelog: CHANGELOG.txt
- Contributing: .github/CONTRIBUTING.md
- Funding: .github/FUNDING.yml
- License: LICENSE
README
|Signatory|
###########

.. |Signatory| image:: https://raw.githubusercontent.com/patrick-kidger/signatory/master/docs/_static/signatory.png
Differentiable computations of the signature and logsignature transforms, on both CPU and GPU.
What is the signature transform?
################################
The *signature transform* is roughly analogous to the Fourier transform, in that it operates on a stream of data (often a time series). Whilst the Fourier transform extracts information about frequency, the signature transform extracts information about *order* and *area*. Furthermore (and unlike the Fourier transform), order and area represent all possible nonlinear effects: the signature transform is a *universal nonlinearity*, meaning that every continuous function of the input stream may be approximated arbitrarily well by a *linear* function of its signature. If you're doing machine learning then you probably understand why this is such a desirable property!

Besides this, the signature transform has many other nice properties -- robustness to missing or irregularly sampled data; optional translation invariance; optional sampling invariance. Furthermore it can be used to encode certain physical quantities, and may be used for data compression.
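The "order and area" intuition can be made concrete at truncation depth 2 with a few lines of plain Python (a minimal sketch for illustration only, not Signatory's implementation; the function name ``depth2_signature`` is invented here):

.. code-block:: python

    from itertools import product

    def depth2_signature(points):
        """Depth-2 signature of the piecewise-linear path through ``points``.

        Returns ``(level1, level2)``: ``level1[i]`` is the total increment in
        channel ``i``, and ``level2[(i, j)]`` is the iterated integral of
        channel ``i`` against channel ``j``, accumulated segment by segment.
        """
        channels = len(points[0])
        level1 = [0.0] * channels
        level2 = {ij: 0.0 for ij in product(range(channels), repeat=2)}
        for start, end in zip(points, points[1:]):
            dx = [e - s for s, e in zip(start, end)]
            for i, j in product(range(channels), repeat=2):
                # increment-so-far times the new increment, plus the
                # segment's own triangular area term
                level2[(i, j)] += level1[i] * dx[j] + 0.5 * dx[i] * dx[j]
            for i in range(channels):
                level1[i] += dx[i]
        return level1, level2

    # Two paths with identical total increments, traversed in opposite order:
    right_then_up = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0)]
    up_then_right = [(0.0, 0.0), (0.0, 1.0), (1.0, 1.0)]
    _, s2_a = depth2_signature(right_then_up)
    _, s2_b = depth2_signature(up_then_right)
    # The antisymmetric part of level 2 (the Levy area) flips sign when the
    # order of traversal is reversed -- information about *order* that a
    # Fourier transform of the increments cannot see.
    area_a = 0.5 * (s2_a[(0, 1)] - s2_a[(1, 0)])  # +0.5
    area_b = 0.5 * (s2_b[(0, 1)] - s2_b[(1, 0)])  # -0.5

Both paths end at the same point, yet their level-2 signatures differ; this is exactly the extra order/area information described above.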
Check out `this `__ for a primer on the use of the signature transform in machine learning, just as a feature transformation, and `this `__ for a more in-depth look at integrating the signature transform into neural networks.
Installation
############

.. code-block:: bash

    pip install signatory==<SIGNATORY_VERSION>.<TORCH_VERSION> --no-cache-dir --force-reinstall

where ``<SIGNATORY_VERSION>`` is the version of Signatory you would like to download (the most recent version is 1.2.7) and ``<TORCH_VERSION>`` is the version of PyTorch you are using.
Available for Python 3.7--3.9 on Linux and Windows. Requires `PyTorch `__ 1.8.0--1.11.0.
(If you need it, then previous versions of Signatory included support for older versions of Python, PyTorch, and MacOS, see `here `__.)
After installation, just ``import signatory`` inside Python.
Take care **not** to run ``pip install signatory``, as this will likely download the wrong version.
Example:
--------

For example, if you are using PyTorch 1.11.0 and want Signatory 1.2.7, then you should run:
.. code-block:: bash

    pip install signatory==1.2.7.1.11.0 --no-cache-dir --force-reinstall
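To spell out the naming scheme: the pinned requirement is just the two version numbers joined by a dot (a sketch; in practice the PyTorch version would come from ``torch.__version__``):

.. code-block:: python

    # Sketch of the pin format: "<Signatory version>.<PyTorch version>".
    # In practice torch_version would come from torch.__version__ (note that
    # local build suffixes such as "+cu113" would need stripping first).
    signatory_version = "1.2.7"
    torch_version = "1.11.0"
    requirement = f"signatory=={signatory_version}.{torch_version}"
    print(requirement)  # signatory==1.2.7.1.11.0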
Why you need to specify all of this:
------------------------------------

Yes, this looks a bit odd. This is needed to work around `limitations of PyTorch `__ and `pip `__.
The ``--no-cache-dir --force-reinstall`` flags are because ``pip`` doesn't expect to need to care about versions quite as much as this, so it will sometimes erroneously use inappropriate caches if not told otherwise.
Installation from source is also possible; please consult the `documentation `__. This also includes information on how to run the tests and benchmarks.
If you have any problems with installation then check the `FAQ `__. If that doesn't help then feel free to `open an issue `__.
Documentation
#############
The documentation is available `here `__.

Example
#######

Usage is straightforward. As a simple example,

.. code-block:: python
    import signatory
    import torch

    batch, stream, channels = 1, 10, 2
    depth = 4
    path = torch.rand(batch, stream, channels)
    signature = signatory.signature(path, depth)
    # signature is a PyTorch tensor

For further examples, see the `documentation `__.
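The flat length of the returned tensor can be predicted by hand: a depth-``d`` signature of a ``c``-channel path stacks the tensors of levels 1 through ``d``, giving ``c + c^2 + ... + c^d`` scalar terms. A small standalone helper reproducing that count (the name ``num_signature_terms`` is ours, not part of Signatory's API):

.. code-block:: python

    def num_signature_terms(channels: int, depth: int) -> int:
        # Levels 1..depth contribute channels, channels**2, ..., channels**depth
        # scalar terms respectively.
        return sum(channels ** k for k in range(1, depth + 1))

    # For the example above (channels=2, depth=4): 2 + 4 + 8 + 16 = 30 terms.
    print(num_signature_terms(2, 4))  # 30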
Citation
########
If you found this library useful in your research, please consider citing `the paper `__.

.. code-block:: bibtex
    @inproceedings{kidger2021signatory,
        title={{S}ignatory: differentiable computations of the signature and logsignature transforms, on both {CPU} and {GPU}},
        author={Kidger, Patrick and Lyons, Terry},
        booktitle={International Conference on Learning Representations},
        year={2021},
        note={\url{https://github.com/patrick-kidger/signatory}}
    }