{"id":13716872,"url":"https://github.com/locuslab/pytorch_fft","last_synced_at":"2025-04-05T14:04:07.694Z","repository":{"id":57457908,"uuid":"92265243","full_name":"locuslab/pytorch_fft","owner":"locuslab","description":"PyTorch wrapper for FFTs","archived":false,"fork":false,"pushed_at":"2018-10-28T18:14:53.000Z","size":115,"stargazers_count":315,"open_issues_count":14,"forks_count":47,"subscribers_count":8,"default_branch":"master","last_synced_at":"2025-03-29T13:08:58.317Z","etag":null,"topics":[],"latest_commit_sha":null,"homepage":null,"language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"apache-2.0","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/locuslab.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null}},"created_at":"2017-05-24T07:50:44.000Z","updated_at":"2024-11-23T03:40:36.000Z","dependencies_parsed_at":"2022-09-07T03:41:56.874Z","dependency_job_id":null,"html_url":"https://github.com/locuslab/pytorch_fft","commit_stats":null,"previous_names":[],"tags_count":4,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/locuslab%2Fpytorch_fft","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/locuslab%2Fpytorch_fft/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/locuslab%2Fpytorch_fft/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/locuslab%2Fpytorch_fft/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/locuslab","download_url":"https://codeload.github.com/locuslab/pytorch_fft/tar.gz/refs/heads/master","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":24
7345850,"owners_count":20924102,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":[],"created_at":"2024-08-03T00:01:15.247Z","updated_at":"2025-04-05T14:04:07.671Z","avatar_url":"https://github.com/locuslab.png","language":"Python","readme":"# A PyTorch wrapper for CUDA FFTs [![License][license-image]][license]\n\n[license-image]: http://img.shields.io/badge/license-Apache--2-blue.svg?style=flat\n[license]: LICENSE\n\n*A package that provides a PyTorch C extension for performing batches of 2D CuFFT \ntransformations, by [Eric Wong](https://github.com/riceric22)*\n\nUpdate: FFT functionality is now officially in PyTorch 0.4, see the \ndocumentation [here](https://pytorch.org/docs/0.4.0/torch.html?highlight=fft#torch.fft). \nThis repository is only useful for older versions of PyTorch, and will no longer \nbe updated. \n\n## Installation\n\nThis package is on PyPI. Install with `pip install pytorch-fft`. \n\n## Usage\n\n+ From the `pytorch_fft.fft` module, you can use the following to do \nforward and backward FFT transformations (complex to complex)\n  + `fft` and `ifft` for 1D transformations\n  + `fft2` and `ifft2` for 2D transformations\n  + `fft3` and `ifft3` for 3D transformations\n+ From the same module, you can also use the following for \nreal to complex / complex to real FFT transformations\n  + `rfft` and `irfft` for 1D transformations\n  + `rfft2` and `irfft2` for 2D transformations\n  + `rfft3` and `irfft3` for 3D transformations\n+ For a `d`-D transformation, the input tensors are required to have \u003e= (d+1)\n  dimensions (n1 x ... x nk x m1 x ... 
x md) where `n1 x ... x nk` is the\n  batch of FFT transformations, and `m1 x ... x md` are the dimensions of the\n  `d`-D transformation. `d` must be a number from 1 to 3.\n+ Finally, the module contains the following helper functions you may find\nuseful:\n  + `reverse(X, group_size=1)` reverses the elements of a tensor and returns\n    the result in a new tensor. Note that PyTorch does not currently support\n    negative slicing, see this\n    [issue](https://github.com/pytorch/pytorch/issues/229). If a group size is\n    supplied, the elements will be reversed in groups of that size.\n  + `expand(X, imag=False, odd=True)` takes a tensor output of a real 2D or 3D\n    FFT and expands it with its redundant entries to match the output of a\n    complex FFT.\n+ For autograd support, use the following functions in the\n`pytorch_fft.fft.autograd` module: \n  + `Fft` and `Ifft` for 1D transformations\n  + `Fft2d` and `Ifft2d` for 2D transformations\n  + `Fft3d` and `Ifft3d` for 3D transformations\n\n\n```Python\n# Example that does a batch of three 2D transformations of size 4 by 5. 
\nimport torch\nimport pytorch_fft.fft as fft\n\nA_real, A_imag = torch.randn(3,4,5).cuda(), torch.zeros(3,4,5).cuda()\nB_real, B_imag = fft.fft2(A_real, A_imag)\nfft.ifft2(B_real, B_imag) # equals (A_real, A_imag)\n\nB_real, B_imag = fft.rfft2(A_real) # truncated version which omits\n                                   # redundant entries\n\nfft.reverse(torch.arange(0,6)) # outputs [5,4,3,2,1,0]\nfft.reverse(torch.arange(0,6), 2) # outputs [4,5,2,3,0,1]\n\nfft.expand(B_real) # is equivalent to fft.fft2(A_real, A_imag)[0]\nfft.expand(B_imag, imag=True) # is equivalent to fft.fft2(A_real, A_imag)[1]\n```\n\n\n```Python\n# Example that uses autograd for a 2D FFT:\nimport torch\nfrom torch.autograd import Variable\nimport pytorch_fft.fft.autograd as fft\n\nf = fft.Fft2d()\ninvf = fft.Ifft2d()\n\nfx, fy = (Variable(torch.arange(0,100).view((1,1,10,10)).cuda(), requires_grad=True), \n          Variable(torch.zeros(1, 1, 10, 10).cuda(), requires_grad=True))\nk1, k2 = f(fx, fy)\nz = k1.sum() + k2.sum()\nz.backward()\nprint(fx.grad, fy.grad)\n```\n\n## Notes\n+ This follows NumPy semantics and behavior, so `ifft2(fft2(x)) = x`. Note\n  that CuFFT semantics for inverse FFT only flip the sign of the transform,\n  but it is not a true inverse.\n+ Similarly, the real to complex / complex to real variants also follow NumPy\n  semantics and behavior. 
In the 1D case, this means that for an input of size\n  `N`, it returns an output of size `N//2+1` (it omits redundant entries, see\n  the [NumPy docs](https://docs.scipy.org/doc/numpy/reference/generated/numpy.fft.rfft.html))\n+ The functions in the `pytorch_fft.fft` module do not implement the PyTorch\n  autograd `Function`, and are semantically and functionally like their NumPy\n  equivalents.\n+ Autograd functionality is in the `pytorch_fft.fft.autograd` module.\n\n## Repository contents\n- pytorch_fft/src: C source code\n- pytorch_fft/fft: Python convenience wrapper\n- build.py: compilation file\n- test.py: tests against NumPy FFTs and Autograd checks\n\n## Issues and Contributions\n\nIf you have any issues or feature requests, \n[file an issue](https://github.com/locuslab/pytorch_fft/issues)\nor [send in a PR](https://github.com/locuslab/pytorch_fft/pulls). \n\n","funding_links":[],"categories":["Pytorch \u0026 related libraries｜Pytorch \u0026 相关库","Pytorch \u0026 related libraries"],"sub_categories":["Other libraries｜其他库:","Other libraries:"],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Flocuslab%2Fpytorch_fft","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Flocuslab%2Fpytorch_fft","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Flocuslab%2Fpytorch_fft/lists"}