{"id":21315290,"url":"https://github.com/futurecomputing4ai/hadamard-derived-linear-binding","last_synced_at":"2025-07-04T01:34:58.334Z","repository":{"id":261066355,"uuid":"872565807","full_name":"FutureComputing4AI/Hadamard-derived-Linear-Binding","owner":"FutureComputing4AI","description":"A Walsh Hadamard Derived Linear Vector Symbolic Architecture :fire:","archived":false,"fork":false,"pushed_at":"2024-11-04T13:48:03.000Z","size":62,"stargazers_count":3,"open_issues_count":0,"forks_count":0,"subscribers_count":1,"default_branch":"main","last_synced_at":"2025-01-22T10:29:14.104Z","etag":null,"topics":["fourier-transform","hadamard","hadamard-derived-linear-binding","hadamard-transforms","hlb","holographic-reduced-representations","hrr","linear-binding","vector-symbolic-architecture"],"latest_commit_sha":null,"homepage":"https://arxiv.org/abs/2410.22669","language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":null,"status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/FutureComputing4AI.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":null,"code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null}},"created_at":"2024-10-14T16:55:20.000Z","updated_at":"2025-01-18T11:52:35.000Z","dependencies_parsed_at":"2024-11-04T14:38:55.356Z","dependency_job_id":"2fce8a90-f00b-4505-86b1-2143464a26d3","html_url":"https://github.com/FutureComputing4AI/Hadamard-derived-Linear-Binding","commit_stats":null,"previous_names":["futurecomputing4ai/hadamard-derived-linear-binding"],"tags_count":0,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/FutureComputing4AI%2FHadamard-derived-Linear-Binding","tags_url":"https
://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/FutureComputing4AI%2FHadamard-derived-Linear-Binding/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/FutureComputing4AI%2FHadamard-derived-Linear-Binding/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/FutureComputing4AI%2FHadamard-derived-Linear-Binding/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/FutureComputing4AI","download_url":"https://codeload.github.com/FutureComputing4AI/Hadamard-derived-Linear-Binding/tar.gz/refs/heads/main","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":243791379,"owners_count":20348453,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["fourier-transform","hadamard","hadamard-derived-linear-binding","hadamard-transforms","hlb","holographic-reduced-representations","hrr","linear-binding","vector-symbolic-architecture"],"created_at":"2024-11-21T18:18:51.897Z","updated_at":"2025-03-15T21:25:46.061Z","avatar_url":"https://github.com/FutureComputing4AI.png","language":"Python","readme":"\u003ch2 align=\"center\"\u003eHadamard-derived linear Binding (HLB)\u003c/h2\u003e\n\n\u003cp align=\"justify\"\u003e\nVector Symbolic Architectures (VSAs) are one approach to developing Neuro-symbolic AI, where two vectors in $\\mathbb{R}^d$ are 'bound' together to produce a new vector in the same space. 
VSAs support the commutativity and associativity of this binding operation, along with an inverse operation, allowing one to construct symbolic-style manipulations over real-valued vectors. Most VSAs were developed before deep learning and automatic differentiation became popular and instead focused on efficacy in hand-designed systems. In this work, we introduce the Hadamard-derived linear Binding (HLB), which is designed for favorable computational efficiency, efficacy in classic VSA tasks, and strong performance in differentiable systems.\n\u003c/p\u003e\n\n### Requirements\n\nCSPS\n\n```properties\nconda create --name csps python=3.9 -y \u0026\u0026 conda activate csps\n```\n\n- [PyTorch](https://pytorch.org/get-started/locally/) v1.13.1+cu116\n\nXML\n\n```properties\nconda create --name xml python=3.9 -y \u0026\u0026 conda activate xml\n```\n\n- [PyTorch](https://pytorch.org/get-started/locally/) v2.0.1+cu118\n- [Pyxclib](https://github.com/kunaldahiya/pyxclib)\n- ```pip install cython==3.0.10```\n- ```pip install tabulate```\n\nClassical VSA Tasks\n\n- [Torchhd](https://torchhd.readthedocs.io/en/stable/) ```pip install torch-hd```\n\n### Datasets\n\n* CSPS\n    - MNIST, SVHN, CIFAR10, and CIFAR100 are the standard datasets that come with the PyTorch library. MiniImageNet can be\n      downloaded from [Kaggle](https://www.kaggle.com/datasets/arjunashok33/miniimagenet).\n* XML\n    - For the XML experiments, pre-processed features are used. All the datasets can be downloaded from the\n      benchmark [website](http://manikvarma.org/downloads/XC/XMLRepository.html).\n\n### Code\n\nThe code is organized in two folders. CSPS code is in the ```CSPS Exp/``` folder and XML code is in the ```XML Exp/```\nfolder. In both, subfolders are named after the datasets. For example, the network and training\nfiles for the CIFAR10 dataset are in the ```cifar10/``` subfolder, and so on. 
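As a minimal, library-free sketch of the bind/unbind pattern that VSA experiments build on (this uses the simple MAP architecture, where binding is an elementwise product of bipolar vectors; it illustrates the generic VSA interface only, not the HLB operator itself):

```python
import numpy as np

rng = np.random.default_rng(0)
d = 10_000

# Two random bipolar symbols in {-1, +1}^d (MAP-style vectors)
key = rng.choice([-1.0, 1.0], size=d)
value = rng.choice([-1.0, 1.0], size=d)

bound = key * value      # binding: elementwise product, stays in R^d
recovered = bound * key  # MAP binding is self-inverse, since key * key == 1

assert np.array_equal(recovered, value)                # exact recovery
assert np.array_equal(key * value, value * key)        # binding commutes
```

HLB replaces this elementwise product with a Walsh-Hadamard-derived linear operation; see the paper for the exact definition.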
We have compared the proposed HLB method\nwith the HRR, VTB, and MAP vector symbolic architectures. Results for HRR are taken from previous papers. Our training\nfiles for the VTB, MAP, and HLB methods are named $train_{method name}.py$.\n\nSimilarly, the ```XML Exp/``` folder contains subfolders for each dataset containing training files. Once again, the\ntraining files are named $train_{method name}.py$. Two types of data loaders are used. ```dataset_fast.py``` is a fast\ndataloader for smaller datasets (Bibtex, Mediamill, Delicious). ```dataset_sparse.py``` is a dataloader for loading\nlarger data files. Code for the classical VSA tasks is in the ```Classical VSA Tasks/``` folder.\n\n### Citations\n\u003cimg src=\"https://github.com/user-attachments/assets/81faccca-ab19-45ee-b33d-8782205f7704\" width=200\u003e\n\n[![Paper](https://img.shields.io/badge/NeurIPS-2024-7a09d6.svg?longCache=true\u0026style=flat)](https://neurips.cc/virtual/2024/poster/93583)\n[![Paper](https://img.shields.io/badge/paper-ArXiv-ff0a0a.svg?longCache=true\u0026style=flat)](https://arxiv.org/abs/2410.22669)\n\nFor more information about the proposed method and experiments, please see the [paper](https://arxiv.org/abs/2410.22669). 
If you use this work or find it useful, please cite the paper as:\n\n```bibtex\n@article{alam2024walsh,\n  title={A Walsh Hadamard Derived Linear Vector Symbolic Architecture},\n  author={Alam, Mohammad Mahmudul and Oberle, Alexander and Raff, Edward and Biderman, Stella and Oates, Tim and Holt, James},\n  journal={arXiv preprint arXiv:2410.22669},\n  year={2024}\n}\n```\n","funding_links":[],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Ffuturecomputing4ai%2Fhadamard-derived-linear-binding","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Ffuturecomputing4ai%2Fhadamard-derived-linear-binding","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Ffuturecomputing4ai%2Fhadamard-derived-linear-binding/lists"}