{"id":26045529,"url":"https://github.com/NanboLi/FACTS","last_synced_at":"2025-03-07T20:01:10.328Z","repository":{"id":259859496,"uuid":"879655254","full_name":"NanboLi/FACTS","owner":"NanboLi","description":"[ICLR 2025] Implementation of \"FACTS: A Factored State-Space Framework For World Modelling\"","archived":false,"fork":false,"pushed_at":"2025-03-06T09:31:04.000Z","size":254,"stargazers_count":8,"open_issues_count":0,"forks_count":0,"subscribers_count":3,"default_branch":"main","last_synced_at":"2025-03-06T10:26:49.923Z","etag":null,"topics":["artificial-intelligence","deep-learning-architecture","machine-learning","neural-networks","world-modeling"],"latest_commit_sha":null,"homepage":"https://arxiv.org/abs/2410.20922","language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"mit","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/NanboLi.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null}},"created_at":"2024-10-28T10:10:31.000Z","updated_at":"2025-03-06T09:31:07.000Z","dependencies_parsed_at":"2024-10-28T13:35:14.447Z","dependency_job_id":"7a86aa1b-1ee6-4de2-b4cd-b798a1f0e301","html_url":"https://github.com/NanboLi/FACTS","commit_stats":null,"previous_names":["nanboli/facts"],"tags_count":0,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/NanboLi%2FFACTS","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/NanboLi%2FFACTS/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/NanboLi%2FFACTS/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositori
es/NanboLi%2FFACTS/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/NanboLi","download_url":"https://codeload.github.com/NanboLi/FACTS/tar.gz/refs/heads/main","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":242454561,"owners_count":20130807,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["artificial-intelligence","deep-learning-architecture","machine-learning","neural-networks","world-modeling"],"created_at":"2025-03-07T20:00:42.123Z","updated_at":"2025-03-07T20:01:10.314Z","avatar_url":"https://github.com/NanboLi.png","language":"Python","readme":"# FACTS\nPyTorch implementation of \"[FACTS: A Factored State-Space Framework For World Modelling](https://arxiv.org/abs/2410.20922)\" (accepted at **ICLR 2025**)\n\n![](assets/FACTS.jpg?raw=true)\n\n## Installation:\n1. Install the dependencies, especially [pytorch](https://pytorch.org/):\n   ```\n        'pytorch\u003e=2.1.0'\n        'einops\u003e=0.8.0'\n   ```\n    You may find [Conda](https://docs.conda.io/projects/conda/en/stable/user-guide/getting-started.html) a useful tool for managing your virtual environment!\n2. In a terminal, ```cd``` to ```FACTS/``` and run ```pip install -e .```\n3. (Optional) Test your installation:\\\n    Under ```FACTS/```, run\n    ```\n        . demos/scripts/installation_test.sh\n    ```\n    If you see `Good to go!`, you are good to go!\n\n## Usage:\nFor now, we provide three examples to illustrate usage; more details and [DEMOS](#demos) will be released later. Stay tuned! Until then:
\n* Example 1 (very basic):\n```\n    import torch\n    from facts_ssm import FACTS\n\n    facts = FACTS(\n        in_features=32,\n        in_factors=128,  # M\n        num_factors=8,  # K\n        slot_size=32,  # D\n    ).to('cuda')\n\n    X = torch.randn(4, 30, 128, 32).to('cuda')  # [batch, seq_len, M, D]\n    y, z = facts(X)   # [batch, seq_len, K, D], [batch, seq_len, K, D]\n    print(f\"Output y:  {y.size()}\")\n    print(f\"Output z:  {z.size()}\")\n```\n\n* Example 2 (flexible customisation):\n```\n    import torch\n    from facts_ssm import FACTS\n\n    facts = FACTS(\n        in_features=32,\n        in_factors=128,  # M\n        num_factors=128,  # K\n        slot_size=32,  # D\n        num_heads=4,  # multi-head FACTS\n        dropout=0.1,  # dropout\n        C_rank=32,  # set to D to use the C proj in SSMs\n        router='sfmx_attn',  # router customisation\n        init_method='learnable',  # choose an RNN memory init method\n        slim_mode=True,  # turn on to save ~25% of the params\n        residual=True  # supported only when M == K\n    ).to('cuda')\n\n    X = torch.randn(4, 30, 128, 32).to('cuda')  # [batch, seq_len, M, D]\n    y, z = facts(X)   # [batch, seq_len, K, D], [batch, seq_len, K, D]\n    print(f\"Output y:  {y.size()}\")\n    print(f\"Output z:  {z.size()}\")\n```\n\n\n* Example 3 (semi-parallel RNNs, i.e. chunking):\n```\n    import torch\n    from facts_ssm import FACTS\n\n    facts = FACTS(\n        in_features=32,\n        in_factors=128,  # M\n        num_factors=128,  # K\n        slot_size=32,  # D\n        num_heads=4,\n        dropout=0.1,\n        C_rank=32,  # set to D to enable the output projection 
C\n        fast_mode=False,  # disable the full-length parallel scan\n        chunk_size=16  # parallel within a chunk, sequential across chunks\n    ).to('cuda')\n\n    X = torch.randn(4, 30, 128, 32).to('cuda')  # [batch, seq_len, M, D]\n    y, z = facts(X)   # [batch, seq_len, K, D], [batch, seq_len, K, D]\n    print(f\"Output y:  {y.size()}\")\n    print(f\"Output z:  {z.size()}\")\n```\n\n## Demos:\n1. See [Multivariate Time Series Forecasting (MTSF)](./facts_ssm/demos/time_series/readme.md)\n2. Coming soon ...\n\n\n## Contact\nWe actively respond to GitHub issues concerning running the code. For further inquiries and discussions (e.g. questions about the paper), email: linanbo2008@gmail.com.\n\n\n## Citation\nIf you find this code useful, please cite it in your paper:\n```\n@article{nanbo2024facts,\n  title={FACTS: A Factored State-Space Framework For World Modelling},\n  author={Nanbo, Li and Laakom, Firas and Xu, Yucheng and Wang, Wenyi and Schmidhuber, J{\\\"u}rgen},\n  journal={arXiv preprint arXiv:2410.20922},\n  year={2024}\n}\n```\n","funding_links":[],"categories":["ICLR 2025"],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2FNanboLi%2FFACTS","html_url":"https://awesome.ecosyste.ms/projects/github.com%2FNanboLi%2FFACTS","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2FNanboLi%2FFACTS/lists"}