{"id":24632809,"url":"https://github.com/codewithdark-git/titans-transformer","last_synced_at":"2026-04-11T02:35:01.912Z","repository":{"id":272765434,"uuid":"917687551","full_name":"codewithdark-git/titans-transformer","owner":"codewithdark-git","description":"This repository contains an experimental implementation of the Titans Transformer architecture for sequence modeling tasks. The code is a personal exploration and may include errors or inefficiencies as I am currently in the learning stage. It is inspired by the ideas presented in the original","archived":false,"fork":false,"pushed_at":"2025-01-17T09:54:19.000Z","size":24,"stargazers_count":1,"open_issues_count":0,"forks_count":0,"subscribers_count":1,"default_branch":"main","last_synced_at":"2025-01-25T08:12:57.709Z","etag":null,"topics":["deep-learning","deep-neural-networks","inference","llm","ml","neural-networks","new","nn","paper","python","research-paper","test","titans","transformer","transformers-models"],"latest_commit_sha":null,"homepage":"","language":"Jupyter 
Notebook","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":null,"status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/codewithdark-git.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":null,"code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null}},"created_at":"2025-01-16T13:13:49.000Z","updated_at":"2025-01-17T18:17:38.000Z","dependencies_parsed_at":"2025-01-16T14:41:55.983Z","dependency_job_id":"a681866c-d8a6-4425-a1c0-93da8e32541d","html_url":"https://github.com/codewithdark-git/titans-transformer","commit_stats":null,"previous_names":["codewithdark-git/titans_paper_implementation","codewithdark-git/titans-transformer"],"tags_count":0,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/codewithdark-git%2Ftitans-transformer","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/codewithdark-git%2Ftitans-transformer/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/codewithdark-git%2Ftitans-transformer/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/codewithdark-git%2Ftitans-transformer/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/codewithdark-git","download_url":"https://codeload.github.com/codewithdark-git/titans-transformer/tar.gz/refs/heads/main","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":244566934,"owners_count":20473451,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https:/
/repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["deep-learning","deep-neural-networks","inference","llm","ml","neural-networks","new","nn","paper","python","research-paper","test","titans","transformer","transformers-models"],"created_at":"2025-01-25T08:13:00.613Z","updated_at":"2025-10-30T05:03:55.404Z","avatar_url":"https://github.com/codewithdark-git.png","language":"Jupyter Notebook","readme":"# Titans Transformer Implementation\n\nThis repository contains an experimental implementation of the **Titans Transformer** architecture for sequence modeling tasks. The code is a personal exploration and may include errors or inefficiencies as I am currently in the learning stage. It is inspired by the ideas presented in the original **Titans Transformer** paper, and I highly recommend referring to the paper for accurate and detailed information.\n\n## Overview\nThe **Titans Transformer** introduces a novel architecture designed to enhance long-sequence modeling by incorporating a memory mechanism. This repository includes:\n\n1. A custom implementation of the **Titans Transformer**.\n2. Benchmarking code comparing the **Titans Transformer** with a standard Transformer.\n3. Training and evaluation scripts on the Wikitext-2 dataset.\n4. 
Visualization of benchmark results.\n\n\u003e **Note**: This repository is for educational and experimental purposes only and is not a production-ready implementation.\n\n---\n\n## Features\n\n- **Titans Transformer Architecture**: Implements memory mechanisms for improved sequence modeling.\n- **Standard Transformer**: Baseline implementation of the original Transformer for comparison.\n- **Benchmarking**: Evaluates inference time and perplexity across different sequence lengths.\n- **Training**: Customizable training loop with data preprocessing, batching, and evaluation.\n\n---\n\n## Prerequisites\n\n### Dependencies\n\nEnsure you have the following installed:\n\n- Python 3.8+\n- PyTorch\n- Datasets\n- Matplotlib\n\nYou can install the required libraries using:\n```bash\npip install torch datasets matplotlib\n```\n\n---\n\n## Usage\n\n### Clone the Repository\n```bash\ngit clone https://github.com/codewithdark-git/titans-transformer.git\ncd titans-transformer\n```\n\n### Run Training\nTo train the Titans Transformer model on the Wikitext-2 dataset, execute the training script:\n```bash\npython train_titans_transformer.py\n```\n\n### Benchmark Models\nTo compare the Titans Transformer and Standard Transformer, run:\n```bash\npython benchmark_transformers.py\n```\nThis will generate a plot of inference time and perplexity for different sequence lengths.\n\n---\n\n## Results\nThe repository includes a benchmarking script to compare:\n\n- **Inference Time**: The time taken to process a batch of sequences.\n- **Perplexity**: A measure of the model's ability to predict the next token in a sequence.\n\nThe results are visualized in a plot saved as `benchmark_results.png`.\n\n---\n\n## Disclaimer\n\nThis implementation is an educational attempt to experiment with the Titans Transformer. **It is not guaranteed to be error-free or optimized**. Please refer to the original paper for accurate and detailed information. 
I am in the learning phase, and this project is part of my journey to better understand advanced Transformer architectures.\n\nFeedback and suggestions for improvement are always welcome!\n\n---\n\n## References\n- Original Paper: [Titans: Learning to Memorize at Test Time](https://arxiv.org/abs/2501.00663)\n\n---\n\n## Contact\nFeel free to reach out if you have any questions or feedback.\n\n---\n\n### Thank You for Visiting!\n","funding_links":[],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fcodewithdark-git%2Ftitans-transformer","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fcodewithdark-git%2Ftitans-transformer","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fcodewithdark-git%2Ftitans-transformer/lists"}