{"id":13472554,"url":"https://github.com/openai/transformer-debugger","last_synced_at":"2025-05-14T18:00:27.113Z","repository":{"id":227175353,"uuid":"770650220","full_name":"openai/transformer-debugger","owner":"openai","description":null,"archived":false,"fork":false,"pushed_at":"2024-06-04T00:21:06.000Z","size":1023,"stargazers_count":4075,"open_issues_count":9,"forks_count":244,"subscribers_count":26,"default_branch":"main","last_synced_at":"2025-04-10T04:53:38.434Z","etag":null,"topics":[],"latest_commit_sha":null,"homepage":null,"language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"mit","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/openai.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null}},"created_at":"2024-03-11T23:06:25.000Z","updated_at":"2025-04-09T14:48:34.000Z","dependencies_parsed_at":"2024-09-21T10:30:32.800Z","dependency_job_id":"edd59908-b9b4-45cd-b3ce-530e6b8035f7","html_url":"https://github.com/openai/transformer-debugger","commit_stats":null,"previous_names":["openai/transformer-debugger"],"tags_count":0,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/openai%2Ftransformer-debugger","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/openai%2Ftransformer-debugger/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/openai%2Ftransformer-debugger/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/openai%2Ftransformer-debugger/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/openai","download_url":"https://codeload.github.com/openai/transformer-debugger/tar.gz/refs/heads/main","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":254198452,"owners_count":22030964,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":[],"created_at":"2024-07-31T16:00:55.680Z","updated_at":"2025-05-14T18:00:27.042Z","avatar_url":"https://github.com/openai.png","language":"Python","funding_links":[],"categories":["Python","others","Projects","Explainability and Fairness","Tools","Mechanistic interpretability libraries"],"sub_categories":["👩🏽‍💻 Develop Assistant","Interpretability/Explicability"],"readme":"# Transformer Debugger\n\nTransformer Debugger (TDB) is a tool developed by OpenAI's [Superalignment\nteam](https://openai.com/blog/introducing-superalignment) with the goal of\nsupporting investigations into specific behaviors of small language models. 

TDB enables rapid exploration before needing to write code, with the ability to intervene in the
forward pass and see how it affects a particular behavior. It can be used to answer questions like,
"Why does the model output token A instead of token B for this prompt?" or "Why does attention head
H attend to token T for this prompt?" It does so by identifying specific components (neurons,
attention heads, autoencoder latents) that contribute to the behavior, showing automatically
generated explanations of what causes those components to activate most strongly, and tracing
connections between components to help discover circuits.
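
To make "intervene in the forward pass" concrete, here is a minimal sketch of one such
intervention: ablating a single MLP neuron and re-checking the model's next-token prediction. It
uses the Hugging Face `transformers` GPT-2 purely as a stand-in (TDB itself drives the inference
library in this repo), and the layer and neuron indices are arbitrary:

```python
# Illustrative ablation of one MLP neuron via a forward hook; not the repo's API.
import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2").eval()

LAYER, NEURON = 5, 123  # hypothetical component under investigation

def zero_neuron(module, inputs, output):
    # Zero one MLP pre-activation; GELU(0) = 0, so the neuron is silenced.
    output[..., NEURON] = 0.0
    return output

handle = model.transformer.h[LAYER].mlp.c_fc.register_forward_hook(zero_neuron)

prompt = "When Mary and John went to the store, John gave a drink to"
input_ids = tokenizer(prompt, return_tensors="pt").input_ids
with torch.no_grad():
    logits = model(input_ids).logits[0, -1]
handle.remove()

# Compare against an un-hooked run to see the neuron's effect on this behavior.
print(tokenizer.decode(logits.argmax().item()))
```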

These videos give an overview of TDB and show how it can be used to investigate [indirect object
identification in GPT-2 small](https://arxiv.org/abs/2211.00593):

- [Introduction](https://www.loom.com/share/721244075f12439496db5d53439d2f84?sid=8445200e-c49e-4028-8b8e-3ea8d361dec0)
- [Neuron viewer pages](https://www.loom.com/share/21b601b8494b40c49b8dc7bfd1dc6829?sid=ee23c00a-9ede-4249-b9d7-c2ba15993556)
- [Example: Investigating name mover heads, part 1](https://www.loom.com/share/3478057cec484a1b85471585fef10811?sid=b9c3be4b-7117-405a-8d31-0f9e541dcfb6)
- [Example: Investigating name mover heads, part 2](https://www.loom.com/share/6bd8c6bde84b42a98f9a26a969d4a3ad?sid=4a09ac29-58a2-433e-b55d-762414d9a7fa)

## What's in the release?

- [Neuron viewer](neuron_viewer/README.md): A React app that hosts TDB as well as pages with information about individual model components (MLP neurons, attention heads and autoencoder latents for both).
- [Activation server](neuron_explainer/activation_server/README.md): A backend server that performs inference on a subject model to provide data for TDB. It also reads and serves data from public Azure buckets.
- [Models](neuron_explainer/models/README.md): A simple inference library for GPT-2 models and their autoencoders, with hooks to grab activations.
- [Collated activation datasets](datasets.md): Top-activating dataset examples for MLP neurons, attention heads and autoencoder latents.

## Setup

Follow these steps to install the repo. You'll first need python/pip, as well as node/npm.

Though optional, we recommend you use a virtual environment or equivalent:

```sh
# If you're already in a venv, deactivate it.
deactivate
# Create a new venv.
python -m venv ~/.virtualenvs/transformer-debugger
# Activate the new venv.
source ~/.virtualenvs/transformer-debugger/bin/activate
```

Once your environment is set up, clone the repo and install its dependencies:

```sh
git clone git@github.com:openai/transformer-debugger.git
cd transformer-debugger

# Install neuron_explainer.
pip install -e .

# Set up the pre-commit hooks.
pre-commit install

# Install neuron_viewer.
cd neuron_viewer
npm install
cd ..
```

To run the TDB app, you'll then need to follow the instructions to set up the [activation server backend](neuron_explainer/activation_server/README.md) and [neuron viewer frontend](neuron_viewer/README.md).

## Making changes

To validate changes:

- Run `pytest`
- Run `mypy --config=mypy.ini .`
- Run the activation server and neuron viewer and confirm that basic functionality like TDB and
  neuron viewer pages is still working

## Links

- [Terminology](terminology.md)

## How to cite

Please cite as:

```
Mossing, et al., “Transformer Debugger”, GitHub, 2024.
```

BibTeX citation:

```
@misc{mossing2024tdb,
  title={Transformer Debugger},
  author={Mossing, Dan and Bills, Steven and Tillman, Henk and Dupré la Tour, Tom and Cammarata, Nick and Gao, Leo and Achiam, Joshua and Yeh, Catherine and Leike, Jan and Wu, Jeff and Saunders, William},
  year={2024},
  publisher={GitHub},
  howpublished={\url{https://github.com/openai/transformer-debugger}},
}
```