<center>
<img alt="harmony" src="./docs/header.png">
<h1 align="center">OpenAI Harmony</h1>
<p align="center">OpenAI's response format for its open-weight model series <a href="https://openai.com/open-models">gpt-oss</a>
<br>
<a href="https://gpt-oss.com" target="_blank">Try gpt-oss</a> | <a href="https://cookbook.openai.com/topic/gpt-oss">Learn more</a> | <a href="https://openai.com/index/gpt-oss-model-card/">Model card</a>
</p>
<br>
</center>

The [gpt-oss models][gpt-oss] were trained on the [harmony response format][harmony-format] for defining conversation structures, generating reasoning output, and structuring function calls. If you are not using gpt-oss directly but through an API or a provider such as HuggingFace, Ollama, or vLLM, you do not need to worry about this format; your inference solution handles it for you. If you are building your own inference solution, this guide walks you through the prompt format. The format is designed to mimic the OpenAI Responses API, so if you have used that API before, it should feel familiar. Do not use gpt-oss without the harmony format; the model will not work correctly otherwise.

The format lets the model write to multiple channels: chain-of-thought reasoning, tool-call preambles, and regular responses. It also supports tool namespaces, structured outputs, and a clear instruction hierarchy. [Check out the guide][harmony-format] to learn more about the format itself.

```text
<|start|>system<|message|>You are ChatGPT, a large language model trained by OpenAI.
Knowledge cutoff: 2024-06
Current date: 2025-06-28

Reasoning: high

# Valid channels: analysis, commentary, final. Channel must be included for every message.
Calls to these tools must go to the commentary channel: 'functions'.<|end|><|start|>developer<|message|># Instructions

Always respond in riddles

# Tools

## functions

namespace functions {

// Gets the location of the user.
type get_location = () => any;

// Gets the current weather in the provided location.
type get_current_weather = (_: {
// The city and state, e.g. San Francisco, CA
location: string,
format?: "celsius" | "fahrenheit", // default: celsius
}) => any;

} // namespace functions<|end|><|start|>user<|message|>What is the weather like in SF?<|end|><|start|>assistant
```

We recommend using this library when working with models that use the [harmony response format][harmony-format]:

- **Consistent formatting** – a shared implementation for rendering _and_ parsing keeps token sequences loss-free.
- **Blazing fast** – the heavy lifting happens in Rust.
- **First-class Python support** – install with `pip`; typed stubs included; 100% test parity with the Rust suite.

## Using Harmony

### Python

[Check out the full documentation](./docs/python.md)

#### Installation

Install the package from PyPI by running:

```bash
pip install openai-harmony
# or if you are using uv
uv pip install openai-harmony
```

#### Example

```python
from openai_harmony import (
    load_harmony_encoding,
    HarmonyEncodingName,
    Role,
    Message,
    Conversation,
    DeveloperContent,
    SystemContent,
)
enc = load_harmony_encoding(HarmonyEncodingName.HARMONY_GPT_OSS)
convo = Conversation.from_messages([
    Message.from_role_and_content(
        Role.SYSTEM,
        SystemContent.new(),
    ),
    Message.from_role_and_content(
        Role.DEVELOPER,
        DeveloperContent.new().with_instructions("Talk like a pirate!")
    ),
    Message.from_role_and_content(Role.USER, "Arrr, how be you?"),
])
tokens = enc.render_conversation_for_completion(convo, Role.ASSISTANT)
print(tokens)
# Later, after the model responded …
parsed = enc.parse_messages_from_completion_tokens(tokens, role=Role.ASSISTANT)
print(parsed)
```

### Rust

[Check out the full documentation](./docs/rust.md)

#### Installation

Add the dependency to your `Cargo.toml`:

```toml
[dependencies]
openai-harmony = { git = "https://github.com/openai/harmony" }
```

#### Example

```rust
use openai_harmony::chat::{Message, Role, Conversation};
use openai_harmony::{HarmonyEncodingName, load_harmony_encoding};

fn main() -> anyhow::Result<()> {
    let enc = load_harmony_encoding(HarmonyEncodingName::HarmonyGptOss)?;
    let convo =
        Conversation::from_messages([Message::from_role_and_content(Role::User, "Hello there!")]);
    let tokens = enc.render_conversation_for_completion(&convo, Role::Assistant, None)?;
    println!("{:?}", tokens);
    Ok(())
}
```

## Contributing

The majority of the rendering and parsing is built in Rust for performance and exposed to Python
through thin [`pyo3`](https://pyo3.rs/) bindings.

```text
┌──────────────────┐      ┌───────────────────────────┐
│  Python code     │      │  Rust core (this repo)    │
│  (dataclasses,   │────► │  • chat / encoding logic  │
│   convenience)   │      │  • tokeniser (tiktoken)   │
└──────────────────┘  FFI └───────────────────────────┘
```

### Repository layout

```text
.
├── src/                  # Rust crate
│   ├── chat.rs           # High-level data-structures (Role, Message, …)
│   ├── encoding.rs       # Rendering & parsing implementation
│   ├── registry.rs       # Built-in encodings
│   ├── tests.rs          # Canonical Rust test-suite
│   └── py_module.rs      # PyO3 bindings ⇒ compiled as openai_harmony.*.so
│
├── harmony/              # Pure-Python wrapper around the binding
│   └── __init__.py       # Dataclasses + helper API mirroring chat.rs
│
├── tests/                # Python test-suite (1-to-1 port of tests.rs)
├── Cargo.toml            # Rust package manifest
├── pyproject.toml        # Python build configuration for maturin
└── README.md             # You are here 🖖
```

### Developing locally

#### Prerequisites

- Rust tool-chain (stable) – <https://rustup.rs>
- Python ≥ 3.8 + virtualenv/venv
- [`maturin`](https://github.com/PyO3/maturin) – build tool for PyO3 projects

#### 1. Clone & bootstrap

```bash
git clone https://github.com/openai/harmony.git
cd harmony
# Create & activate a virtualenv
python -m venv .venv
source .venv/bin/activate
# Install maturin and test dependencies
pip install maturin pytest mypy ruff  # tailor to your workflow
# Compile the Rust crate *and* install the Python package in editable mode
maturin develop --release
```

`maturin develop` builds _harmony_ with Cargo, produces a native extension
(`openai_harmony.<abi>.so`) and places it in your virtualenv next to the pure-Python
wrapper – similar to `pip install -e .` for pure Python projects.

#### 2. Running the test-suites

Rust:

```bash
cargo test          # runs src/tests.rs
```

Python:

```bash
pytest              # executes tests/ (mirrors the Rust suite)
```

Run both in one go to ensure parity:

```bash
pytest && cargo test
```

#### 3. Type-checking & formatting (optional)

```bash
mypy harmony        # static type analysis
ruff check .        # linting
cargo fmt --all     # Rust formatter
```

[harmony-format]: https://cookbook.openai.com/articles/openai-harmony
[gpt-oss]: https://openai.com/open-models
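For intuition, the rendering that the library performs can be sketched at the string level. This is a simplified illustration only, not the library's implementation: `openai_harmony` renders token sequences via its Rust core and handles special tokens correctly, so always use the library in practice. The helper name `render_prompt` and the `(role, content)` tuple shape are invented for this sketch.

```python
# Simplified, string-level sketch of the harmony prompt shape shown in the
# README's text example. NOT the real implementation: openai_harmony renders
# token sequences via its Rust core. Names here are invented for illustration.

def render_prompt(messages):
    """Render (role, content) pairs into harmony-style text and open an
    assistant turn for the model to complete."""
    parts = [
        f"<|start|>{role}<|message|>{content}<|end|>"
        for role, content in messages
    ]
    # The prompt ends with an unterminated assistant header so that the
    # model's next tokens form the assistant message body.
    return "".join(parts) + "<|start|>assistant"

prompt = render_prompt([
    ("system", "You are ChatGPT, a large language model trained by OpenAI."),
    ("user", "What is the weather like in SF?"),
])
print(prompt)
```

Note how the prompt deliberately ends mid-message: completion-style inference continues from the open `<|start|>assistant` header.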
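The reverse direction can be sketched the same way. Again this is a hedged, string-level illustration: the real `parse_messages_from_completion_tokens` operates on tokens and also understands channels (`analysis`, `commentary`, `final`), which this toy regex ignores. The helper `parse_completion` is invented for the sketch.

```python
import re

# Simplified sketch of parsing harmony-style text back into messages.
# The real parser in openai_harmony works on tokens and handles channel
# headers; this regex-based version is illustration only.
MESSAGE_RE = re.compile(
    r"<\|start\|>(?P<role>[^<]+?)<\|message\|>(?P<content>.*?)<\|end\|>",
    re.DOTALL,
)

def parse_completion(text):
    """Return a list of (role, content) pairs found in harmony-style text."""
    return [
        (m.group("role"), m.group("content"))
        for m in MESSAGE_RE.finditer(text)
    ]

print(parse_completion("<|start|>assistant<|message|>Arrr!<|end|>"))
# → [('assistant', 'Arrr!')]
```

Round-tripping through rendering and parsing without loss is exactly the "consistent formatting" guarantee the library makes at the token level.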
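The developer message in the example above declares tools in a TypeScript-like `namespace functions { ... }` block. A minimal sketch of producing such a block from plain Python data follows; the helper `render_tools_namespace` and its input shape are assumptions made for illustration, since the library builds this block for you from its own tool-description types.

```python
# Sketch: render a TypeScript-like "namespace functions" block resembling the
# developer message above. The helper and its input shape are invented for
# illustration; openai_harmony constructs this from its own types.

def render_tools_namespace(tools):
    """tools: list of dicts with 'name', 'description', and an optional
    'params' string holding a TypeScript-ish parameter type."""
    lines = ["namespace functions {", ""]
    for tool in tools:
        lines.append(f"// {tool['description']}")
        params = tool.get("params", "()")
        lines.append(f"type {tool['name']} = {params} => any;")
        lines.append("")
    lines.append("} // namespace functions")
    return "\n".join(lines)

print(render_tools_namespace([
    {"name": "get_location", "description": "Gets the location of the user."},
]))
```

Comments carry the human-readable descriptions, so the model sees each tool's purpose right next to its signature.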