# Hugging Face Transformers Running in the Browser

[![Azure Static Web Apps CI/CD](https://github.com/praeclarum/transformers-js/actions/workflows/azure-static-web-apps-gentle-desert-0ddc8ce10.yml/badge.svg)](https://github.com/praeclarum/transformers-js/actions/workflows/azure-static-web-apps-gentle-desert-0ddc8ce10.yml)

This library lets you run Hugging Face transformer models directly in the browser.
It does so by executing the models with the
[ONNX Runtime JavaScript
API](https://github.com/microsoft/onnxruntime/tree/main/js)
and by implementing its own JavaScript-only tokenization library.

At the moment it is compatible with Google's T5 models, but it was designed to be extended.
I hope to support GPT-2, RoBERTa, and InCoder in the future.


## Live Demo

[https://transformers-js.praeclarum.org](https://transformers-js.praeclarum.org)

This demo is a static website hosted on [Azure Static Web Apps](https://azure.microsoft.com/en-us/services/app-service/static/).
No code is executed on the server; instead, the neural network is downloaded and executed in the browser.

See the `demo` rule in the [Makefile](Makefile) to see how the demo is built.


## Usage

This example shows how to use the library to load the T5 neural network and translate English to French.

```js
// Load the tokenizer and model.
const tokenizer = await AutoTokenizer.fromPretrained("t5-small", "/models");
const model = await AutoModelForSeq2SeqLM.fromPretrained("t5-small", "/models");

// Translate "Hello, world!"
const english = "Hello, world!";
const inputTokenIds = tokenizer.encode("translate English to French: " + english);
const outputTokenIds = await model.generate(inputTokenIds, { maxLength: 50, topK: 10 });
const french = tokenizer.decode(outputTokenIds, true);
console.log(french); // "Bonjour monde!"
```

To run this example, you first need to convert the model to ONNX format using the [Model Converter](#model-converter):

```bash
python3 tools/convert_model.py t5-small models
```


## Library

The library contains several components:

1. [Tokenizers](#tokenizers) to load and execute pretrained tokenizers from their Hugging Face JSON representation.
2. [Transformers](#transformers) to load and execute pretrained models from their ONNX representation.
3. [Model Converter](#model-converter) to convert Hugging Face models to ONNX so they can be served by your static web server.


### Tokenizers

[tokenizers.js](src/tokenizers.js)


### Transformers

[transformers.js](src/transformers.js)


#### Models

Currently only the *T5* network is supported.


#### Sampling

The neural network outputs a logit (the logarithm of the probability) for each token in the vocabulary.
To produce an output token, a sample has to be drawn from that distribution.
The following algorithms are implemented:

* *Greedy*: take the single token with the highest probability.
* *Top-k*: sample from the k tokens with the highest probability.


### Model Converter

The ONNX Runtime for the Web is used to run models in the browser, so models must first be converted to ONNX format.

You can run the conversion from the command line:

```bash
python3 tools/convert_model.py <modelid> <outputdir> <quantize> <testinput>
```

For example:

```bash
python3 tools/convert_model.py praeclarum/cuneiform ./models true "Translate Akkadian to English: lugal"
```

Or you can run it from Python:

```python
from convert_model import t5_to_onnx

onnx_model = t5_to_onnx("t5-small", output_dir="./models", quantized=True)
```

**Developer Note:** The model conversion script is a thin wrapper over the amazing
[fastT5](https://github.com/Ki6an/fastT5) library by @Ki6an.
The wrapper exists because I hope to support more model types in the future.
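The two sampling strategies described above can be sketched as follows. This is a minimal, hypothetical illustration (the function names `sampleTopK` and `greedy` are not part of the library's API), assuming `logits` is a plain array of per-token log-probabilities:

```javascript
// Hypothetical sketch of top-k sampling, not the library's actual implementation.
// Keep the k tokens with the highest logits, softmax over them, and draw one.
function sampleTopK(logits, k) {
  // Pair each logit with its token id and keep the k highest.
  const top = logits
    .map((logit, tokenId) => ({ logit, tokenId }))
    .sort((a, b) => b.logit - a.logit)
    .slice(0, k);
  // Softmax over the surviving logits (shifted by the max for stability).
  const maxLogit = top[0].logit;
  const weights = top.map(({ logit }) => Math.exp(logit - maxLogit));
  const total = weights.reduce((sum, w) => sum + w, 0);
  // Draw one token proportionally to its probability mass.
  let r = Math.random() * total;
  for (const { tokenId } of top.map((t, i) => ({ tokenId: t.tokenId, w: weights[i] }))) {
    // Intentionally left explicit below instead; see loop that follows.
  }
  for (let i = 0; i < top.length; i++) {
    r -= weights[i];
    if (r <= 0) return top[i].tokenId;
  }
  return top[top.length - 1].tokenId;
}

// Greedy decoding is just the k = 1 special case.
const greedy = (logits) => sampleTopK(logits, 1);
```

With k = 1 the draw is deterministic, which is why greedy decoding always picks the highest-probability token; larger k trades determinism for more varied output.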