{"id":17242583,"url":"https://github.com/robinweser/tokenize-sync","last_synced_at":"2025-08-08T08:33:10.531Z","repository":{"id":57377757,"uuid":"94636631","full_name":"robinweser/tokenize-sync","owner":"robinweser","description":"Simple synchronous string tokenizer using Regex","archived":false,"fork":false,"pushed_at":"2017-11-13T21:32:45.000Z","size":36,"stargazers_count":1,"open_issues_count":0,"forks_count":0,"subscribers_count":1,"default_branch":"master","last_synced_at":"2025-06-21T22:35:42.313Z","etag":null,"topics":["compiler","synchronous","tokenization","tokenizer","tokenizing-parser"],"latest_commit_sha":null,"homepage":null,"language":"JavaScript","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"mit","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/robinweser.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null}},"created_at":"2017-06-17T16:45:02.000Z","updated_at":"2021-09-30T21:11:16.000Z","dependencies_parsed_at":"2022-09-19T02:51:22.576Z","dependency_job_id":null,"html_url":"https://github.com/robinweser/tokenize-sync","commit_stats":null,"previous_names":["rofrischmann/tokenize-sync"],"tags_count":0,"template":false,"template_full_name":null,"purl":"pkg:github/robinweser/tokenize-sync","repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/robinweser%2Ftokenize-sync","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/robinweser%2Ftokenize-sync/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/robinweser%2Ftokenize-sync/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/robinweser%2Ftokenize-sync/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/
robinweser","download_url":"https://codeload.github.com/robinweser/tokenize-sync/tar.gz/refs/heads/master","sbom_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/robinweser%2Ftokenize-sync/sbom","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":261760885,"owners_count":23205839,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["compiler","synchronous","tokenization","tokenizer","tokenizing-parser"],"created_at":"2024-10-15T06:13:35.247Z","updated_at":"2025-06-24T21:39:07.726Z","avatar_url":"https://github.com/robinweser.png","language":"JavaScript","readme":"# Synchronous Tokenizer\n\nA simple synchronous string tokenizer using Regex.\n\n\u003cimg alt=\"TravisCI\" src=\"https://travis-ci.org/rofrischmann/tokenize-sync.svg?branch=master\"\u003e \u003ca href=\"https://codeclimate.com/github/rofrischmann/tokenize-sync/coverage\"\u003e\u003cimg alt=\"Test Coverage\" src=\"https://codeclimate.com/github/rofrischmann/tokenize-sync/badges/coverage.svg\"\u003e\u003c/a\u003e \u003cimg alt=\"npm downloads\" src=\"https://img.shields.io/npm/dm/tokenize-sync.svg\"\u003e \u003cimg alt=\"gzipped size\" src=\"https://img.shields.io/badge/gzipped-0.5kb-brightgreen.svg\"\u003e \u003cimg alt=\"npm version\" src=\"https://badge.fury.io/js/tokenize-sync.svg\"\u003e\n\n## Installation\n```sh\nyarn add tokenize-sync\n```\n\n## Usage\n\nThe package ships a single `tokenize` function that takes a (*string*) input and an (*Object*) ruleMap that maps (*string*) token names to regular expressions.\n\n```javascript\nimport tokenize from 'tokenize-sync'\n\nconst 
ruleMap = {\n  identifier: /^[a-z-]+$/i,\n  number: /^\\d+$/,\n  whitespace: /^\\s+$/\n}\n\nconst input = 'test 12  foobar3'\n\nconst tokens = tokenize(input, ruleMap)\n\ntokens === [{\n  type: 'identifier',\n  value: 'test',\n  start: 0,\n  end: 4\n}, {\n  type: 'whitespace',\n  value: ' ',\n  start: 4,\n  end: 5\n}, {\n  type: 'number',\n  value: '12',\n  start: 5,\n  end: 7\n}, {\n  type: 'whitespace',\n  value: '  ',\n  start: 7,\n  end: 9\n}, {\n  type: 'identifier',\n  value: 'foobar',\n  start: 9,\n  end: 15\n}, {\n  type: 'number',\n  value: '3',\n  start: 15,\n  end: 16\n}]\n```\n\n## License\ntokenize-sync is licensed under the [MIT License](http://opensource.org/licenses/MIT).\u003cbr\u003e\nDocumentation is licensed under the [Creative Commons License](http://creativecommons.org/licenses/by/4.0/).\u003cbr\u003e\nCreated with ♥ by [@rofrischmann](http://rofrischmann.de).\n","funding_links":[],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Frobinweser%2Ftokenize-sync","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Frobinweser%2Ftokenize-sync","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Frobinweser%2Ftokenize-sync/lists"}