Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/openpeeps/toktok
Generic tokenizer written in Nim language 👑 Powered by std/lexbase and Nim's Macros
- Host: GitHub
- URL: https://github.com/openpeeps/toktok
- Owner: openpeeps
- License: mit
- Created: 2022-02-18T06:56:04.000Z (almost 3 years ago)
- Default Branch: main
- Last Pushed: 2024-04-18T18:31:13.000Z (10 months ago)
- Last Synced: 2024-11-20T19:36:18.209Z (3 months ago)
- Topics: awesome-nim, generic-library, hacktoberfest, lex, lexer, lexer-generator, lexical, nim, nim-lang, nim-language, parser, programming-language, tokenizer, tokens
- Language: Nim
- Homepage: https://openpeeps.github.io/toktok/
- Size: 504 KB
- Stars: 31
- Watchers: 2
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
## README
Generic tokenizer written in the Nim language, powered by Nim's Macros 👑

```
nimble install toktok
```
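toktok sits on top of Nim's `std/lexbase` (as the features below note). As background, here is a minimal, self-contained sketch of a bare `lexbase` scanner — illustrative only, using none of toktok's own API; `MiniLexer` and `tokens` are hypothetical names:

```nim
# Illustrative sketch: a tiny whitespace scanner on std/lexbase,
# the buffered-reading module toktok builds on.
import std/[lexbase, streams]

type
  MiniLexer = object of BaseLexer

proc tokens(src: string): seq[string] =
  var lex: MiniLexer
  lex.open(newStringStream(src))
  var word = ""
  while true:
    let c = lex.buf[lex.bufpos]
    case c
    of lexbase.EndOfFile:
      if word.len > 0:
        result.add word
      break
    of ' ', '\t':
      if word.len > 0:
        result.add word
        word = ""
      inc lex.bufpos
    of '\n':
      if word.len > 0:
        result.add word
        word = ""
      # handleLF keeps lexbase's line/column bookkeeping in sync
      lex.bufpos = lex.handleLF(lex.bufpos)
    else:
      word.add c
      inc lex.bufpos
  lex.close()

echo tokens("const x = 1")  # @["const", "x", "=", "1"]
```

toktok's macros generate this kind of buffer-walking scaffolding for you from the token definitions you register, so you never write the `bufpos` loop by hand.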
## 😍 Key Features
- ✨ Powered by Nim's Macros
- 🪄 Based on `std/lexbase` / zero regular expressions
- Compile-time token generation using a macro-based **TokenKind** `enum` and `lexbase`
- Runtime token generation using **TokenKind** `tables` and `lexbase`
- Open Source | `MIT`

> [!NOTE]
> This is a generic lexer based on `std/streams`, `std/lexbase` and `std/macros`. It is meant to be used by higher-level parsers for writing any kind of tools or programs.

> [!NOTE]
> Compile with `-d:toktokdebug` to inspect the generated code.

## Quick Example
```nim
# Register your custom handlers
handlers:
  proc handleImport(lex: var Lexer, kind: TokenKind) =
    # tokenize `import x, y, z`
    lex.kind = kind

# Register your tokens
registerTokens defaultSettings:
  `const` = "const"
  `echo` = "hello"
  asgn = '=':   # `=` is tkAsgn
    eq = '='    # `==` is tkEQ
  excl = '!':
    ne = '='
  at = '@':
    `import` = tokenizer(handleImport, "import")

# Tokenizing...
var
  tok = lexer.init(sample) # `sample` is your input string
  prev: TokenTuple
  curr: TokenTuple = tok.getToken
  next: TokenTuple = tok.getToken

proc walk(tok: var Lexer) =
  prev = curr
  curr = next
  next = tok.getToken

while likely(curr.kind != tkEOF):
  if tok.hasError: break
  echo curr # use `getToken` consumer to get token by token
  walk tok
```

## TODO
- [ ] Runtime Token generation using tables/critbits

### ❤ Contributions & Support
- 🐛 Found a bug? [Create a new Issue](https://github.com/openpeeps/toktok/issues)
- 👋 Wanna help? [Fork it!](https://github.com/openpeeps/toktok/fork)
- 😎 [Get €20 in cloud credits from Hetzner](https://hetzner.cloud/?ref=Hm0mYGM9NxZ4)
- 🥰 [Donate to OpenPeeps via PayPal address](https://www.paypal.com/donate/?hosted_button_id=RJK3ZTDWPL55C)

### 🎩 License
`MIT` license. [Made by Humans from OpenPeeps](https://github.com/openpeeps).
Copyright © 2023 OpenPeeps & Contributors — All rights reserved.