https://github.com/danieldk/wordpieces
Split tokens into word pieces
- Host: GitHub
- URL: https://github.com/danieldk/wordpieces
- Owner: danieldk
- License: other
- Created: 2019-11-22T10:36:17.000Z (over 5 years ago)
- Default Branch: master
- Last Pushed: 2022-10-10T14:57:20.000Z (over 2 years ago)
- Last Synced: 2025-04-07T02:50:31.140Z (about 1 month ago)
- Topics: piece, rust, tokenization, word, wordpiece
- Language: Rust
- Homepage:
- Size: 33.2 KB
- Stars: 5
- Watchers: 3
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE-APACHE
README
# wordpieces
This crate provides a subword tokenizer. A subword tokenizer splits a
token into smaller pieces, so-called *word pieces*. Word pieces were
popularized by the [BERT](https://arxiv.org/abs/1810.04805) natural
language encoder, which uses them to tokenize its input.
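
The idea behind WordPiece splitting can be illustrated with a minimal sketch of the greedy longest-match-first algorithm used by BERT-style tokenizers. This is not the crate's actual API; the vocabulary, the `##` continuation-piece marker, and the function name are illustrative assumptions:

```rust
use std::collections::HashSet;

// Greedily split `token` into word pieces using longest-match-first
// lookup against `vocab`. Non-initial pieces carry a "##" prefix
// (BERT convention, assumed here). Returns None when some part of
// the token cannot be matched by any vocabulary entry.
fn wordpiece_split(token: &str, vocab: &HashSet<&str>) -> Option<Vec<String>> {
    let chars: Vec<char> = token.chars().collect();
    let mut pieces = Vec::new();
    let mut start = 0;

    while start < chars.len() {
        // Try the longest candidate first, shrinking until a piece
        // is found in the vocabulary.
        let mut end = chars.len();
        let mut found = None;
        while end > start {
            let mut piece: String = chars[start..end].iter().collect();
            if start > 0 {
                piece = format!("##{}", piece);
            }
            if vocab.contains(piece.as_str()) {
                found = Some((piece, end));
                break;
            }
            end -= 1;
        }

        match found {
            Some((piece, next_start)) => {
                pieces.push(piece);
                start = next_start;
            }
            // No vocabulary entry matches: the token is unsplittable.
            None => return None,
        }
    }

    Some(pieces)
}

fn main() {
    let vocab: HashSet<&str> = ["un", "##aff", "##able"].into_iter().collect();
    // "unaffable" -> Some(["un", "##aff", "##able"])
    println!("{:?}", wordpiece_split("unaffable", &vocab));
}
```

Longest-match-first keeps the number of pieces per token small; real tokenizers additionally map unsplittable tokens to an unknown-token marker rather than failing outright.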