tiktoken-rs
A Rust library for tokenizing text with OpenAI models using tiktoken: a ready-made tokenizer for working with GPT models.
266 stars
5 watching
49 forks
Language: Rust
last commit: 12 months ago
Linked from 1 awesome list
Tags: bpe, openai, rust, tokenizer
Related projects:
| Repository | Description | Stars |
|---|---|---|
| | A Ruby-based library for splitting written text into tokens for natural language processing tasks | 46 |
| | A tool for tokenizing raw text into words and sentences in multiple languages, including Hungarian | 4 |
| | A Ruby library that tokenizes text into sentences using a Bayesian statistical model | 80 |
| | A fast and simple tokenizer for multiple languages | 28 |
| | A Swift library for tokenizing strings with customizable matching behavior | 689 |
| | A gem for extracting words from text with customizable tokenization rules | 31 |
| | A tool for creating customizable tokenization rules for natural languages | 22 |
| | Rust bindings for a cross-platform GUI library | 1,646 |
| | A multilingual tokenizer that splits strings into tokens, handling various language and formatting nuances | 90 |
| | A tokenizer for natural language text that separates words from punctuation and supports basic preprocessing steps such as case changing | 66 |
| | A Ruby port of a Japanese text tokenization algorithm | 21 |
| | A C++ tokenizer for Hungarian text | 14 |
| | A tokenizer for segmenting words into morphological components | 27 |
| | A Ruby library that tokenizes input and provides various statistical measures about the tokens | 159 |
| | A Thai word tokenization library using a deep neural network | 421 |