Tactful_Tokenizer

An accurate Bayesian sentence tokenizer: a Ruby library that splits text into sentences using a Bayesian statistical model.
80 stars · 5 watching · 13 forks
 
Language: Ruby
Last commit: over 11 years ago
Linked from 1 awesome list
Tags: nlp, ruby
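To illustrate the Bayesian approach the library takes, here is a toy sketch of sentence-boundary detection in plain Ruby. This is not the gem's actual model or API; the features, abbreviation list, and log-odds weights below are illustrative assumptions only.

```ruby
# Toy Bayesian-style sentence-boundary detector (illustrative only; the
# probabilities and features are hand-set assumptions, not the gem's model).
ABBREVIATIONS = %w[mr mrs dr st etc].freeze

# Log-odds contributions of each feature toward "this period ends a sentence".
LOG_LIKELIHOOD = {
  next_word_capitalized:  Math.log(0.9 / 0.1),   # capitalized follower favors a boundary
  prev_word_abbreviation: Math.log(0.05 / 0.95)  # abbreviation before the period disfavors it
}.freeze

def sentence_boundaries(text)
  sentences = []
  current = ""
  chunks = text.split(/(?<=\.)\s+/) # candidate split points after periods
  chunks.each_with_index do |chunk, i|
    current << (current.empty? ? "" : " ") << chunk
    nxt = chunks[i + 1]
    prev_word = chunk.split.last.to_s.downcase.chomp(".")
    score = 0.0
    score += LOG_LIKELIHOOD[:next_word_capitalized]  if nxt && nxt[0] =~ /[A-Z]/
    score += LOG_LIKELIHOOD[:prev_word_abbreviation] if ABBREVIATIONS.include?(prev_word)
    # Positive combined log-odds (or end of input) => treat as a boundary.
    if nxt.nil? || score > 0
      sentences << current
      current = ""
    end
  end
  sentences
end

p sentence_boundaries("Mr. Smith went home. He slept.")
# => ["Mr. Smith went home.", "He slept."]
```

The abbreviation feature outweighs the capitalization feature at "Mr. Smith", so that period is not treated as a sentence boundary; a trained model would learn such weights from a corpus instead of hard-coding them.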
 Related projects:
| Repository | Description | Stars |
|---|---|---|
|  | A gem for extracting words from text with customizable tokenization rules | 31 |
|  | A Ruby library for splitting written text into tokens for natural language processing tasks | 46 |
|  | A Ruby library that tokenizes input and provides various statistical measures about the tokens | 159 |
|  | A multilingual tokenizer that splits strings into tokens, handling various language and formatting nuances | 90 |
|  | A Ruby port of the NLTK algorithm for detecting sentence boundaries in unstructured text | 92 |
|  | A command-line tool for splitting text into individual sentences | 441 |
|  | A Ruby port of a Japanese text tokenization algorithm | 21 |
|  | A fast, simple tokenizer for multiple languages | 28 |
|  | A simple tokenizer library for parsing and analyzing text input in various formats | 17 |
|  | A Rust library for tokenizing text with OpenAI models using tiktoken | 266 |
|  | A Prolog-based tokenization library for lexing text into common tokens | 11 |
|  | A tool for tokenizing raw text into words and sentences in multiple languages, including Hungarian | 4 |
|  | A set of high-performance tokenizers for natural language processing tasks | 96 |
|  | A tokenizer for segmenting words into morphological components | 27 |
|  | A tokenizer based on dictionary and bigram language models for Chinese text segmentation | 21 |