Tactful_Tokenizer

Sentence tokenizer

A Ruby library that tokenizes text into sentences using a Bayesian statistical model.
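To illustrate the kind of statistical model a Bayesian sentence tokenizer relies on, here is a toy naive-Bayes boundary classifier in plain Ruby. It is a minimal sketch of the general technique, not Tactful_Tokenizer's actual API or feature set: the class name, the three features, and the tiny training calls are all illustrative assumptions.

```ruby
# Toy naive-Bayes classifier for sentence boundaries: given the token to the
# left and right of a candidate break, decide :boundary vs :not_boundary.
# Illustrative only -- not Tactful_Tokenizer's implementation.
class BoundaryClassifier
  def initialize
    # counts[label][feature] => occurrences seen in training
    @counts = { boundary: Hash.new(0), not_boundary: Hash.new(0) }
    @totals = Hash.new(0)
  end

  # Simple features of a candidate break between `left` and `right` tokens.
  def features(left, right)
    [
      "left=#{left.downcase}",
      "right_cap=#{right[0] =~ /[A-Z]/ ? 1 : 0}",     # next token capitalized?
      "left_abbrev=#{left.length <= 3 ? 1 : 0}"       # short left token, e.g. "Mr."
    ]
  end

  def train(left, right, label)
    features(left, right).each { |f| @counts[label][f] += 1 }
    @totals[label] += 1
  end

  # log P(label) + sum of log P(feature | label), with add-one smoothing.
  def score(left, right, label)
    prior = (@totals[label] + 1).to_f / (@totals.values.sum + 2)
    features(left, right).reduce(Math.log(prior)) do |acc, f|
      acc + Math.log((@counts[label][f] + 1).to_f / (@totals[label] + 2))
    end
  end

  def boundary?(left, right)
    score(left, right, :boundary) > score(left, right, :not_boundary)
  end
end
```

Usage under the same assumptions: train on labeled breaks, then query new ones.

```ruby
c = BoundaryClassifier.new
c.train("end.", "The", :boundary)
c.train("ran.", "She", :boundary)
c.train("Mr.", "Smith", :not_boundary)
c.train("Dr.", "Jones", :not_boundary)
c.boundary?("stopped.", "He")   # => true
c.boundary?("Mr.", "Smith")     # => false
```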

GitHub
80 stars · 5 watching · 13 forks
Language: Ruby
Last commit: over 10 years ago
Linked from 1 awesome list

Tags: nlp, ruby

Related projects:

| Repository | Description | Stars |
| --- | --- | --- |
| thisiscetin/textoken | A gem for extracting words from text with customizable tokenization rules | 31 |
| arbox/tokenizer | A Ruby-based library for splitting written text into tokens for natural language processing tasks | 46 |
| abitdodgy/words_counted | A Ruby library that tokenizes input and provides various statistical measures about the tokens | 159 |
| diasks2/pragmatic_tokenizer | A multilingual tokenizer to split strings into tokens, handling various language and formatting nuances | 90 |
| lfcipriani/punkt-segmenter | An implementation of a sentence boundary detection algorithm in Ruby | 92 |
| neurosnap/sentences | A command line tool to split text into individual sentences | 439 |
| 6/tiny_segmenter | A Ruby port of a Japanese text tokenization algorithm | 21 |
| jonsafari/tok-tok | A fast and simple tokenizer for multiple languages | 28 |
| denosaurs/tokenizer | A simple tokenizer library for parsing and analyzing text input in various formats | 17 |
| zurawiki/tiktoken-rs | Provides a Rust library for tokenizing text with OpenAI models using tiktoken | 256 |
| shonfeder/tokenize | A Prolog-based tokenization library for lexing text into common tokens | 11 |
| zseder/huntoken | A tool for tokenizing raw text into words and sentences in multiple languages | 3 |
| juliatext/wordtokenizers.jl | A set of high-performance tokenizers for natural language processing tasks | 96 |
| amir-zeldes/rftokenizer | A tokenizer for segmenting words into morphological components | 27 |
| xujiajun/gotokenizer | A tokenizer based on dictionary and Bigram language models for text segmentation in Chinese | 21 |