textoken

Tokenizer

A simple Ruby gem for extracting words from text, with customizable tokenization rules.
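To illustrate what "customizable tokenization rules" can mean in practice, here is a minimal sketch of regex-based word extraction in plain Ruby. This is an illustrative stand-in, not textoken's actual API; the `SimpleTokenizer` class and its `downcase`/`min_length` options are hypothetical names chosen for this example.

```ruby
# Hypothetical sketch of customizable word tokenization in plain Ruby
# (illustrative only; not the textoken gem's actual API).
class SimpleTokenizer
  DEFAULTS = { downcase: false, min_length: 1 }

  def initialize(options = {})
    @options = DEFAULTS.merge(options)
  end

  # Extract word tokens from text, applying the configured rules.
  def words(text)
    # Match runs of letters/digits, optionally with an internal apostrophe.
    tokens = text.scan(/[[:alnum:]]+(?:'[[:alnum:]]+)?/)
    tokens.map!(&:downcase) if @options[:downcase]
    tokens.select { |t| t.length >= @options[:min_length] }
  end
end

tokenizer = SimpleTokenizer.new(downcase: true, min_length: 3)
tokenizer.words("Hello, world! It's a small world.")
# => ["hello", "world", "it's", "small", "world"]
```

Passing options at construction time, as above, is the usual pattern for this kind of gem: the tokenizer encapsulates its rules so the same configuration can be reused across many input strings.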

GitHub

31 stars
3 watching
3 forks
Language: Ruby
last commit: about 3 years ago
Linked from 1 awesome list

Tags: nlp, ruby, tokenization

Related projects:

Repository | Description | Stars
arbox/tokenizer | A Ruby-based library for splitting written text into tokens for natural language processing tasks | 46
zencephalon/tactful_tokenizer | A Ruby library that tokenizes text into sentences using a Bayesian statistical model | 80
abitdodgy/words_counted | A Ruby library that tokenizes input and provides various statistical measures about the tokens | 159
zseder/huntoken | A tool for tokenizing raw text into words and sentences in multiple languages | 3
6/tiny_segmenter | A Ruby port of a Japanese text tokenization algorithm | 21
diasks2/pragmatic_tokenizer | A multilingual tokenizer that splits strings into tokens, handling various language and formatting nuances | 90
juliatext/wordtokenizers.jl | A set of high-performance tokenizers for natural language processing tasks | 96
shonfeder/tokenize | A Prolog-based tokenization library for lexing text into common tokens | 11
xujiajun/gotokenizer | A tokenizer based on dictionary and bigram language models for Chinese text segmentation | 21
mathewsanders/mustard | A Swift library for tokenizing strings with customizable matching behavior | 689
denosaurs/tokenizer | A simple tokenizer library for parsing and analyzing text input in various formats | 17
c4n/pythonlexto | A Python wrapper around the Thai word segmenter LexTo, allowing developers to integrate it into their applications | 1
jonsafari/tok-tok | A fast and simple tokenizer for multiple languages | 28
languagemachines/ucto | A tokeniser for natural language text that separates words from punctuation and supports basic preprocessing steps such as case changing | 65
neurosnap/sentences | A command-line tool to split text into individual sentences | 440