sentences
Sentence tokenizer
A multilingual command-line tool, written in Go, that splits text into individual sentences
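To illustrate the task this tool automates, here is a minimal, naive sentence splitter in Go using only the standard library. This is a sketch of the general idea, not this project's implementation: splitting on terminal punctuation mishandles abbreviations, ellipses, and non-English punctuation, which is exactly what a proper multilingual tokenizer exists to handle.

```go
package main

import (
	"fmt"
	"regexp"
	"strings"
)

// splitSentences naively splits text on sentence-ending punctuation
// followed by whitespace. It breaks on abbreviations like "Dr." --
// the kind of case a real sentence tokenizer resolves.
func splitSentences(text string) []string {
	// Mark each boundary with a NUL byte, then split on it,
	// so the punctuation stays attached to its sentence.
	re := regexp.MustCompile(`([.!?])\s+`)
	marked := re.ReplaceAllString(text, "$1\x00")
	var out []string
	for _, s := range strings.Split(marked, "\x00") {
		if s = strings.TrimSpace(s); s != "" {
			out = append(out, s)
		}
	}
	return out
}

func main() {
	for _, s := range splitSentences("Go is fun. It compiles fast! Right?") {
		fmt.Println(s)
	}
}
```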
- Stars: 441
- Watchers: 14
- Forks: 39
- Language: Go
- Last commit: 12 months ago
- Linked from 3 awesome lists
Tags: cli, sentence-tokenizer, sentences, tokenizer
Related projects:
| Repository | Description | Stars |
|---|---|---|
| | A Ruby library that tokenizes text into sentences using a Bayesian statistical model | 80 |
| | A simple tokenizer library for parsing and analyzing text input in various formats | 17 |
| | A gem for extracting words from text with customizable tokenization rules | 31 |
| | A high-performance tokenization library for Go, capable of parsing various data formats and syntaxes | 103 |
| | A Ruby-based library for splitting written text into tokens for natural language processing tasks | 46 |
| | A set of high-performance tokenizers for natural language processing tasks | 96 |
| | A Ruby library that tokenizes input and provides various statistical measures about the tokens | 159 |
| | A fast and simple tokenizer for multiple languages | 28 |
| | A Python wrapper around the Thai word segmenter LexTo, allowing developers to easily integrate it into their applications | 1 |
| | A multilingual tokenizer that splits strings into tokens, handling various language and formatting nuances | 90 |
| | A tokenizer for natural language text that separates words from punctuation and supports basic preprocessing steps such as case changing | 66 |
| | A Prolog-based tokenization library for lexing text into common tokens | 11 |
| | A tool for tokenizing raw text into words and sentences in multiple languages, including Hungarian | 4 |
| | A tokenizer based on dictionary and bigram language models for Chinese text segmentation | 21 |
| | Tools for splitting text into sentences and words | 171 |