HerBERT
Polish NLP model
A BERT-based language model pre-trained on Polish corpora for Polish language understanding.
HerBERT is a BERT-based language model trained on Polish corpora using only the masked language modeling (MLM) objective with dynamic masking of whole words.
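The whole-word masking mentioned above differs from plain token masking: when a word is split into several WordPiece subtokens, either all of its pieces are masked or none are. The sketch below illustrates the idea in plain Python, assuming the standard BERT convention that continuation subtokens start with `##`; it is a simplified illustration, not the actual HerBERT training code, and the example tokens are invented.

```python
import random

def whole_word_mask(tokens, mask_prob=0.15, seed=None):
    """Mask whole words in a WordPiece token sequence.

    Subtokens starting with '##' continue the preceding word, so a word
    and all of its '##' continuations are masked together. A simplified
    sketch of whole-word masking, not the actual HerBERT code.
    """
    rng = random.Random(seed)
    # Group token indices into words: a new word starts at any token
    # that does not begin with '##'.
    words = []
    for i, tok in enumerate(tokens):
        if tok.startswith("##") and words:
            words[-1].append(i)
        else:
            words.append([i])
    # Decide per word, then mask every subtoken of the chosen words.
    masked = list(tokens)
    for word in words:
        if rng.random() < mask_prob:
            for i in word:
                masked[i] = "[MASK]"
    return masked

# "Dynamic" masking means a fresh mask is sampled every time a sentence
# is seen during training, rather than being fixed once at preprocessing.
tokens = ["her", "##bert", "rozumie", "język", "pol", "##ski"]
print(whole_word_mask(tokens, mask_prob=0.5, seed=0))
```

Re-running with a different seed (or no seed) produces a different mask over the same sentence, which is what makes the masking "dynamic" across training epochs.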
65 stars
7 watching
5 forks
last commit: over 3 years ago
Linked from 1 awesome list
Related projects:
| Repository | Description | Stars |
|---|---|---|
| | Provides pre-trained language models for natural language processing tasks | 155 |
| | A Polish BERT-based language model trained on various corpora for natural language processing tasks | 70 |
| | Develops lightweight yet powerful pre-trained models for natural language processing tasks | 533 |
| | Pre-trained models and resources for Natural Language Processing in Polish | 329 |
| | Trains a language model using a RoBERTa architecture on high-quality Polish text data | 33 |
| | Provides pre-trained BERT models for Nordic languages with limited training data | 164 |
| | Investigates multilingual language models for Named Entity Recognition in German and English | 14 |
| | Provides pre-trained language models derived from Wikipedia texts for natural language processing tasks | 34 |
| | Implements BERT-like NLP models in OCaml using PyTorch bindings and pre-trained weights from popular sources | 24 |
| | A shared BERT model for NER tasks in Slavic languages, pre-trained on Bulgarian, Czech, Polish, and Russian texts | 73 |
| | A BERT model trained on scientific text for natural language processing tasks | 1,532 |
| | Evaluates German transformer language models with syntactic agreement tests | 7 |
| | A collection of pre-trained language models for natural language processing tasks | 989 |
| | Develops a pre-trained language model to learn semantic knowledge from permuted text without mask labels | 356 |
| | A pre-trained language model designed to improve natural language processing tasks in education | 186 |