polbert
Polish BERT (Polbert): a BERT-based language model pre-trained on various Polish corpora for natural language processing tasks.
70 stars
8 watching
10 forks
Language: Jupyter Notebook
Last commit: over 4 years ago
Linked from 1 awesome list
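A minimal sketch of loading the model for masked-word prediction with the Hugging Face `transformers` library. The model id `dkleczek/bert-base-polish-uncased-v1` is assumed from the published Polbert checkpoints; adjust it if the hosted name differs.

```python
# Sketch: masked-word prediction with Polbert via Hugging Face transformers.
# Assumption: the checkpoint is published as "dkleczek/bert-base-polish-uncased-v1".
from transformers import pipeline

fill_mask = pipeline(
    "fill-mask",
    model="dkleczek/bert-base-polish-uncased-v1",
)

# Polish for "Warsaw is the capital of [MASK]."
predictions = fill_mask("Warszawa jest stolicą [MASK].")
for p in predictions:
    print(p["token_str"], round(p["score"], 3))
```

Each prediction is a dict with the filled token (`token_str`) and its probability (`score`); the pipeline returns the top candidates for the `[MASK]` position.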
Related projects:
| Repository | Description | Stars |
|---|---|---|
| | A BERT-based language model pre-trained on Polish corpora for understanding the Polish language. | 65 |
| | Provides pre-trained language models derived from Wikipedia texts for natural language processing tasks. | 34 |
| | Trains a language model using a RoBERTa architecture on high-quality Polish text data. | 33 |
| | A shared BERT model for NER tasks in Slavic languages, pre-trained on Bulgarian, Czech, Polish, and Russian texts. | 73 |
| | Provides pre-trained BERT models for Nordic languages with limited training data. | 164 |
| | Provides pre-trained language models for natural language processing tasks. | 155 |
| | A word-based Chinese BERT model trained on large-scale text data using pre-trained models as a foundation. | 460 |
| | Investigates multilingual language models for Named Entity Recognition in German and English. | 14 |
| | Evaluates German transformer language models with syntactic agreement tests. | 7 |
| | A language model trained on Danish Wikipedia data for named entity recognition and masked language modeling. | 9 |
| | Trains German transformer models to improve language understanding. | 23 |
| | A BERT-based pre-trained model for Chinese classical poetry. | 146 |
| | An open-source collection of Danish language models for natural language processing tasks. | 30 |
| | A BERT model trained on scientific text for natural language processing tasks. | 1,532 |
| | A pre-trained language model designed to improve natural language processing tasks in education. | 186 |