polbert
Polish BERT model
A Polish BERT-based language model trained on various corpora for natural language processing tasks
70 stars
8 watching
10 forks
Language: Jupyter Notebook
Last commit: about 4 years ago
Linked from 1 awesome list
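Below is a minimal usage sketch for loading Polbert with Hugging Face Transformers and running masked-token prediction on a Polish sentence. The Hub checkpoint ID `dkleczek/bert-base-polish-uncased-v1` is an assumption not stated on this page; check the repository for the published model name.

```python
# Minimal sketch: masked language modeling with Polbert via Hugging Face Transformers.
# The model ID below is assumed, not taken from this page.
from transformers import AutoTokenizer, AutoModelForMaskedLM, pipeline

model_id = "dkleczek/bert-base-polish-uncased-v1"  # assumed Hub ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)

# Predict the most likely fillers for the masked token.
fill_mask = pipeline("fill-mask", model=model, tokenizer=tokenizer)
for prediction in fill_mask("Warszawa to największe [MASK] w Polsce."):
    print(prediction["token_str"], round(prediction["score"], 3))
```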
Related projects:
| Repository | Description | Stars |
|---|---|---|
| allegro/herbert | A BERT-based language model pre-trained on Polish corpora for Polish language understanding | 65 |
| turkunlp/wikibert | Provides pre-trained language models derived from Wikipedia texts for natural language processing tasks | 34 |
| ermlab/politbert | Trains a language model with a RoBERTa architecture on high-quality Polish text data | 33 |
| deeppavlov/slavic-bert-ner | A shared BERT model for NER tasks in Slavic languages, pre-trained on Bulgarian, Czech, Polish, and Russian texts | 73 |
| certainlyio/nordic_bert | Provides pre-trained BERT models for Nordic languages with limited training data | 164 |
| dbmdz/berts | Provides pre-trained language models for natural language processing tasks | 155 |
| zhuiyitechnology/wobert | A word-based Chinese BERT model trained on large-scale text data, using pre-trained models as a foundation | 460 |
| tonianelope/multilingual-bert | Investigates multilingual language models for named entity recognition in German and English | 14 |
| dfki-nlp/gevalm | Evaluates German transformer language models with syntactic agreement tests | 7 |
| peleiden/daluke | A language model trained on Danish Wikipedia data for named entity recognition and masked language modeling | 9 |
| german-nlp-group/german-transformer-training | Trains German transformer models to improve language understanding | 23 |
| thunlp-aipoet/bert-ccpoem | A BERT-based pre-trained model for Chinese classical poetry | 146 |
| sarnikowski/danish_transformers | An open-source collection of Danish language models for natural language processing tasks | 30 |
| allenai/scibert | A BERT model trained on scientific text for natural language processing tasks | 1,532 |
| tal-tech/edu-bert | A pre-trained language model designed to improve natural language processing tasks in education | 186 |