Multilingual-BERT
NER model tests
Investigating multilingual language models (BERT) by using them for Named Entity Recognition in German and English
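Evaluating NER models of this kind typically starts by decoding BIO tag sequences into entity spans before comparing predictions against gold labels. A minimal sketch of that step (a hypothetical helper, not code from this repository):

```python
def bio_to_spans(tags):
    """Convert a BIO tag sequence to (start, end, entity_type) spans.

    `end` is exclusive. Stray I- tags without a matching B- are ignored,
    one common (lenient) convention among several possible ones.
    """
    spans = []
    start, etype = None, None
    for i, tag in enumerate(tags):
        if tag.startswith("B-"):
            if start is not None:          # close the previous entity
                spans.append((start, i, etype))
            start, etype = i, tag[2:]      # open a new entity
        elif tag.startswith("I-") and etype == tag[2:]:
            continue                       # extend the current entity
        else:                              # "O" or a mismatched I- tag
            if start is not None:
                spans.append((start, i, etype))
            start, etype = None, None
    if start is not None:                  # entity running to the end
        spans.append((start, len(tags), etype))
    return spans

print(bio_to_spans(["B-PER", "I-PER", "O", "B-LOC"]))
# → [(0, 2, 'PER'), (3, 4, 'LOC')]
```

Entity-level precision, recall, and F1 then follow from set comparisons of such spans, which is the evaluation scheme the CoNLL-2003 shared task popularized.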
14 stars
3 watching
0 forks
Language: Jupyter Notebook
last commit: over 5 years ago
Linked from 1 awesome list
Related projects:
| Repository | Description | Stars |
|---|---|---|
| allenai/scibert | A BERT model trained on scientific text for natural language processing tasks. | 1,521 |
| dbmdz/berts | Provides pre-trained language models for natural language processing tasks. | 155 |
| kamalkraj/bert-ner | A Python implementation of named entity recognition on the CoNLL-2003 dataset using Google's BERT model. | 1,211 |
| lemonhu/ner-bert-pytorch | A PyTorch implementation of named entity recognition for Chinese text using Google AI's pre-trained BERT model. | 438 |
| dfki-nlp/gevalm | Evaluates German transformer language models with syntactic agreement tests. | 7 |
| certainlyio/nordic_bert | Provides pre-trained BERT models for Nordic languages with limited training data. | 161 |
| szegedai/hun_ner_checklist | Provides diagnostic test cases for evaluating Hungarian Named Entity Recognition models. | 0 |
| deeppavlov/slavic-bert-ner | A shared BERT model for NER tasks in Slavic languages, pre-trained on Bulgarian, Czech, Polish, and Russian texts. | 73 |
| ncbi-nlp/bluebert | Pre-trained language models for biomedical natural language processing tasks. | 558 |
| xverse-ai/xverse-moe-a36b | Develops and publishes large multilingual language models with a mixture-of-experts architecture. | 36 |
| german-nlp-group/german-transformer-training | Trains German transformer models to improve language understanding. | 23 |
| turkunlp/wikibert | Provides pre-trained language models derived from Wikipedia texts for natural language processing tasks. | 34 |
| wannaphong/thai-ner | A Named Entity Recognition tool for the Thai language. | 53 |
| eleutherai/polyglot | Large language models designed to perform well in multiple languages and address performance issues with current multilingual models. | 475 |
| allegro/herbert | A BERT-based language model pre-trained on Polish corpora for understanding the Polish language. | 65 |