german-elmo-model
German Wiki Model
A pre-trained deep contextualized word representation (ELMo) model trained on a German Wikipedia corpus.
This is a German ELMo deep contextualized word representation model. It was trained on a special German Wikipedia text corpus.
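For readers unfamiliar with ELMo, "deep contextualized" means each token's final representation is a task-weighted combination of all biLM layer states: ELMo = γ · Σⱼ softmax(s)ⱼ · hⱼ (Peters et al., 2018). A minimal sketch of that combination step, using plain Python with toy vectors (the function name and example values are illustrative, not from this repository):

```python
import math

def elmo_combine(layer_states, scalar_logits, gamma=1.0):
    """Task-specific ELMo combination (Peters et al., 2018):
    ELMo = gamma * sum_j softmax(s)_j * h_j, where h_j is the
    j-th biLM layer state for a token and s_j a learned scalar.
    layer_states: list of L vectors (lists of floats), one per layer
    scalar_logits: list of L unnormalized layer weights s_j
    """
    # Softmax-normalize the per-layer scalar weights.
    exps = [math.exp(s) for s in scalar_logits]
    total = sum(exps)
    weights = [e / total for e in exps]

    # Weighted sum of the layer states, scaled by gamma.
    dim = len(layer_states[0])
    combined = [0.0] * dim
    for w, h in zip(weights, layer_states):
        for i, v in enumerate(h):
            combined[i] += w * v
    return [gamma * c for c in combined]

# Toy example: 3 biLM layers with 4-dimensional states.
# Equal logits give equal weights (1/3 each), i.e. a plain average.
layers = [[1.0, 0.0, 0.0, 0.0],
          [0.0, 1.0, 0.0, 0.0],
          [0.0, 0.0, 1.0, 0.0]]
vec = elmo_combine(layers, [0.0, 0.0, 0.0])
```

In practice the logits `s_j` and scale `gamma` are learned per downstream task, which is what lets one pre-trained biLM serve many tasks.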
28 stars
3 watching
1 fork
last commit: about 5 years ago
Linked from 1 awesome list
Tags: bilm, elmo, embedding, german, machine-learning, nlp, python, tensorflow
Related projects:
| Repository | Description | Stars |
| --- | --- | --- |
| | A subword ELMo model pre-trained on a large text corpus | 12 |
| | Pre-trained ELMo representations for multiple languages to improve NLP tasks | 1,462 |
| | Pre-trained language models derived from Wikipedia texts for natural language processing tasks | 34 |
| | Pre-trained language models for natural language processing tasks | 155 |
| | Efficient contextual representation learning model with continuous outputs | 4 |
| | Trains German transformer models to improve language understanding | 23 |
| | Large language models designed to perform well in multiple languages and address performance issues with current multilingual models | 476 |
| | Custom German-language GPT-2 model variants for natural language processing tasks | 20 |
| | A language model trained on Danish Wikipedia data for named entity recognition and masked language modeling | 9 |
| | A library providing a pre-trained language model for natural language inference tasks using a transformer architecture | 61 |
| | Evaluates German transformer language models with syntactic agreement tests | 7 |
| | Reproduces the results of an ACL 2018 paper on simple word-embedding-based models for natural language processing tasks | 284 |
| | A BERT-based language model pre-trained on Polish corpora for understanding the Polish language | 65 |
| | A PyTorch implementation of DeepMind's Relational Recurrent Neural Networks (Santoro et al., 2018) for word-level language modeling | 245 |
| | A collection of lightweight state-of-the-art language models designed to support multilinguality, coding, and reasoning on constrained resources | 232 |