german-transformer-training
Language model training
Plans and trains German transformer models to improve language understanding.
23 stars
6 watching
2 forks
Language: Python
last commit: about 4 years ago
Linked from 1 awesome list
Tags: bert, huggingface, roberta, transformer
Related projects:
| Repository | Description | Stars |
| --- | --- | --- |
| | Evaluates German transformer language models with syntactic agreement tests | 7 |
| | Provides code and a model for improving language understanding through generative pre-training with a transformer-based architecture | 2,167 |
| | Research tool for training large transformer language models at scale | 1,926 |
| | A pre-trained transformer model for natural language understanding and generation tasks in Chinese | 482 |
| | A comprehensive guide to using the Transformers library for natural language processing tasks | 1,220 |
| | A collection of tools and scripts for training large transformer language models at scale | 1,342 |
| | An implementation of transformer models in PyTorch for natural language processing tasks | 1,257 |
| | Custom German-language variants of GPT-2 for natural language processing tasks | 20 |
| | An open-source collection of Danish language models for natural language processing tasks | 30 |
| | Trains a language model with a RoBERTa architecture on high-quality Polish text data | 33 |
| | Provides pre-trained language models for natural language processing tasks | 155 |
| | An implementation of deep learning transformer models in MATLAB | 209 |
| | An implementation of a transformer-based NLP model using gated attention units | 98 |
| | An implementation of OpenAI's transformer language model in PyTorch with pre-trained weights and fine-tuning capabilities | 1,511 |
| | Develops pretraining and finetuning techniques for language models using metadata-conditioned text generation | 18 |