german-transformer-training
Language model training
Plans and trains German transformer models to improve language understanding.
23 stars
6 watching
2 forks
Language: Python
Last commit: over 3 years ago
Linked from 1 awesome list
Topics: bert, huggingface, roberta, transformer
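Since this repository targets BERT/RoBERTa-style pre-training, the core data step is masked-language-model (MLM) masking. Below is a minimal, framework-free sketch of the standard 80/10/10 masking rule; the token ids, `MASK_ID`, and `VOCAB_SIZE` are illustrative assumptions, not values from this repository.

```python
import random

MASK_ID = 4          # hypothetical id of the [MASK] token
VOCAB_SIZE = 30000   # hypothetical vocabulary size
IGNORE_INDEX = -100  # label value for positions the loss should skip

def mask_tokens(token_ids, mask_prob=0.15, rng=None):
    """Return (masked_input, labels) for one sequence of token ids.

    Each token is selected with probability mask_prob; of the selected
    tokens, 80% become [MASK], 10% become a random token, and 10% are
    left unchanged. Labels hold the original id at selected positions
    and IGNORE_INDEX everywhere else.
    """
    rng = rng or random.Random()
    masked = list(token_ids)
    labels = [IGNORE_INDEX] * len(token_ids)
    for i, tok in enumerate(token_ids):
        if rng.random() >= mask_prob:
            continue
        labels[i] = tok            # model must predict the original token
        r = rng.random()
        if r < 0.8:                # 80%: replace with [MASK]
            masked[i] = MASK_ID
        elif r < 0.9:              # 10%: replace with a random token
            masked[i] = rng.randrange(VOCAB_SIZE)
        # remaining 10%: keep the token unchanged
    return masked, labels

# Usage: mask one tokenized sequence with a fixed seed for reproducibility.
masked, labels = mask_tokens([101, 2054, 2003, 102], rng=random.Random(0))
```

In a real training pipeline this logic is usually handled by the data collator of the training framework; the sketch only shows the rule itself.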
Related projects:
| Repository | Description | Stars |
| --- | --- | --- |
| dfki-nlp/gevalm | Evaluates German transformer language models with syntactic agreement tests | 7 |
| openai/finetune-transformer-lm | Code and model for improving language understanding via generative pre-training with a transformer architecture | 2,160 |
| microsoft/megatron-deepspeed | Research tool for training large transformer language models at scale | 1,895 |
| fastnlp/cpt | A pre-trained transformer model for Chinese natural language understanding and generation | 481 |
| jsksxs360/how-to-use-transformers | A comprehensive guide to using the Transformers library for natural language processing tasks | 1,133 |
| bigscience-workshop/megatron-deepspeed | Tools and scripts for training large transformer language models at scale | 1,335 |
| tongjilibo/bert4torch | A PyTorch implementation of transformer models for natural language processing tasks | 1,241 |
| bminixhofer/gerpt2 | Custom German GPT-2 language model variants for natural language processing tasks | 20 |
| sarnikowski/danish_transformers | An open-source collection of Danish language models for natural language processing tasks | 30 |
| ermlab/politbert | Trains a RoBERTa-architecture language model on high-quality Polish text | 33 |
| dbmdz/berts | Pre-trained language models for natural language processing tasks | 155 |
| matlab-deep-learning/transformer-models | Deep learning transformer models implemented in MATLAB | 206 |
| zhuiyitechnology/gau-alpha | A Gated Attention Unit-based transformer model for natural language processing tasks | 96 |
| huggingface/pytorch-openai-transformer-lm | OpenAI's transformer language model in PyTorch, with pre-trained weights and fine-tuning support | 1,511 |
| proger/uk4b | Pretraining and fine-tuning techniques for language models using metadata-conditioned text generation | 18 |