CPT
Transformer model
A pre-trained transformer model for natural language understanding and generation tasks in Chinese
CPT: A Pre-Trained Unbalanced Transformer for Both Chinese Language Understanding and Generation
482 stars
5 watching
73 forks
Language: Python
Last commit: about 2 years ago
Topics: chinese, language-understanding, pretrained-models, ptms, text-generation, transformer-architecture
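A minimal usage sketch, assuming the authors' `fnlp/cpt-base` checkpoint on the Hugging Face Hub and the `modeling_cpt.py` module shipped in this repository; the example sentence and generation settings are illustrative:

```python
# Sketch: loading CPT and filling a masked span via generation.
# Assumes modeling_cpt.py from this repository is on the Python path
# and the fnlp/cpt-base checkpoint is available on the Hugging Face Hub.
from transformers import BertTokenizer
from modeling_cpt import CPTForConditionalGeneration  # provided by this repo

tokenizer = BertTokenizer.from_pretrained("fnlp/cpt-base")
model = CPTForConditionalGeneration.from_pretrained("fnlp/cpt-base")

# Encode a Chinese sentence with a masked span
# ("Beijing is the capital of [MASK]").
input_ids = tokenizer.encode("北京是[MASK]的首都", return_tensors="pt")

# Decode the completion with beam search.
pred_ids = model.generate(input_ids, num_beams=4, max_length=20)
print(tokenizer.batch_decode(pred_ids, skip_special_tokens=True))
```

Per the paper, CPT pairs a shared encoder with separate understanding and generation decoders (the "unbalanced" design), so both mask filling and text generation can be driven through the same `generate()` interface.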
Related projects:
| Repository | Description | Stars |
| --- | --- | --- |
| | A pre-trained Chinese language model based on the Transformer-XL architecture | 218 |
| | Training code for German transformer models to improve language understanding | 23 |
| | Code and models for improving language understanding through generative pre-training with a transformer architecture | 2,167 |
| | A PyTorch implementation of transformer models for natural language processing tasks | 1,257 |
| | A large Chinese language model trained on massive data, released as a pre-trained model for downstream tasks | 230 |
| | A research tool for training large transformer language models at scale | 1,926 |
| | A collection of tools and scripts for training large transformer language models at scale | 1,342 |
| | An implementation of a transformer-based NLP model using gated attention units | 98 |
| | An implementation of deep learning transformer models in MATLAB | 209 |
| | A comprehensive guide to using the Transformers library for natural language processing tasks | 1,220 |
| | A collection of reusable data transformation tools | 10 |
| | An implementation of Reformer, an efficient Transformer model for natural language processing tasks | 2,132 |
| | A PyTorch implementation of a transformer-based translation model | 240 |
| | A PyTorch implementation of OpenAI's transformer language model with pre-trained weights and fine-tuning support | 1,511 |
| | Pre-trained models and tools for Chinese natural language understanding (NLU) and generation (NLG) | 439 |