OFA-Chinese
Transformer-based OFA model
Adapts the OFA-Chinese model to work with the Hugging Face Transformers framework.
A Chinese OFA model built on the transformers architecture.
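Because the project's goal is to expose OFA through the standard Transformers API, a typical workflow is to load a converted checkpoint and run Chinese image captioning via `generate`. The sketch below illustrates this under stated assumptions: the `OFAModelForCaption` class, its `component.ofa.modeling_ofa` module path, the `YeungNLP/ofa-cn-base-muge-v2` checkpoint id, and the preprocessing constants are placeholders based on common OFA usage, not verbatim from this repository's documentation.

```python
import torch
from PIL import Image
from torchvision import transforms
from transformers import BertTokenizerFast

# Assumed: the repository ships its own OFA modeling code; this module path
# and class name are placeholders, not confirmed API.
from component.ofa.modeling_ofa import OFAModelForCaption

CKPT = "YeungNLP/ofa-cn-base-muge-v2"  # assumed Hugging Face checkpoint id

tokenizer = BertTokenizerFast.from_pretrained(CKPT)
model = OFAModelForCaption.from_pretrained(CKPT).eval()

# OFA-style image preprocessing: resize to a fixed resolution and normalize.
preprocess = transforms.Compose([
    transforms.Resize((480, 480)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.5, 0.5, 0.5], std=[0.5, 0.5, 0.5]),
])
patch_images = preprocess(Image.open("example.jpg").convert("RGB")).unsqueeze(0)

# Chinese captioning prompt: "What does the image describe?"
input_ids = tokenizer(["图片描述了什么?"], return_tensors="pt").input_ids

with torch.no_grad():
    output_ids = model.generate(
        input_ids,
        patch_images=patch_images,
        num_beams=5,
        no_repeat_ngram_size=3,
    )
print(tokenizer.batch_decode(output_ids, skip_special_tokens=True))
```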
123 stars
5 watching
16 forks
Language: Python
last commit: about 2 years ago

Related projects:
| Repository | Description | Stars |
|---|---|---|
| | A pre-trained transformer model for natural language understanding and generation tasks in Chinese | 482 |
| | A pre-trained Chinese language model based on the Transformer-XL architecture | 218 |
| | An implementation of a transformer-based NLP model utilizing gated attention units | 98 |
| | An enhanced transformer model with improved relative position embeddings for natural language processing tasks | 837 |
| | An implementation of a Chinese language model using PyTorch and a transformer architecture | 262 |
| | An improved version of a transformer-based language model with enhanced speed and accuracy through structural simplification and pre-training | 148 |
| | An implementation of OpenAI's transformer language model in PyTorch with pre-trained weights and fine-tuning capabilities | 1,511 |
| | Pre-trained models and tools for natural language understanding (NLU) and generation (NLG) tasks in Chinese | 439 |
| | Develops unified sentence embedding models for NLP tasks | 840 |
| | Converts popular transformer models to run on Android devices for efficient inference and generation tasks | 396 |
| | Generates stylized talking faces and videos using deep learning models | 278 |
| | A Python implementation of a StyleGAN-based anime face generator | 151 |
| | A Laravel library to convert models into standardized API responses | 29 |
| | Trains German transformer models to improve language understanding | 23 |
| | A large-scale language model with improved performance on NLP tasks through distributed training and efficient data processing | 591 |