Language Models
A repository of pre-trained language models for various tasks and domains.
The “WuDao” (悟道) model
121 stars
3 watching
13 forks
last commit: over 3 years ago

Related projects:
| Repository | Description | Stars |
| --- | --- | --- |
| | Pre-trains a multilingual model to bridge vision and language modalities for various downstream applications | 279 |
| | Provides pre-trained models and tools for natural language understanding (NLU) and generation (NLG) tasks in Chinese | 439 |
| | A collection of lightweight state-of-the-art language models designed to support multilinguality, coding, and reasoning tasks on constrained resources | 232 |
| | A collection of pre-trained language models for natural language processing tasks | 989 |
| | A collection of multilingual language models trained on a dataset of instructions and responses in various languages | 94 |
| | Provides pre-trained language models derived from Wikipedia texts for natural language processing tasks | 34 |
| | A library providing a pre-trained language model for natural language inference tasks using a transformer architecture | 61 |
| | A series of large language models trained from scratch to excel in multiple NLP tasks | 7,743 |
| | A high-performance language model designed to excel in tasks such as natural language understanding, mathematical computation, and code generation | 182 |
| | Develops large-scale pre-trained models for Chinese natural language understanding and generation, aiming at efficient and effective models for various applications | 163 |
| | Provides pre-trained language models and tools for fine-tuning and evaluation | 439 |
| | A large language model developed by researchers at Korea University and the HIAI Research Institute | 576 |
| | Trains and evaluates a Chinese language model using adversarial training on a large corpus | 140 |
| | Trains a large Chinese language model on massive data and provides a pre-trained model for downstream tasks | 230 |
| | Provides pre-trained models for Chinese language tasks with improved performance and smaller model sizes compared to existing models | 806 |