Tencent-Hunyuan-Large
Language Model
This project makes a large language model accessible for research and development.
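For example, a researcher might load the released checkpoint through the Hugging Face `transformers` interface. The sketch below is an assumption, not taken from this listing: the model identifier `tencent/Tencent-Hunyuan-Large` and the generation settings are illustrative, and the exact checkpoint name and hardware requirements should be checked in the project's own README.

```python
# Minimal sketch: loading the model for research use via Hugging Face transformers.
# The model ID below is an assumption; the full model is very large and may need
# multiple GPUs to load.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "tencent/Tencent-Hunyuan-Large"  # assumed identifier

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",       # shard the weights across available GPUs
    torch_dtype="auto",      # keep the checkpoint's native precision
    trust_remote_code=True,  # the repo ships custom model code
)

inputs = tokenizer("Write a haiku about large language models.", return_tensors="pt")
inputs = inputs.to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```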
1k stars
25 watching
59 forks
Language: Python
last commit: 3 months ago

Related projects:
| Repository | Description | Stars |
|---|---|---|
| | Trains a large Chinese language model on massive data and provides a pre-trained model for downstream tasks | 230 |
| | Provides pre-trained models and tools for Chinese natural language understanding (NLU) and generation (NLG) tasks | 439 |
| | Large-scale language model with improved performance on NLP tasks through distributed training and efficient data processing | 591 |
| | A high-performance language model designed to excel at tasks such as natural language understanding, mathematical computation, and code generation | 182 |
| | Large language model for dialogue support in multiple languages | 1,903 |
| | Provides high-performance Chinese language models for image-text retrieval and classification tasks | 51 |
| | Measures large language models' understanding of massive multitask Chinese datasets | 87 |
| | A pre-trained Chinese language model with a modest parameter count, designed to be accessible to researchers with limited computing resources | 18 |
| | Provides pre-trained models for Chinese natural language processing tasks using the XLNet architecture | 1,652 |
| | Evaluates and benchmarks large language models' video understanding capabilities | 121 |
| | A pre-trained Chinese language model based on the Transformer-XL architecture | 218 |
| | A lightweight, multilingual language model with a long context length | 920 |
| | Provides pre-trained models and code for understanding and generation tasks in multiple languages | 89 |
| | Develops and releases large language models for financial applications with improved performance and features | 1,089 |
| | An implementation of a large language model for Chinese text processing, built on a Mixture-of-Experts (MoE) architecture with a very large vocabulary | 645 |