xlnet_zh

Chinese language model

Trains a large Chinese language model on large-scale corpora and provides the pre-trained weights for downstream tasks

Pre-Trained Chinese XLNet Model: XLNet_Large
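XLNet's pre-training objective is permutation language modeling: a factorization order is sampled for each sequence, and each position may attend only to positions that come earlier in that order. A minimal, illustrative sketch of the resulting content-attention mask (pure Python, not code from this repository):

```python
import random

def permutation_mask(seq_len, seed=0):
    """Build the content attention mask used in permutation language
    modeling: position i may attend to position j iff j comes before i
    in a randomly sampled factorization order.
    Illustrative sketch only; real XLNet builds this as a tensor."""
    rng = random.Random(seed)
    order = list(range(seq_len))
    rng.shuffle(order)  # sampled factorization order
    rank = {pos: k for k, pos in enumerate(order)}
    # mask[i][j] == True means "position i may attend to position j"
    return [[rank[j] < rank[i] for j in range(seq_len)]
            for i in range(seq_len)]

for row in permutation_mask(4, seed=42):
    print(row)
```

Because the mask depends only on the sampled order, the model sees every token in many different left-contexts across training, which is how XLNet captures bidirectional context without `[MASK]` tokens.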

GitHub

230 stars
6 watching
35 forks
Language: Python
Last commit: over 5 years ago
Tags: bert, language-model, pre-train, roberta, xlnet

Related projects:

| Repository | Description | Stars |
|---|---|---|
| ymcui/chinese-xlnet | Provides pre-trained models for Chinese natural language processing tasks using the XLNet architecture | 1,652 |
| cluebenchmark/cluepretrainedmodels | Provides pre-trained models for Chinese language tasks with improved performance and smaller model sizes than existing models | 806 |
| shawn-ieitsystems/yuan-1.0 | Large-scale language model with improved performance on NLP tasks through distributed training and efficient data processing | 591 |
| ymcui/macbert | Improves pre-trained Chinese language models by incorporating a correction task to alleviate inconsistency with downstream tasks | 646 |
| tencent/tencent-hunyuan-large | Makes a large language model accessible for research and development | 1,245 |
| cluebenchmark/electra | Trains and evaluates a Chinese language model using adversarial training on a large corpus | 140 |
| zhuiyitechnology/wobert | A word-based Chinese BERT model trained on large-scale text data, built on pre-trained models as a foundation | 460 |
| yunwentechnology/unilm | Provides pre-trained models and tools for Chinese natural language understanding (NLU) and generation (NLG) tasks | 439 |
| nkcs-iclab/linglong | A pre-trained Chinese language model with a modest parameter count, designed for researchers with limited computing resources | 18 |
| zhuiyitechnology/pretrained-models | A collection of pre-trained language models for natural language processing tasks | 989 |
| hit-scir/chinese-mixtral-8x7b | A large language model for Chinese text processing built on a Mixture-of-Experts (MoE) architecture with a vast vocabulary | 645 |
| 01-ai/yi | A series of large language models trained from scratch to excel at multiple NLP tasks | 7,743 |
| ymcui/pert | A pre-trained language model that learns semantic knowledge from permuted text without mask labels | 356 |
| ymcui/lert | A pre-trained language model designed to leverage linguistic features, outperforming comparable baselines on Chinese natural language understanding tasks | 202 |
| langboat/mengzi | Lightweight yet powerful pre-trained models for natural language processing tasks | 533 |