PERT

Semantic learning model

Develops a pre-trained language model that learns semantic knowledge from permuted text, without using mask labels

PERT: Pre-training BERT with Permuted Language Model
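The core idea above — learning from permuted text with no [MASK] tokens — can be sketched as a data-preparation step: shuffle a fraction of token positions and, for each shuffled token, record where it ended up, so the model predicts positions instead of masked words. This is an illustrative sketch only, not PERT's official implementation; the function name, ratio, and label convention are assumptions for demonstration.

```python
import random

def make_permuted_example(tokens, permute_ratio=0.15, seed=0):
    """Illustrative PerLM-style example builder (hypothetical helper,
    not from the PERT repo): shuffle a fraction of positions and
    return (permuted_tokens, labels), where labels[i] = the position
    in the permuted sequence now holding the token that originally
    sat at position i. No [MASK] tokens are introduced."""
    rng = random.Random(seed)
    n = len(tokens)
    k = max(2, int(n * permute_ratio))        # shuffle at least two tokens
    chosen = sorted(rng.sample(range(n), k))  # positions to permute
    shuffled = chosen[:]
    while shuffled == chosen:                 # ensure a non-trivial permutation
        rng.shuffle(shuffled)
    permuted = tokens[:]
    for src, dst in zip(chosen, shuffled):
        permuted[dst] = tokens[src]           # token from src now lives at dst
    labels = {src: dst for src, dst in zip(chosen, shuffled)}
    return permuted, labels
```

A model trained on such data would take `permuted` as input and, for each shuffled slot, classify over input positions rather than over the vocabulary — which is what lets this objective dispense with mask labels.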

GitHub

356 stars
7 watching
24 forks
last commit: over 1 year ago
Tags: bert, nlp, plm, pre-trained-model, pytorch, tensorflow, transformers

Related projects:

| Repository | Description | Stars |
| --- | --- | --- |
| ymcui/macbert | Improves pre-trained Chinese language models by incorporating a correction task to alleviate inconsistency issues with downstream tasks | 646 |
| ymcui/lert | A pre-trained language model designed to leverage linguistic features and outperform comparable baselines on Chinese natural language understanding tasks | 202 |
| brightmart/xlnet_zh | Trains a large Chinese language model on massive data and provides a pre-trained model for downstream tasks | 230 |
| langboat/mengzi | Develops lightweight yet powerful pre-trained models for natural language processing tasks | 533 |
| zhuiyitechnology/pretrained-models | A collection of pre-trained language models for natural language processing tasks | 989 |
| yxuansu/tacl | Improves pre-trained language models by encouraging an isotropic and discriminative distribution of token representations | 92 |
| ymcui/chinese-electra | Provides pre-trained Chinese language models based on the ELECTRA framework for natural language processing tasks | 1,405 |
| ymcui/chinese-xlnet | Provides pre-trained models for Chinese natural language processing tasks using the XLNet architecture | 1,652 |
| zhuiyitechnology/wobert | A word-based Chinese BERT model trained on large-scale text data, built on existing pre-trained models | 460 |
| dbmdz/berts | Provides pre-trained language models for natural language processing tasks | 155 |
| yiren-jian/blitext | Develops and trains models for vision-language learning with decoupled language pre-training | 24 |
| ymcui/chinese-mixtral | Develops and releases Mixtral-based models for natural language processing, focused on Chinese text generation and understanding | 589 |
| turkunlp/wikibert | Provides pre-trained language models derived from Wikipedia texts for natural language processing tasks | 34 |
| cluebenchmark/cluepretrainedmodels | Provides pre-trained models for Chinese language tasks with improved performance and smaller model sizes than existing models | 806 |
| ymcui/chinese-mobilebert | An implementation of MobileBERT, a pre-trained language model, in Python for NLP tasks | 81 |