PERT: Pre-training BERT with Permuted Language Model

A semantic learning model: a pre-trained language model that learns semantic knowledge from permuted text, without mask labels.
356 stars · 7 watching · 24 forks
Last commit: almost 2 years ago
Topics: bert, nlp, plm, pre-trained-model, pytorch, tensorflow, transformers
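The core idea behind permuted language modeling is to shuffle a fraction of the input token positions and train the model to recover the original token at each shuffled position, instead of replacing tokens with a `[MASK]` symbol. As a minimal sketch (not PERT's actual data pipeline; the function name, ratio, and example sentence are illustrative assumptions), building such a training example might look like:

```python
import random

def make_permuted_example(tokens, permute_ratio=0.15, seed=0):
    """Sketch of a PERT-style training example: shuffle a fraction of
    token positions and target the original token at each shuffled
    position. No [MASK] symbol is ever introduced into the input."""
    rng = random.Random(seed)
    n = len(tokens)
    k = max(2, int(n * permute_ratio))  # need at least 2 positions to permute

    positions = sorted(rng.sample(range(n), k))

    # Derange the chosen positions so every selected token actually moves.
    shuffled = positions[:]
    while any(a == b for a, b in zip(shuffled, positions)):
        rng.shuffle(shuffled)

    permuted = list(tokens)
    for src, dst in zip(positions, shuffled):
        permuted[dst] = tokens[src]  # token from src now sits at dst

    # Targets: at each selected position, predict the original token.
    targets = {pos: tokens[pos] for pos in positions}
    return permuted, targets


tokens = "we study pre training with permuted text".split()
permuted, targets = make_permuted_example(tokens)
```

Because the input stays a plain (re-ordered) token sequence, there is no pre-training/fine-tuning mismatch from artificial mask tokens; the prediction head only reads the positions listed in `targets`.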
Related projects:
| Repository | Description | Stars |
|---|---|---|
| | Improves pre-trained Chinese language models by incorporating a correction task to alleviate inconsistency issues with downstream tasks | 646 |
| | A pre-trained language model designed to leverage linguistic features and outperform comparable baselines on Chinese natural language understanding tasks | 202 |
| | Trains a large Chinese language model on massive data and provides a pre-trained model for downstream tasks | 230 |
| | Develops lightweight yet powerful pre-trained models for natural language processing tasks | 533 |
| | A collection of pre-trained language models for natural language processing tasks | 989 |
| | Improves pre-trained language models by encouraging an isotropic and discriminative distribution of token representations | 92 |
| | Provides pre-trained Chinese language models based on the ELECTRA framework for natural language processing tasks | 1,405 |
| | Provides pre-trained models for Chinese natural language processing tasks using the XLNet architecture | 1,652 |
| | A word-based Chinese BERT model trained on large-scale text data using pre-trained models as a foundation | 460 |
| | Provides pre-trained language models for natural language processing tasks | 155 |
| | Develops and trains models for vision-language learning with decoupled language pre-training | 24 |
| | Develops and releases Mixtral-based models for natural language processing tasks with a focus on Chinese text generation and understanding | 589 |
| | Provides pre-trained language models derived from Wikipedia texts for natural language processing tasks | 34 |
| | Provides pre-trained models for Chinese language tasks with improved performance and smaller model sizes compared to existing models | 806 |
| | An implementation of MobileBERT, a pre-trained language model, in Python for NLP tasks | 81 |