# Mengzi (Mengzi Pretrained Models)

An NLP model library of lightweight yet powerful pre-trained models for natural language processing tasks.
533 stars · 11 watching · 63 forks
Last commit: about 2 years ago

Topics: bert, chinese-bert, deep-learning, language-understanding, natural-language-processing, nlp, pytorch
Related projects:
| Repository | Description | Stars |
|---|---|---|
| | An 8B and 13B language model based on the Llama architecture with multilingual capabilities | 2,031 |
| | A collection of pre-trained language models for natural language processing tasks | 989 |
| | Provides pre-trained language models for natural language processing tasks | 155 |
| | Provides pre-trained language models derived from Wikipedia texts for natural language processing tasks | 34 |
| | Improves pre-trained Chinese language models by incorporating a correction task to alleviate inconsistency issues with downstream tasks | 646 |
| | A collection of pre-trained natural language processing models | 170 |
| | Provides pre-trained models for Chinese natural language processing tasks using the XLNet architecture | 1,652 |
| | Trains a large Chinese language model on massive data and provides a pre-trained model for downstream tasks | 230 |
| | Develops a pre-trained language model that learns semantic knowledge from permuted text without mask labels | 356 |
| | A PyTorch package implementing multi-task deep neural networks for natural language understanding | 2,238 |
| | A BERT-based language model pre-trained on Polish corpora for Polish language understanding | 65 |
| | Pre-trained language models for biomedical natural language processing tasks | 560 |
| | Pre-trained language models for Vietnamese NLP tasks | 671 |
| | A series of large language models trained from scratch to excel in multiple NLP tasks | 7,743 |