Mengzi

NLP model library

Develops lightweight yet powerful pre-trained models for natural language processing tasks
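Mengzi checkpoints are BERT-style models, so they can be loaded through the standard Hugging Face transformers interface. A minimal sketch, assuming the model ID `Langboat/mengzi-bert-base` (the Langboat organization's published base checkpoint; verify the exact ID on the Hugging Face Hub before use):

```python
# Hedged sketch: encoding Chinese text with a Mengzi checkpoint via
# Hugging Face transformers. The model ID "Langboat/mengzi-bert-base"
# is an assumption, not confirmed by this page.

def encode_text(text: str):
    """Return the [CLS] embedding for `text` (downloads the model on first use)."""
    # Imports are deferred so the sketch can be read without
    # transformers or torch installed.
    from transformers import BertTokenizer, BertModel
    import torch

    tokenizer = BertTokenizer.from_pretrained("Langboat/mengzi-bert-base")
    model = BertModel.from_pretrained("Langboat/mengzi-bert-base")

    inputs = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        outputs = model(**inputs)
    # First token ([CLS]) of the last hidden layer as a crude sentence vector.
    return outputs.last_hidden_state[0, 0]

# Usage (requires network access on first call):
# vector = encode_text("孟子是战国时期的思想家")
```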

Mengzi Pretrained Models


533 stars
11 watching
63 forks
last commit: about 2 years ago
Topics: bert · chinese-bert · deep-learning · language-understanding · natural-language-processing · nlp · pytorch

Related projects:

| Repository | Description | Stars |
| --- | --- | --- |
| langboat/mengzi3 | An 8B and 13B language model based on the Llama architecture with multilingual capabilities | 2,031 |
| zhuiyitechnology/pretrained-models | A collection of pre-trained language models for natural language processing tasks | 989 |
| dbmdz/berts | Provides pre-trained language models for natural language processing tasks | 155 |
| turkunlp/wikibert | Provides pre-trained language models derived from Wikipedia texts for natural language processing tasks | 34 |
| ymcui/macbert | Improves pre-trained Chinese language models by incorporating a correction task to alleviate inconsistency issues with downstream tasks | 646 |
| balavenkatesh3322/nlp-pretrained-model | A collection of pre-trained natural language processing models | 170 |
| ymcui/chinese-xlnet | Provides pre-trained models for Chinese natural language processing tasks using the XLNet architecture | 1,652 |
| brightmart/xlnet_zh | Trains a large Chinese language model on massive data and provides a pre-trained model for downstream tasks | 230 |
| ymcui/pert | Develops a pre-trained language model to learn semantic knowledge from permuted text without mask labels | 356 |
| namisan/mt-dnn | A PyTorch package implementing multi-task deep neural networks for natural language understanding | 2,238 |
| allegro/herbert | A BERT-based language model pre-trained on Polish corpora for understanding Polish language | 65 |
| ncbi-nlp/bluebert | Pre-trained language models for biomedical natural language processing tasks | 560 |
| vinairesearch/phobert | Pre-trained language models for Vietnamese NLP tasks | 671 |
| 01-ai/yi | A series of large language models trained from scratch to excel in multiple NLP tasks | 7,743 |