GuwenBERT (古文BERT)
A Pre-trained Language Model for Classical Chinese (Literary Chinese)
A RoBERTa-based language model pre-trained on a corpus of ancient Chinese literature.
506 stars
6 watching
38 forks
last commit: about 3 years ago
Topics: bert, classical-chinese, guwenbert, literary-chinese, transformers
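Since GuwenBERT follows the standard BERT/RoBERTa interface, it can be queried for masked-token prediction through Hugging Face Transformers. The sketch below is illustrative: the model ID `ethanyt/guwenbert-base` is an assumption inferred from the project name, so confirm the published ID in the repository's README before use.

```python
# Hedged sketch: masked-token prediction with GuwenBERT via the
# Transformers fill-mask pipeline. The model ID below is an assumption
# (check the repo README for the actual Hugging Face model name).

# Opening line of a classical text with one character masked out.
masked_text = "晋太元中，武陵人捕鱼[MASK]业。"

def top_predictions(text, k=3):
    """Return the model's top-k candidate fillers for the [MASK] slot."""
    from transformers import pipeline  # lazy import: heavy dependency
    fill = pipeline("fill-mask", model="ethanyt/guwenbert-base")
    return [(c["token_str"], c["score"]) for c in fill(text, top_k=k)]
```

Calling `top_predictions(masked_text)` downloads the model weights on first use and returns token/score pairs; the expected filler here is 为 ("武陵人捕鱼为业").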
Related projects:
Repository | Description | Stars |
---|---|---|
zhuiyitechnology/wobert | A pre-trained Chinese language model that tokenizes text at the word level rather than the character level | 458 |
ymcui/macbert | Improves pre-trained Chinese language models with a text-correction pre-training task that narrows the gap between pre-training and fine-tuning | 645 |
turkunlp/wikibert | Provides pre-trained language models derived from Wikipedia texts for natural language processing tasks | 34 |
yunwentechnology/unilm | Provides pre-trained models for natural language understanding and generation tasks using the UniLM architecture | 438 |
shannonai/chinesebert | A deep learning model that incorporates visual and phonetic features of Chinese characters to improve its ability to understand Chinese language nuances | 542 |
cluebenchmark/electra | Trains and evaluates Chinese language models on a large corpus using ELECTRA-style replaced-token detection | 140 |
sww9370/rocbert | A pre-trained Chinese language model designed to be robust against maliciously crafted texts | 15 |
naver/biobert-pretrained | Provides pre-trained weights for a biomedical language representation model | 667 |
ieit-yuan/yuan2.0-m32 | A high-performance language model designed to excel in tasks like natural language understanding, mathematical computation, and code generation | 180 |
brightmart/xlnet_zh | Trains a large Chinese language model on massive data and provides a pre-trained model for downstream tasks | 230 |
cluebenchmark/cluepretrainedmodels | Provides pre-trained models for Chinese language tasks with improved performance and smaller model sizes compared to existing models | 804 |
zhuiyitechnology/pretrained-models | A collection of pre-trained language models for natural language processing tasks | 987 |
zhuiyitechnology/gau-alpha | An implementation of a Gated Attention Unit-based Transformer model for natural language processing tasks | 96 |
allenai/scibert | A BERT model trained on scientific text for natural language processing tasks | 1,521 |
skyworkaigc/skytext-chinese-gpt3 | An AI-powered text generation model trained on Chinese data for tasks such as conversation, translation, and content creation | 419 |