guwenbert
Pre-trained language model for classical Chinese texts using RoBERTa architecture
GuwenBERT: A Pre-trained Language Model for Classical Chinese (Literary Chinese)
511 stars
6 watching
38 forks
last commit: over 3 years ago
Topics: bert, classical-chinese, guwenbert, literary-chinese, transformers
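Since the project ships a RoBERTa-style masked language model usable through the transformers library, a typical first step is masked-token prediction on a Classical Chinese sentence. This is a minimal sketch; the checkpoint ID `ethanyt/guwenbert-base` is an assumption based on the project's Hugging Face release and may need adjusting.

```python
from transformers import pipeline

# Assumed Hugging Face Hub checkpoint for this repository;
# change MODEL_ID if the README points to a different release.
MODEL_ID = "ethanyt/guwenbert-base"

# fill-mask pipeline: predicts the most likely token at the masked position
fill = pipeline("fill-mask", model=MODEL_ID)

# Classical Chinese sentence with one character masked
# (use the tokenizer's own mask token rather than hardcoding "[MASK]")
text = f"子曰:学而时习之,不亦{fill.tokenizer.mask_token}乎?"

for pred in fill(text, top_k=3):
    print(pred["token_str"], round(pred["score"], 3))
```

Each prediction is a dict with the candidate token (`token_str`) and its probability (`score`); `top_k` controls how many candidates are returned.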
Related projects:
| Repository | Description | Stars |
|---|---|---|
| zhuiyitechnology/wobert | A word-based Chinese BERT model trained on large-scale text data using pre-trained models as a foundation | 460 |
| ymcui/macbert | Improves pre-trained Chinese language models by incorporating a correction task to alleviate inconsistency issues with downstream tasks | 646 |
| turkunlp/wikibert | Provides pre-trained language models derived from Wikipedia texts for natural language processing tasks | 34 |
| yunwentechnology/unilm | Provides pre-trained models and tools for Chinese natural language understanding (NLU) and generation (NLG) tasks | 439 |
| shannonai/chinesebert | A deep learning model that incorporates visual and phonetic features of Chinese characters to better capture Chinese language nuances | 545 |
| cluebenchmark/electra | Trains and evaluates a Chinese ELECTRA language model on a large corpus | 140 |
| sww9370/rocbert | A pre-trained Chinese language model designed to be robust against maliciously crafted texts | 15 |
| naver/biobert-pretrained | Provides pre-trained weights for a biomedical language representation model | 672 |
| ieit-yuan/yuan2.0-m32 | A high-performance language model designed to excel in tasks like natural language understanding, mathematical computation, and code generation | 182 |
| brightmart/xlnet_zh | Trains a large Chinese language model on massive data and provides a pre-trained model for downstream tasks | 230 |
| cluebenchmark/cluepretrainedmodels | Provides pre-trained models for Chinese language tasks with improved performance and smaller model sizes than existing models | 806 |
| zhuiyitechnology/pretrained-models | A collection of pre-trained language models for natural language processing tasks | 989 |
| zhuiyitechnology/gau-alpha | An implementation of a transformer-based NLP model built on gated attention units | 98 |
| allenai/scibert | A BERT model trained on scientific text for natural language processing tasks | 1,532 |
| skyworkaigc/skytext-chinese-gpt3 | An AI text generation model trained on Chinese data for tasks such as conversation, translation, and content creation | 418 |