BERT-CCPoem
Poetry model
A BERT-based pre-trained model for Chinese classical poetry
BERT-CCPoem is a BERT-based pre-trained model designed specifically for Chinese classical poetry
145 stars
3 watching
19 forks
Language: Python
Last commit: over 2 years ago
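Since BERT-CCPoem is a BERT-style encoder, one plausible use is extracting sentence-level vectors for lines of classical poetry. The snippet below is a minimal sketch, assuming the released weights follow the standard Hugging Face BERT checkpoint format and have been unpacked to a local directory; the path `./BERT_CCPoem_v1` and the pooling choice are assumptions for illustration, not part of the project's documented API.

```python
# Minimal sketch: line-level embeddings from a BERT-CCPoem-style checkpoint,
# assuming it is a standard Hugging Face BERT checkpoint in a local directory.
import torch
from transformers import BertTokenizer, BertModel

MODEL_DIR = "./BERT_CCPoem_v1"  # hypothetical local path to the downloaded checkpoint

tokenizer = BertTokenizer.from_pretrained(MODEL_DIR)
model = BertModel.from_pretrained(MODEL_DIR)
model.eval()

# Encode one line of classical poetry and mean-pool the last hidden layer
# to obtain a fixed-size vector for the whole line.
inputs = tokenizer("一行白鹭上青天", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)
line_vec = outputs.last_hidden_state.mean(dim=1)  # shape: (1, hidden_size)
print(line_vec.shape)
```

Mean-pooling the token embeddings is only one simple pooling choice; the [CLS] vector or a task-specific head could be used instead, depending on the downstream task.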
Related projects:
Repository | Description | Stars |
---|---|---|
zhuiyitechnology/wobert | A pre-trained Chinese language model that operates on word-level embeddings rather than individual characters | 458 |
tchayintr/thbert | A pre-trained BERT model designed to facilitate NLP research and development with limited Thai language resources | 6 |
cluebenchmark/cluepretrainedmodels | Provides pre-trained models for Chinese language tasks with improved performance and smaller model sizes compared to existing models | 804 |
turkunlp/wikibert | Provides pre-trained language models derived from Wikipedia texts for natural language processing tasks | 34 |
brightmart/xlnet_zh | Trains a large Chinese language model on massive data and provides a pre-trained model for downstream tasks | 230 |
langboat/mengzi | Develops lightweight yet powerful pre-trained models for natural language processing tasks | 534 |
allenai/scibert | A BERT model trained on scientific text for natural language processing tasks | 1,521 |
ymcui/pert | Develops a pre-trained language model to learn semantic knowledge from permuted text without mask labels | 354 |
ymcui/chinese-electra | Provides pre-trained Chinese language models based on the ELECTRA framework for natural language processing tasks | 1,403 |
ethan-yt/guwenbert | A pre-trained language model for classical Chinese based on RoBERTa and ancient literature | 506 |
thunlp/openclap | A repository of pre-trained language models for natural language processing tasks in Chinese | 979 |
kldarek/polbert | A Polish BERT-based language model trained on various corpora for natural language processing tasks | 70 |
clue-ai/promptclue | A pre-trained language model for multiple natural language processing tasks with support for few-shot learning and transfer learning | 654 |
ymcui/macbert | Improves pre-trained Chinese language models by incorporating a correction task to alleviate inconsistency issues with downstream tasks | 645 |
siat-nlp/hanfei | Develops and trains a large-scale legal-domain language model for legal question answering and text generation | 98 |