edu-bert
Education NLP model
A pre-trained language model designed to improve natural language processing tasks in education
TAL (好未来) has open-sourced TAL-EduBERT, the first Chinese pre-trained model for online teaching in the education domain.
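Like other Chinese BERT variants, a model of this kind operates on character-level tokens rather than whitespace-delimited words. The following is a minimal sketch of that tokenization step, assuming standard BERT conventions (`[CLS]`/`[SEP]` markers, one token per Chinese character); the function name `tokenize_chinese` is illustrative and not taken from the TAL-EduBERT codebase.

```python
def tokenize_chinese(text):
    """Split Chinese text into BERT-style tokens: one token per
    character, wrapped in the [CLS]/[SEP] special tokens."""
    tokens = ["[CLS]"]
    for ch in text:
        if not ch.isspace():  # drop whitespace, keep every character
            tokens.append(ch)
    tokens.append("[SEP]")
    return tokens

print(tokenize_chinese("在线 教学"))
# → ['[CLS]', '在', '线', '教', '学', '[SEP]']
```

In practice this step is handled by the model's bundled tokenizer (e.g. a BERT WordPiece vocabulary), which additionally maps each token to a vocabulary id before inference.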
186 stars
32 watching
41 forks
Language: Python
last commit: about 4 years ago

Related projects:
| Repository | Description | Stars |
| --- | --- | --- |
| | A BERT model trained on scientific text for natural language processing tasks | 1,532 |
| | A collection of pre-trained natural language processing models | 170 |
| | Source files and training scripts for language models | 12 |
| | Provides pre-trained language models for natural language processing tasks | 155 |
| | Trains German transformer models to improve language understanding | 23 |
| | Provides pre-trained language models derived from Wikipedia texts for natural language processing tasks | 34 |
| | A large language model designed for research and application in natural language processing tasks | 887 |
| | A collection of pre-trained language models for natural language processing tasks | 989 |
| | A language model trained on Danish Wikipedia data for named entity recognition and masked language modeling | 9 |
| | Develops a pre-trained language model to learn semantic knowledge from permuted text without mask labels | 356 |
| | A pre-trained language model designed to leverage linguistic features and outperform comparable baselines on Chinese natural language understanding tasks | 202 |
| | A pre-trained language model for multiple natural language processing tasks with support for few-shot learning and transfer learning | 656 |
| | A collection of natural language processing models and tools for collaboration on a joint project between BAAI and JDAI | 254 |
| | Trains a large Chinese language model on massive data and provides a pre-trained model for downstream tasks | 230 |
| | Improves pre-trained language models by encouraging an isotropic and discriminative distribution of token representations | 92 |