roberta_zh
Chinese Pre-Trained Model
Implements RoBERTa pre-training for Chinese using TensorFlow, and provides PyTorch-compatible versions of the models for loading and fine-tuning
RoBERTa Chinese pre-trained model: RoBERTa for Chinese
3k stars
52 watching
409 forks
Language: Python
Last commit: 4 months ago
Topics: bert, chinese, gpt2, pre-trained, pre-trained-language-models, roberta
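Although the repository is named after RoBERTa, its checkpoints keep the original BERT architecture, so in the Hugging Face Transformers library they are typically loaded with the BERT classes rather than the RoBERTa ones. A minimal sketch, assuming Transformers is installed; the checkpoint path is a placeholder assumption, and the model below is randomly initialised:

```python
# Sketch: loading a roberta_zh-style model via Hugging Face Transformers.
# roberta_zh keeps the BERT architecture, so BERT classes are used.
from transformers import BertConfig, BertForMaskedLM

# 21128 is the WordPiece vocabulary size used by the published
# Chinese BERT/RoBERTa vocab files.
config = BertConfig(vocab_size=21128)
model = BertForMaskedLM(config)  # randomly initialised weights

# With converted weights on disk, load them instead (placeholder path):
# model = BertForMaskedLM.from_pretrained("path/to/roberta_zh")
print(model.config.vocab_size)  # 21128
```

This is only a loading sketch, not the repository's training pipeline; the TensorFlow pre-training code lives in the repo itself.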
Related projects:
| Repository | Description | Stars |
| --- | --- | --- |
| codertimo/bert-pytorch | An implementation of Google's 2018 BERT model in PyTorch, supporting pre-training and fine-tuning for natural language processing tasks | 6,222 |
| google-research/bert | Provides pre-trained models and code for natural language processing tasks using TensorFlow | 38,204 |
| ermlab/politbert | Trains a RoBERTa-architecture language model on high-quality Polish text data | 33 |
| lemonhu/ner-bert-pytorch | A PyTorch implementation of named entity recognition using Google AI's pre-trained BERT model on Chinese text data | 438 |
| guillaume-be/rust-bert | Provides pre-trained Rust-native transformer-based models and pipelines for natural language processing tasks | 2,651 |
| brightmart/text_classification | An NLP project offering various text classification models and techniques for deep learning exploration | 7,861 |
| namisan/mt-dnn | A PyTorch package implementing multi-task deep neural networks for natural language understanding | 2,238 |
| german-nlp-group/german-transformer-training | Trains German transformer models to improve language understanding | 23 |
| ncbi-nlp/bluebert | Pre-trained language models for biomedical natural language processing tasks | 558 |
| jessevig/bertviz | An interactive tool for visualizing attention in Transformer language models | 6,946 |
| l0sg/relational-rnn-pytorch | An implementation of DeepMind's Relational Recurrent Neural Networks (Santoro et al., 2018) in PyTorch for word-level language modeling | 244 |
| huawei-noah/pretrained-language-model | A collection of pre-trained language models and optimization techniques for efficient natural language processing | 3,028 |
| tongjilibo/bert4torch | An implementation of transformer models in PyTorch for natural language processing tasks | 1,241 |
| nvidia/fastertransformer | A high-performance transformer-based NLP component optimized for GPU acceleration and integration into various frameworks | 5,886 |
| backprop-ai/backprop | A Python library providing pre-trained models and tools for fine-tuning and deploying natural language processing models | 243 |