BERT-pytorch
An implementation of Google AI's 2018 BERT model in PyTorch, supporting pre-training and fine-tuning for natural language processing tasks.
6k stars
126 watching
1k forks
Language: Python
Last commit: about 2 years ago
Topics: bert, language-model, nlp, pytorch, transformer
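BERT pre-training relies on a masked-language-modeling objective: roughly 15% of input tokens are selected, and of those, 80% are replaced with a `[MASK]` token, 10% with a random token, and 10% left unchanged, with the model trained to recover the originals. The sketch below illustrates that corruption step in plain Python; it is a simplified illustration of the standard BERT recipe, not this repository's actual API (the function and parameter names here are assumptions for the example).

```python
import random

def mask_tokens(tokens, vocab, mask_token="[MASK]", mlm_prob=0.15, seed=0):
    """BERT-style masked-LM corruption (illustrative sketch).

    Selects ~`mlm_prob` of positions; of those, 80% become `mask_token`,
    10% become a random vocab token, 10% stay unchanged. Returns the
    corrupted inputs and a parallel label list where non-None entries
    mark positions the model must predict.
    """
    rng = random.Random(seed)  # seeded for reproducibility in this sketch
    inputs = list(tokens)
    labels = [None] * len(tokens)
    for i, tok in enumerate(tokens):
        if rng.random() < mlm_prob:
            labels[i] = tok  # the model should recover the original token here
            r = rng.random()
            if r < 0.8:
                inputs[i] = mask_token        # 80%: replace with [MASK]
            elif r < 0.9:
                inputs[i] = rng.choice(vocab)  # 10%: replace with random token
            # remaining 10%: keep the original token
    return inputs, labels

sentence = ["the", "quick", "brown", "fox", "jumps", "over", "the", "lazy", "dog"]
inputs, labels = mask_tokens(sentence, vocab=sentence, seed=42)
```

In a full pipeline, the corrupted `inputs` would be converted to ids and fed to the transformer, and the loss would be computed only at positions where `labels` is non-None.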
Related projects:
| Repository | Description | Stars |
|---|---|---|
| | Implements RoBERTa for Chinese pre-training using TensorFlow and provides PyTorch versions for loading and training | 2,638 |
| | Provides pre-trained models and code for natural language processing tasks using TensorFlow | 38,374 |
| | A PyTorch implementation of named entity recognition using Google AI's pre-trained BERT model for Chinese text data | 442 |
| | An implementation of transformer models in PyTorch for natural language processing tasks | 1,257 |
| | An interactive tool for visualizing attention in Transformer language models | 7,019 |
| | A toolkit for building pre-training models and fine-tuning them on downstream tasks in natural language processing | 3,018 |
| | An NLP project offering various text classification models and techniques for deep learning exploration | 7,881 |
| | An explanation of key concepts and advancements in the field of Machine Learning | 7,352 |
| | Provides pre-trained Rust-native transformer-based models and pipelines for natural language processing tasks | 2,694 |
| | Implementations of various deep learning algorithms and techniques with accompanying documentation | 57,177 |
| | A collection of pre-trained machine learning models for various natural language and computer vision tasks, enabling developers to fine-tune and deploy these models in their own projects | 136,357 |
| | Trains German transformer models to improve language understanding | 23 |
| | Provides tools and libraries for training and fine-tuning large language models using transformer architectures | 6,215 |
| | Provides dense vector representations for text using transformer networks | 15,556 |
| | A unified library for parameter-efficient and modular transfer learning in NLP tasks | 2,600 |