BERT-pytorch

BERT model

An implementation of Google's 2018 BERT model in PyTorch, supporting both pre-training and fine-tuning for natural language processing tasks.

Google AI 2018 BERT PyTorch implementation
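
For orientation, below is a minimal sketch of BERT-style masked language model pre-training in plain PyTorch. It is illustrative only: the class and function names (TinyBertEncoder, mask_tokens) and all hyperparameters are assumptions made for this sketch, not the actual API of this repository.

```python
# Minimal sketch of BERT-style masked language model (MLM) pre-training in
# plain PyTorch. All names and hyperparameters here are assumptions for
# illustration, not the API of codertimo/BERT-pytorch.
import torch
import torch.nn as nn

class TinyBertEncoder(nn.Module):
    """A toy Transformer encoder standing in for the full BERT model."""
    def __init__(self, vocab_size=30000, hidden=256, layers=4, heads=4, max_len=128):
        super().__init__()
        self.tok = nn.Embedding(vocab_size, hidden)
        self.pos = nn.Embedding(max_len, hidden)
        layer = nn.TransformerEncoderLayer(d_model=hidden, nhead=heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=layers)
        self.mlm_head = nn.Linear(hidden, vocab_size)  # predicts masked token ids

    def forward(self, ids):
        positions = torch.arange(ids.size(1), device=ids.device).unsqueeze(0)
        x = self.tok(ids) + self.pos(positions)
        return self.mlm_head(self.encoder(x))

def mask_tokens(ids, mask_id, prob=0.15):
    """Replace ~15% of tokens with [MASK]; unmasked positions get label -100."""
    labels = ids.clone()
    mask = torch.rand(ids.shape) < prob
    labels[~mask] = -100                      # ignored by CrossEntropyLoss
    masked_ids = ids.clone()
    masked_ids[mask] = mask_id
    return masked_ids, labels

model = TinyBertEncoder()
opt = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss(ignore_index=-100)

batch = torch.randint(5, 30000, (8, 128))    # fake token ids for illustration
inputs, labels = mask_tokens(batch, mask_id=4)

opt.zero_grad()
logits = model(inputs)
loss = loss_fn(logits.reshape(-1, logits.size(-1)), labels.reshape(-1))
loss.backward()
opt.step()
```

In practice, the repository's own vocabulary, dataset, and trainer components would replace the toy encoder and random batch above, and the next-sentence-prediction objective from the original BERT paper would be trained alongside the MLM loss.
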

GitHub: 6k stars · 126 watching · 1k forks
Language: Python
Last commit: over 1 year ago
Tags: bert, language-model, nlp, pytorch, transformer

Related projects:

| Repository | Description | Stars |
| --- | --- | --- |
| brightmart/roberta_zh | Implements RoBERTa for Chinese pre-training using TensorFlow and provides PyTorch versions for loading and training | 2,638 |
| google-research/bert | Provides pre-trained models and code for natural language processing tasks using TensorFlow | 38,374 |
| lemonhu/ner-bert-pytorch | A PyTorch implementation of named entity recognition using Google AI's pre-trained BERT model for Chinese text data | 442 |
| tongjilibo/bert4torch | An implementation of transformer models in PyTorch for natural language processing tasks | 1,257 |
| jessevig/bertviz | An interactive tool for visualizing attention in Transformer language models | 7,019 |
| dbiir/uer-py | A toolkit for building pre-training models and fine-tuning them on downstream tasks in natural language processing | 3,018 |
| brightmart/text_classification | An NLP project offering various text classification models and techniques for deep learning exploration | 7,881 |
| dair-ai/ml-papers-explained | Explanations of key concepts and advancements in the field of machine learning | 7,352 |
| guillaume-be/rust-bert | Provides pre-trained Rust-native transformer-based models and pipelines for natural language processing tasks | 2,694 |
| labmlai/annotated_deep_learning_paper_implementations | Implementations of various deep learning algorithms and techniques with accompanying documentation | 57,177 |
| huggingface/transformers | A collection of pre-trained machine learning models for natural language and computer vision tasks, enabling developers to fine-tune and deploy them in their own projects | 136,357 |
| german-nlp-group/german-transformer-training | Trains German transformer models to improve language understanding | 23 |
| google-research/text-to-text-transfer-transformer | Provides tools and libraries for training and fine-tuning large language models using transformer architectures | 6,215 |
| ukplab/sentence-transformers | Provides dense vector representations for text using transformer networks | 15,556 |
| adapter-hub/adapters | A unified library for parameter-efficient and modular transfer learning in NLP tasks | 2,600 |