bert-distillation
BERT distiller
A high-level API for distilling BERT models into smaller, more efficient variants with shorter training time and faster inference.
Distillation of BERT models with the Catalyst framework (a minimal sketch of the underlying distillation loss is shown below)
75 stars
4 watching
7 forks
Language: Python
last commit: over 1 year ago
Topics: bert, catalyst, distilbert, distillation, nlp, rubert
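The page does not document the project's training loop, but the core of BERT distillation is a loss that mixes the teacher's softened predictions with the ground-truth labels. The sketch below shows that loss in plain PyTorch as a rough illustration; the `distillation_loss` helper, its `temperature`/`alpha` defaults, and the toy tensors are assumptions for this example, not this project's actual API (the project itself runs training through Catalyst).

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=2.0, alpha=0.5):
    """Blend soft-target KL divergence with hard-label cross-entropy.

    `temperature` and `alpha` are illustrative defaults, not values
    taken from this repository.
    """
    # Soften both distributions, then push the student toward the teacher.
    soft_loss = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature**2  # rescale so gradients match the unsoftened scale
    # Standard supervised loss against the ground-truth labels.
    hard_loss = F.cross_entropy(student_logits, labels)
    return alpha * soft_loss + (1 - alpha) * hard_loss

# Toy batch: 4 examples, 2 classes. In practice the logits would come from
# a frozen BERT teacher and a trainable DistilBERT-style student.
teacher_logits = torch.randn(4, 2)                      # teacher is frozen
student_logits = torch.randn(4, 2, requires_grad=True)  # student is trained
labels = torch.randint(0, 2, (4,))
loss = distillation_loss(student_logits, teacher_logits, labels)
loss.backward()
```

The `temperature**2` factor is the usual correction for the gradient shrinkage that temperature scaling introduces, so the soft and hard terms stay on comparable scales.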
Related projects:
| Repository | Description | Stars |
|---|---|---|
| | A Python-based project implementing contrastive representation distillation and benchmarking recent knowledge distillation methods | 2,217 |
| | Automatically trains small task-specific models from large foundation models with minimal human intervention | 2,022 |
| | Provides pre-trained language models for natural language processing tasks | 155 |
| | A BERT-based pre-trained model for Chinese classical poetry | 146 |
| | A word-based Chinese BERT model trained on large-scale text data, using pre-trained models as a foundation | 460 |
| | Provides pre-trained language models derived from Wikipedia texts for natural language processing tasks | 34 |
| | A framework for designing and running deep learning experiments without writing code | 1,409 |
| | An implementation of a heterogeneous federated learning framework using model distillation | 150 |
| | Automated sorting of data using a neural network model | 9 |
| | Pre-trained language models for biomedical natural language processing tasks | 560 |
| | An implementation of named entity recognition on the CoNLL-2003 dataset using Google's BERT model, in Python | 1,220 |
| | A deep metric learning framework using an adversarial auxiliary loss to improve robustness | 39 |
| | Enables backpropagation in distributed settings and facilitates model parallelism using differentiable communication between processes | 62 |
| | An implementation of BERT-like NLP models in OCaml using PyTorch bindings and pre-trained weights from popular sources | 24 |
| | A toolkit for creating and manipulating state-of-the-art diffusion models in PyTorch | 8 |