RepDistiller
A Python project implementing Contrastive Representation Distillation (CRD, ICLR 2020) and benchmarking recent knowledge distillation methods.
2k stars
17 watching
400 forks
Language: Python
last commit: about 1 year ago

Related projects:
| Repository | Description | Stars |
|---|---|---|
| elephantmipt/bert-distillation | A high-level API for distilling BERT models into smaller, more efficient variants with reduced training time and faster inference | 75 |
| autodistill/autodistill | Automatically trains models from large foundation models to perform specific tasks with minimal human intervention | 1,997 |
| yoshitomo-matsubara/torchdistill | A framework for designing and running deep learning experiments without writing code | 1,392 |
| jquesnelle/txt2imghd | Creates detailed images by upscaling and refining Stable Diffusion outputs | 694 |
| ag14774/diffdist | Enables backpropagation in distributed settings and facilitates model parallelism using differentiable communication between processes | 61 |
| pylons/colander | A library for serializing and deserializing data structures into strings, mappings, and lists while performing validation | 451 |
| haozhaowang/dafkd2023 | A framework for achieving domain-aware knowledge distillation in federated learning environments | 26 |
| ambitioninc/kmatch | A language for validating and filtering Python dictionaries | 48 |
| imodpasteur/lutorpy | A Python library that enables seamless interaction between deep learning frameworks and Lua/Torch libraries | 233 |
| asonge/loom | A collection of composable and extensible conflict-free data types designed to track causality for modifications | 224 |
| sharplispers/ironclad | A cryptographic toolkit written in Common Lisp | 175 |
| nkohari/kseq | An implementation of a simple CRDT that represents an ordered sequence of items, designed to handle concurrent modifications in collaborative editing systems | 57 |
| accenture/ampligraph | A Python library for training models on knowledge graphs to predict links between concepts | 2,157 |
| jay15summer/two-stage-tradaboost.r2 | An implementation of a boosting-based transfer learning algorithm for regression tasks | 44 |
| rentruewang/koila | A lightweight wrapper around PyTorch to prevent CUDA out-of-memory errors and optimize model execution | 1,821 |
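For context on what these distillation projects benchmark: the classic baseline they all compare against is the soft-target knowledge distillation loss of Hinton et al. (2015), where the student matches the teacher's temperature-softened output distribution. The sketch below is a minimal plain-Python illustration of that loss, not code from RepDistiller or any repository listed above; function names and the temperature value are illustrative choices.

```python
import math

def softmax(logits, T=1.0):
    """Temperature-scaled softmax over a list of raw logits."""
    m = max(x / T for x in logits)  # subtract max for numerical stability
    exps = [math.exp(x / T - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def kd_soft_loss(student_logits, teacher_logits, T=4.0):
    """Soft-target KD loss: KL(teacher || student) on distributions
    softened with temperature T, scaled by T**2 so gradient magnitudes
    stay comparable across temperatures (Hinton et al., 2015)."""
    p = softmax(teacher_logits, T)   # teacher's softened distribution
    q = softmax(student_logits, T)   # student's softened distribution
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return (T ** 2) * kl

# A student that matches the teacher exactly incurs zero loss;
# any mismatch yields a positive penalty.
print(kd_soft_loss([1.0, 2.0, 3.0], [1.0, 2.0, 3.0]))  # ~0.0
print(kd_soft_loss([5.0, 0.0, 0.0], [0.0, 0.0, 5.0]))  # > 0
```

In practice this term is combined with the ordinary cross-entropy on ground-truth labels; methods such as CRD replace or augment the logit-matching term with losses on intermediate representations.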