torchdistill

Experiment builder

A framework for designing and running deep learning experiments without writing code

A coding-free framework built on PyTorch for reproducible deep learning studies. 🏆 25 knowledge distillation methods presented at venues such as CVPR, ICLR, ECCV, NeurIPS, and ICCV are implemented so far. 🎁 Trained models, training logs, and configurations are available to ensure reproducibility and support benchmarking.
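For context, the sketch below shows the basic knowledge distillation objective (soft-target KL divergence combined with hard-label cross-entropy) that frameworks like torchdistill generalize behind configuration-driven trainers. It is written in plain PyTorch and does not use torchdistill's own API; the toy models, temperature, and loss weighting are illustrative assumptions.

# Minimal knowledge distillation sketch in plain PyTorch.
# NOT torchdistill's API; it only illustrates the soft-target distillation
# objective that the library's configurable trainers generalize.
import torch
import torch.nn as nn
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, targets,
                      temperature=4.0, alpha=0.5):
    # Weighted sum of soft-target KL divergence and hard-label cross-entropy.
    soft = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=1),
        F.softmax(teacher_logits / temperature, dim=1),
        reduction="batchmean",
    ) * (temperature ** 2)
    hard = F.cross_entropy(student_logits, targets)
    return alpha * soft + (1.0 - alpha) * hard

# Toy teacher/student pair on random CIFAR-10-sized inputs (illustrative only).
teacher = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 256), nn.ReLU(), nn.Linear(256, 10))
student = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 64), nn.ReLU(), nn.Linear(64, 10))
optimizer = torch.optim.SGD(student.parameters(), lr=0.01, momentum=0.9)

images = torch.randn(8, 3, 32, 32)
labels = torch.randint(0, 10, (8,))

teacher.eval()
with torch.no_grad():
    teacher_logits = teacher(images)  # teacher provides soft targets

student_logits = student(images)
loss = distillation_loss(student_logits, teacher_logits, labels)
optimizer.zero_grad()
loss.backward()
optimizer.step()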

GitHub

1k stars
19 watching
131 forks
Language: Python
last commit: about 1 month ago
Linked from 1 awesome list

amazon-sagemaker-lab, cifar10, cifar100, coco, colab-notebook, glue, google-colab, image-classification, imagenet, knowledge-distillation, natural-language-processing, nlp, object-detection, pascal-voc, pytorch, semantic-segmentation, transformer

Related projects:

Repository Description Stars
dmmiller612/sparktorch A PyTorch implementation on Apache Spark for distributed deep learning model training and inference. 339
ne7ermore/torch-light A comprehensive PyTorch-based deep learning repository with examples and implementations of various models and techniques. 535
torch/cutorch Provides a CUDA backend for the Torch deep learning framework 336
ramon-oliveira/aorun A deep learning framework on top of PyTorch for building neural networks. 61
pytorchbearer/torchbearer A PyTorch model fitting library designed to simplify the process of training deep learning models. 636
ucsbarchlab/pyrtl A Pythonic framework for designing and simulating digital circuits 257
zhanghang1989/pytorch-encoding A Python framework for building deep learning models with optimized encoding layers and batch normalization. 2,041
nearai/torchfold A PyTorch module for dynamic batching and optimized computation on deep neural networks 221
tlozoot/experigen A framework for creating linguistic experiments using HTML, CSS, and JavaScript. 34
open-mmlab/mmengine Provides a flexible and configurable framework for training deep learning models with PyTorch. 1,179
probtorch/probtorch A deep learning framework for probabilistic models that extends PyTorch with support for reparameterized distributions and Monte Carlo estimators. 887
eduardoleao052/js-pytorch A JavaScript library that provides GPU-accelerated deep learning capabilities with automatic differentiation and neural network layers. 1,084
autodistill/autodistill Automatically trains models from large foundation models to perform specific tasks with minimal human intervention. 1,983
google-deepmind/jaxline Provides a Python-based framework for building distributed JAX training and evaluation experiments 152
jwyang/jule.torch Implementations of joint unsupervised learning algorithm for deep representations and image clusters using Torch 288