torchdistill
Experiment builder
A framework for designing and running deep learning experiments without writing code
A coding-free framework built on PyTorch for reproducible deep learning studies. 🏆 25 knowledge distillation methods presented at CVPR, ICLR, ECCV, NeurIPS, ICCV, etc. are implemented so far. 🎁 Trained models, training logs, and configurations are available to ensure reproducibility and enable benchmarking.
1k stars
20 watching
131 forks
Language: Python
last commit: 2 months ago
Linked from 1 awesome list
Topics: amazon-sagemaker-lab, cifar10, cifar100, coco, colab-notebook, glue, google-colab, image-classification, imagenet, knowledge-distillation, natural-language-processing, nlp, object-detection, pascal-voc, pytorch, semantic-segmentation, transformer
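torchdistill itself is driven by declarative YAML configurations rather than hand-written training code, so its API is not shown here. As background for the knowledge-distillation methods it implements, the classic distillation objective (Hinton et al., 2015) trains a student to match a teacher's temperature-softened output distribution. The following is a minimal, stdlib-only sketch of that loss, not torchdistill's implementation; all function names are illustrative:

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax; a higher temperature softens the distribution."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=4.0):
    """KL(teacher || student) over softened distributions, scaled by T^2
    so gradients stay comparable across temperatures."""
    p = softmax(teacher_logits, temperature)  # soft teacher targets
    q = softmax(student_logits, temperature)  # soft student predictions
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return temperature ** 2 * kl

# A student that matches the teacher exactly incurs zero distillation loss.
print(distillation_loss([2.0, 1.0, 0.1], [2.0, 1.0, 0.1]))  # 0.0
```

In practice this soft-target term is combined with the usual cross-entropy on ground-truth labels; torchdistill's configurations express such composite losses (and the 25 implemented method variants) in YAML.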
Related projects:
| Repository | Description | Stars |
|---|---|---|
| | A PyTorch implementation on Apache Spark for distributed deep learning model training and inference. | 339 |
| | A comprehensive collection of deep learning examples using PyTorch, covering various applications and models. | 537 |
| | Provides a CUDA backend for the PyTorch deep learning framework. | 337 |
| | A deep learning framework on top of PyTorch for building neural networks. | 61 |
| | A PyTorch model fitting library designed to simplify the process of training deep learning models. | 636 |
| | A Pythonic framework for designing and simulating digital circuits. | 261 |
| | A Python framework for building deep learning models with optimized encoding layers and batch normalization. | 2,044 |
| | A PyTorch module for dynamic batching and optimized computation on deep neural networks. | 221 |
| | A framework for creating linguistic experiments using HTML, CSS, and JavaScript. | 35 |
| | Provides a flexible and configurable framework for training deep learning models with PyTorch. | 1,196 |
| | A deep learning framework for probabilistic models that extends PyTorch with support for reparameterized distributions and Monte Carlo estimators. | 887 |
| | A JavaScript library that provides GPU-accelerated deep learning capabilities with automatic differentiation and neural network layers. | 1,093 |
| | Automatically trains models from large foundation models to perform specific tasks with minimal human intervention. | 2,022 |
| | Provides a Python-based framework for building distributed JAX training and evaluation experiments. | 153 |
| | Implementations of a joint unsupervised learning algorithm for deep representations and image clusters using Torch. | 289 |