packnet
Task Network Pruning Framework
A deep learning framework for adding multiple tasks to a single network through iterative pruning, evaluated on various datasets.
Code for PackNet: Adding Multiple Tasks to a Single Network by Iterative Pruning
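The core step in PackNet is magnitude-based pruning: after training on a task, the smallest-magnitude weights among those still free are zeroed out and released for the next task, while the survivors are frozen. A minimal NumPy sketch of that step (the function name and interface are illustrative, not taken from this repository):

```python
import numpy as np

def prune_by_magnitude(weight, free_mask, prune_frac=0.5):
    """Zero out the smallest `prune_frac` fraction (by absolute value)
    of the weights currently free for this task. Zeroed weights are
    released for the next task; the returned mask marks the weights
    kept (and subsequently frozen) for the current task."""
    free_vals = np.abs(weight[free_mask])
    k = int(prune_frac * free_vals.size)
    if k == 0:
        return free_mask
    # k-th smallest free magnitude serves as the pruning threshold
    threshold = np.partition(free_vals, k - 1)[k - 1]
    pruned = free_mask & (np.abs(weight) <= threshold)
    weight[pruned] = 0.0
    return free_mask & ~pruned
```

For example, pruning half of a four-weight layer keeps only the two largest-magnitude entries; repeating the train-prune-freeze cycle packs each new task into the weights the previous tasks released.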
230 stars
7 watching
40 forks
Language: Python
last commit: over 6 years ago

Related projects:
| Repository | Description | Stars |
| --- | --- | --- |
| | Adapting a single network to multiple tasks by learning to mask weights | 183 |
| | A framework for unsupervised network embedding using a multi-task Siamese neural network | 45 |
| | A deep learning project implementing structured pruning algorithms in PyTorch for efficient neural network training and inference | 112 |
| | Enables the creation of smaller neural network models through efficient pruning and quantization techniques | 2,083 |
| | Reprograms pre-trained neural networks to work on new tasks by fine-tuning them on smaller datasets | 33 |
| | A deep learning framework that enables efficient and flexible distributed/mobile deep learning with dynamic dataflow dependency scheduling | 28 |
| | An implementation of neural network components and optimization methods for text analysis, including rationales for neural predictions | 355 |
| | A deep learning framework for semantic segmentation using pre-trained classification networks and heterogeneous annotations | 74 |
| | Re-implements the sparse structure selection algorithm for deep neural networks in a modified MXNet framework | 87 |
| | A PyTorch implementation of pruning techniques to reduce the computational resources required for neural network inference | 877 |
| | Automates the search for optimal neural network configurations in deep learning applications | 468 |
| | A Ruby interface to the MXNet deep learning framework | 48 |
| | A collection of pre-trained neural network models with simple interfaces for easy integration into machine learning workflows | 1,004 |
| | A framework for decentralized multi-task learning of graph neural networks on molecular data with guaranteed convergence | 44 |
| | A tool for training neural networks with pruned weights and evaluating their performance | 140 |