FreezeOut

Accelerate Neural Net Training by Progressively Freezing Layers

A technique that speeds up neural network training by progressively freezing layers (earliest first), dropping each layer out of the backward pass once its learning rate anneals to zero.

GitHub

211 stars
8 watching
31 forks
Language: Python
Last commit: over 6 years ago
Topics: deep-learning, densenet, machine-learning, memes, neural-networks, pytorch, vgg16, wide-residual-networks
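The core idea can be sketched in PyTorch: give each layer its own cosine learning-rate schedule that reaches zero at a layer-specific iteration, then mark that layer's parameters as frozen so they no longer participate in backpropagation. This is a minimal illustration on a toy MLP with a linear freeze schedule, not the repository's actual code (which trains DenseNets and Wide ResNets and offers several schedule variants).

```python
import math

import torch
import torch.nn as nn

# Toy network standing in for the DenseNet/WRN models the repo targets.
class ToyNet(nn.Module):
    def __init__(self, depth=3, width=8):
        super().__init__()
        self.layers = nn.ModuleList(nn.Linear(width, width) for _ in range(depth))
        self.head = nn.Linear(width, 2)

    def forward(self, x):
        for layer in self.layers:
            x = torch.relu(layer(x))
        return self.head(x)

def freezeout_lr(t, t_i, base_lr):
    """Per-layer cosine annealing: the rate reaches 0 at freeze iteration t_i."""
    return 0.0 if t >= t_i else 0.5 * base_lr * (1 + math.cos(math.pi * t / t_i))

model = ToyNet()
total_iters, base_lr = 100, 0.1
# Earlier layers stop training earlier; freeze points grow linearly with depth.
t_freeze = [total_iters * (i + 1) / (len(model.layers) + 1)
            for i in range(len(model.layers))]

# One optimizer parameter group per layer so each can have its own LR.
groups = [{"params": layer.parameters(), "lr": base_lr} for layer in model.layers]
groups.append({"params": model.head.parameters(), "lr": base_lr})
opt = torch.optim.SGD(groups, lr=base_lr)

x, y = torch.randn(16, 8), torch.randint(0, 2, (16,))
for t in range(total_iters):
    for i, layer in enumerate(model.layers):
        lr = freezeout_lr(t, t_freeze[i], base_lr)
        opt.param_groups[i]["lr"] = lr
        if lr == 0.0:
            # Frozen: exclude this layer's params from gradient computation.
            for p in layer.parameters():
                p.requires_grad_(False)
    opt.zero_grad()
    loss = nn.functional.cross_entropy(model(x), y)
    loss.backward()
    opt.step()
```

Once a layer's learning rate hits zero it contributes only forward computation, which is where the wall-clock savings come from.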

Related projects:

| Repository | Description | Stars |
| --- | --- | --- |
| ajbrock/smash | An experimental method for efficiently searching neural network architectures using a single training run | 489 |
| microsoft/archai | Automates the search for optimal neural network configurations in deep learning applications | 467 |
| alexbrillant/multi-layer-perceptron | An implementation of a multi-layer neural network in Python, allowing users to train and use the network for classification tasks | 5 |
| ajtulloch/dnngraph | A DSL and toolkit for designing and optimizing deep neural networks in Haskell | 700 |
| ahmedfgad/neuralgenetic | Tools and techniques for training neural networks using genetic algorithms | 240 |
| kimhc6028/forward-thinking-pytorch | An implementation of a novel neural network training method that builds and trains networks one layer at a time | 65 |
| zhanghang1989/pytorch-encoding | A Python framework for building deep learning models with optimized encoding layers and batch normalization | 2,041 |
| neuralmagic/sparseml | Enables the creation of smaller neural network models through efficient pruning and quantization techniques | 2,071 |
| loudinthecloud/dpwa | A distributed learning framework enabling peer-to-peer parameter averaging and asynchronous training of deep neural networks | 53 |
| tmllab/2021_neurips_pes | Improves deep neural network performance by selectively stopping training at different stages | 29 |
| jbarrow/lambdanet | An artificial neural network library for rapid prototyping and extension in Haskell | 377 |
| graal-research/poutyne | A PyTorch framework that simplifies neural network training with automated boilerplate code and callback utilities | 569 |
| ramon-oliveira/aorun | A deep learning framework on top of PyTorch for building neural networks | 61 |
| bamos/densenet.pytorch | A PyTorch implementation of the DenseNet architecture | 832 |
| jordanash/boostresnet | An implementation of a deep neural network architecture that uses boosting theory to improve performance | 5 |