meProp

Gradient optimization method

A technique that accelerates backpropagation in neural networks by computing only the top-k most relevant gradient components and skipping the rest

meProp: Sparsified Back Propagation for Accelerated Deep Learning (ICML 2017)
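Below is a minimal NumPy sketch of the top-k idea described above: in the backward pass of a fully connected layer, only the k largest-magnitude components of the output gradient are kept, so the weight and input gradients touch just the corresponding columns. This is an illustrative Python reimplementation under simplifying assumptions (single example, single linear layer), not the repository's C# code; the function names `linear_forward` and `meprop_backward` are invented for the example.

```python
import numpy as np

def linear_forward(x, W):
    # Forward pass of a fully connected layer: y = x @ W.
    return x @ W

def meprop_backward(x, W, grad_y, k):
    # Sparsified backward pass in the spirit of meProp (illustrative sketch,
    # not the repository's implementation): keep only the k entries of grad_y
    # with the largest magnitude and treat the rest as zero, so gradients are
    # computed for just k columns of W.
    top = np.argpartition(np.abs(grad_y), -k)[-k:]

    # Weight gradient: outer product restricted to the selected columns.
    grad_W = np.zeros_like(W)
    grad_W[:, top] = np.outer(x, grad_y[top])

    # Input gradient: uses only the selected columns of W.
    grad_x = W[:, top] @ grad_y[top]
    return grad_x, grad_W

# Tiny usage example with random data (shapes chosen arbitrarily).
rng = np.random.default_rng(0)
x = rng.standard_normal(8)        # input vector
W = rng.standard_normal((8, 16))  # weight matrix
y = linear_forward(x, W)          # forward pass
grad_y = rng.standard_normal(16)  # gradient arriving from the layer above
grad_x, grad_W = meprop_backward(x, W, grad_y, k=4)
```

With k much smaller than the output dimension, most columns of the weight gradient are never computed, which is the source of the speedup reported in the paper.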

GitHub

110 stars
12 watching
20 forks
Language: C#
Last commit: over 2 years ago
Topics: back-propagation, computer-vision, deep-learning, meprop, natural-language-processing, neural-network

Related projects:

Repository | Description | Stars
mstksg/backprop | A Haskell library providing automatic heterogeneous back-propagation for differentiable programming and deep learning applications | 181
neuralmagic/sparseml | Enables the creation of smaller neural network models through efficient pruning and quantization techniques | 2,071
cn-upb/deepcomp | A reinforcement learning-based system for optimizing multi-cell selection in wireless networks | 58
ml-postech/gradient-inversion-generative-image-prior | An implementation of a method to invert gradients in federated learning to potentially reveal sensitive client data | 39
xternalz/sdpoint | A deep learning method for optimizing convolutional neural networks by reducing computational cost while improving regularization and inference efficiency | 18
google-deepmind/optax | A gradient processing and optimization library that provides building blocks for custom optimizers and gradient processing components in machine learning research | 1,697
deng-cy/deep_learning_topology_opt | A toolkit for using machine learning to optimize complex geometries in simulations | 107
rentruewang/koila | A lightweight wrapper around PyTorch to prevent CUDA out-of-memory errors and optimize model execution | 1,821
saschagrunert/nn | A small neural network implementation of the backpropagation algorithm in Haskell | 127
zou-group/textgrad | An autograd engine that uses large language models to backpropagate textual gradients | 1,821
intel/neural-compressor | Tools and techniques for optimizing large language models on various frameworks and hardware platforms | 2,226
delta2323/gb-gnn | Analyzes and optimizes the performance of graph neural networks using gradient boosting and various aggregation models | 13
pouyamghari/pof-mkl | An implementation of an online federated learning algorithm with multiple kernels for personalized machine learning | 0
harshakokel/kigb | An open-source software framework that integrates human advice into gradient boosting decision trees for improved performance in machine learning tasks | 8
lanl-ansi/watermodels.jl | A Julia package for solving optimization problems in water distribution networks | 73