inplace_abn
Memory optimizer
An optimization technique to reduce memory usage in deep neural networks during training (see the usage sketch below)
In-Place Activated BatchNorm for Memory-Optimized Training of DNNs
1k stars
39 watching
187 forks
Language: Python
last commit: 6 months ago
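The core idea of the project is to fuse batch normalization and the following activation into a single in-place layer, recovering the intermediate values during the backward pass instead of storing them, which trades a small amount of recomputation for a sizeable reduction in activation memory. Below is a minimal usage sketch based on the project's documented drop-in pattern of replacing `BatchNorm2d` + activation with `InPlaceABN`; the exact constructor arguments (`activation`, `activation_param`) and the CUDA requirement are assumptions that may vary across package versions.

```python
import torch
import torch.nn as nn
from inplace_abn import InPlaceABN  # pip install inplace-abn


def conv_block(in_ch, out_ch):
    # Conventional block: Conv -> BatchNorm2d -> ReLU keeps an extra
    # pre-activation buffer alive until the backward pass.
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1, bias=False),
        nn.BatchNorm2d(out_ch),
        nn.ReLU(inplace=True),
    )


def conv_block_inplace_abn(in_ch, out_ch):
    # Memory-optimized block: Conv -> InPlaceABN. The fused layer
    # normalizes and applies a leaky ReLU in place, reconstructing the
    # intermediate values in backward rather than storing them.
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1, bias=False),
        InPlaceABN(out_ch, activation="leaky_relu", activation_param=0.01),
    )


if __name__ == "__main__":
    # The optimized kernels target GPU execution, so this sketch assumes
    # a CUDA device is available.
    x = torch.randn(8, 32, 64, 64, device="cuda")
    block = conv_block_inplace_abn(32, 64).cuda()
    y = block(x)
    y.mean().backward()
```

The default fused activation is a leaky ReLU rather than a plain ReLU because the backward recomputation relies on the activation being invertible.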
Related projects:

| Repository | Description | Stars |
| --- | --- | --- |
|  | A tool for optimizing deep learning models to reduce memory usage without sacrificing performance | 308 |
|  | An optimization library for reducing memory usage in PyTorch neural networks | 282 |
|  | A lightweight wrapper around PyTorch to prevent CUDA out-of-memory errors and optimize model execution | 1,823 |
|  | A deep learning method for optimizing convolutional neural networks by reducing computational cost while improving regularization and inference efficiency | 18 |
|  | A Python package for gradient-based function optimization in machine learning | 181 |
|  | Enables the creation of smaller neural network models through efficient pruning and quantization techniques | 2,083 |
|  | An implementation of an optimization algorithm for training neural networks in machine learning environments | 351 |
|  | Automated machine learning tool for tabular data pipelines | 343 |
|  | A JAX transform that simplifies the training of large language models by reducing memory usage through low-rank adaptation | 134 |
|  | Analyzes and optimizes the performance of graph neural networks using gradient boosting and various aggregation models | 13 |
|  | A programming language designed to optimize memory usage through long-term storage of data | 5 |
|  | An implementation of DARTS, a method for automatically designing neural network architectures | 443 |
|  | A PyTorch module that adds differentiable optimization as a layer to neural networks | 517 |
|  | An optimisation method that minimises the difference between FEA output and data in Abaqus models | 17 |
|  | A reinforcement learning-based system for optimizing multi-cell selection in wireless networks | 58 |