dpwa (Distributed Learning by Pair-Wise Averaging)

An asynchronous neural network trainer: a distributed learning framework that enables peer-to-peer parameter averaging and asynchronous training of deep neural networks.
53 stars
5 watching
6 forks
Language: Python
Last commit: over 7 years ago
Topics: asynchronous-learning, deep-learning, distributed-systems, gossiping, neural-networks, python, pytorch, sgd
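The core idea named in the description, pair-wise (gossip) parameter averaging, can be sketched in a few lines. This is an illustrative toy, not the dpwa API: the `Peer` class and `gossip_step` function are hypothetical names, and real peers would exchange PyTorch model parameters over the network rather than plain Python lists.

```python
import random

class Peer:
    """A toy peer holding a flat parameter vector (hypothetical, not dpwa's API)."""
    def __init__(self, params):
        self.params = list(params)

def gossip_step(peers, rng):
    """One pair-wise averaging step: pick two distinct peers at random
    and replace both parameter vectors with their element-wise average."""
    a, b = rng.sample(peers, 2)
    avg = [(x + y) / 2.0 for x, y in zip(a.params, b.params)]
    a.params = avg
    b.params = list(avg)

# Repeated gossip steps preserve the global mean while driving
# every peer's parameters toward it.
rng = random.Random(0)  # seeded for reproducibility
peers = [Peer([0.0]), Peer([4.0]), Peer([8.0])]
for _ in range(100):
    gossip_step(peers, rng)
```

Because each step only touches two peers, no global synchronization barrier is needed, which is what makes this style of averaging a natural fit for asynchronous training.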
Related projects:
| Repository | Description | Stars |
|---|---|---|
| | A PyTorch library for decentralized deep learning across the Internet. | 2,078 |
| | An implementation of a novel neural network training method that builds and trains networks one layer at a time. | 66 |
| | A Python library for training neural networks with a focus on hydrological applications using PyTorch. | 372 |
| | A toolkit for training neural networks on the ImageNet dataset using multiple GPUs. | 402 |
| | Trains neural networks to be provably robust against adversarial examples using abstract interpretation techniques. | 219 |
| | Automates large-scale deep learning training on distributed clusters, providing fault tolerance and fast recovery from failures. | 1,302 |
| | An implementation of Neural Turing Machines in PyTorch. | 592 |
| | A PyTorch-based neural network training framework with advanced features and utilities. | 398 |
| | A Python implementation of a distributed machine learning framework for training neural networks on multiple GPUs. | 6 |
| | A library for training and evaluating neural networks with a focus on adversarial robustness. | 921 |
| | A PyTorch framework simplifying neural network training with automated boilerplate code and callback utilities. | 572 |
| | Automates the search for optimal neural network configurations in deep learning applications. | 468 |
| | A PyTorch framework for managing and automating deep learning training loops with features like hyperparameter tracking and single-file deployments. | 40 |
| | A deep learning framework on top of PyTorch for building neural networks. | 61 |
| | Trains artificial neural networks using the genetic algorithm. | 241 |