nasbot

Architecture optimizer

An implementation of Neural Architecture Search with Bayesian Optimisation and Optimal Transport (NASBOT).

GitHub

133 stars
11 watching
26 forks
Language: Python
Last commit: about 6 years ago
Topics: bayesian-optimization, gaussian-processes, neural-architecture-search, optimal-transport
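
NASBOT's core idea is to run Bayesian optimisation directly over neural architectures by giving a Gaussian process a kernel built from a distance between networks (an optimal-transport distance, called OTMANN in the paper). The sketch below illustrates that loop in heavily simplified form, assuming a toy search space of layer-width sequences, a stand-in distance, and a synthetic objective; the names and functions here are illustrative and are not this repository's actual API.

```python
# Minimal sketch: Bayesian optimisation over architectures with a
# distance-based GP kernel. Everything here is a toy stand-in, not nasbot's API.
import itertools
import numpy as np

# Toy "architectures": sequences of three hidden-layer widths.
CANDIDATES = [list(w) for w in itertools.product([16, 32, 64, 128], repeat=3)]

def architecture_distance(a, b):
    """Stand-in for an optimal-transport distance between architectures."""
    return float(np.sum(np.abs(np.log2(a) - np.log2(b))))

def kernel(a, b, length_scale=2.0):
    """Exponentiated-distance kernel over architectures."""
    return np.exp(-architecture_distance(a, b) / length_scale)

def objective(arch):
    """Hypothetical validation score; a real run would train and evaluate arch."""
    target = np.array([64.0, 32.0, 16.0])
    return -float(np.sum((np.log2(arch) - np.log2(target)) ** 2))

def gp_posterior(train_x, train_y, test_x, noise=1e-6):
    """Standard GP regression posterior mean/variance under the kernel above."""
    K = np.array([[kernel(a, b) for b in train_x] for a in train_x]) + noise * np.eye(len(train_x))
    Ks = np.array([[kernel(t, a) for a in train_x] for t in test_x])
    mean = Ks @ np.linalg.solve(K, train_y)
    var = 1.0 - np.sum(Ks * np.linalg.solve(K, Ks.T).T, axis=1)  # kernel(a, a) == 1
    return mean, np.maximum(var, 1e-12)

# Bayesian-optimisation loop with a UCB acquisition over the candidate set.
rng = np.random.default_rng(0)
observed = [CANDIDATES[i] for i in rng.choice(len(CANDIDATES), size=3, replace=False)]
scores = [objective(a) for a in observed]

for _ in range(10):
    mean, var = gp_posterior(observed, np.array(scores), CANDIDATES)
    ucb = mean + 2.0 * np.sqrt(var)
    ucb[[i for i, c in enumerate(CANDIDATES) if c in observed]] = -np.inf  # skip evaluated archs
    nxt = CANDIDATES[int(np.argmax(ucb))]
    observed.append(nxt)
    scores.append(objective(nxt))

print("best architecture found:", observed[int(np.argmax(scores))])
```

In the actual method, evaluating a candidate means training the network and measuring validation performance, and the acquisition is optimised over a structured space of architectures rather than a small fixed list.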

Related projects:

| Repository | Description | Stars |
| --- | --- | --- |
| liyanghart/hyperparameter-optimization-of-machine-learning-algorithms | Provides tools and techniques for tuning hyperparameters in machine learning models to improve performance | 1,283 |
| syne-tune/syne-tune | A tool for large-scale and asynchronous hyperparameter optimization in machine learning | 393 |
| guillaume-chevalier/hyperopt-keras-cnn-cifar-100 | Automates hyperparameter optimization and neural architecture search using Hyperopt on a CNN for the CIFAR-100 dataset | 106 |
| nicholas-leonard/drmad | A toolbox for efficient hyperparameter tuning in deep learning using Bayesian optimization and automatic differentiation | 23 |
| autonomio/talos | A tool for automating hyperparameter experiments for machine learning models built with TensorFlow and Keras | 1,626 |
| rodrigo-arenas/sklearn-genetic-opt | Automated hyperparameter tuning and feature selection using evolutionary algorithms | 316 |
| microsoft/archai | Automates the search for optimal neural network configurations in deep learning applications | 468 |
| zygmuntz/hyperband | A hyperparameter tuning framework supporting multiple machine learning models and algorithms | 594 |
| gdikov/hypertunity | A toolset for optimizing hyperparameters of machine learning models using Bayesian optimization and real-time visualization | 136 |
| hyperopt/hyperopt-sklearn | Automates the search for optimal parameters in machine learning algorithms | 1,594 |
| tobegit3hub/advisor | An open-source hyperparameter tuning system for black-box optimization | 1,550 |
| huntermcgushion/hyperparameter_hunter | Automates hyperparameter optimization and result saving across machine learning algorithms | 706 |
| automl/smac3 | An optimization framework for machine learning hyperparameters | 1,093 |
| perpetual-ml/perpetual | An algorithm for training self-generalizing gradient boosting machines with automatic hyperparameter optimization | 321 |
| jmrichardson/tuneta | Automates optimization of technical indicators for machine learning models in finance | 421 |