FAR-HO: Hyperparameter Optimizer
A gradient-based hyperparameter optimization and meta-learning package for TensorFlow.
188 stars
13 watching
47 forks
Language: Jupyter Notebook
Last commit: almost 5 years ago
Topics: gradient-descent, hyperparameter-optimization, optimization, tensorflow
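The core idea behind gradient-based hyperparameter optimization is to treat the trained weights as a differentiable function of the hyperparameters, so that the validation loss can be differentiated with respect to them. The sketch below illustrates this with a one-dimensional toy problem; it is not FAR-HO's actual API, and every function name and constant is invented for the example.

```python
# Toy illustration of gradient-based hyperparameter optimization
# (the general technique, NOT FAR-HO's API; all names/constants invented).
#
# Inner problem:  train w on  L_tr(w, lam) = (w - a)^2 + lam * w^2
# Outer problem:  pick lam to minimize  L_val(w1) = (w1 - b)^2,
# where w1 is the weight after one inner SGD step. Because w1 depends
# smoothly on lam, dL_val/dlam (the "hypergradient") follows from the
# chain rule and can drive gradient descent on lam itself.

def hyper_step(lam, w0=0.5, a=1.0, b=0.5, inner_lr=0.1, hyper_lr=10.0):
    # One inner SGD step on the training loss.
    grad_w = 2.0 * (w0 - a) + 2.0 * lam * w0   # dL_tr/dw at w0
    w1 = w0 - inner_lr * grad_w

    # Hypergradient: dL_val/dlam = dL_val/dw1 * dw1/dlam,
    # with dw1/dlam = -inner_lr * 2 * w0 from the update rule above.
    dw1_dlam = -inner_lr * 2.0 * w0
    hypergrad = 2.0 * (w1 - b) * dw1_dlam

    # Gradient descent step on the hyperparameter.
    return lam - hyper_lr * hypergrad, w1

lam = 2.0
for _ in range(50):
    lam, w1 = hyper_step(lam)
print(round(lam, 3))  # converges to lam = 1.0 for these constants
```

Reverse-mode differentiation through the inner optimization (here a single step, unrolled by hand) is the same mechanism the package applies to full TensorFlow training loops.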
Related projects:
| Repository | Description | Stars |
|---|---|---|
| | A reinforcement learning-based framework for optimizing hyperparameters in distributed machine learning environments. | 15 |
| | A collection of algorithms for hyperparameter optimization in machine learning models. | 417 |
| | A toolset for optimizing hyperparameters of machine learning models using Bayesian optimization and real-time visualization. | 136 |
| | A toolbox for efficient hyperparameter tuning in deep learning using Bayesian optimization and automatic differentiation. | 23 |
| | An optimization framework for machine learning hyperparameters. | 1,093 |
| | Automates hyperparameter optimization and result saving across machine learning algorithms. | 706 |
| | An algorithm for training self-generalizing gradient boosting machines with automatic hyperparameter optimization and improved performance on various machine learning tasks. | 321 |
| | Automates search for optimal parameters in machine learning algorithms. | 1,594 |
| | Provides tools and techniques for tuning hyperparameters in machine learning models to improve performance. | 1,283 |
| | Automates hyperparameter optimization and neural network architecture search using Hyperopt on a CNN model for the CIFAR-100 dataset. | 106 |
| | A simple wrapper around Keras and Hyperopt for convenient hyperparameter optimization. | 2,179 |
| | A tool for optimizing large language models by collecting feedback and metrics to improve their performance over time. | 1,245 |
| | An optimized version of TensorFlow to support newer hardware and libraries for NVIDIA GPU users. | 1,017 |
| | Automates model building and deployment by optimizing hyperparameters and compressing models for edge computing. | 200 |
| | Automated hyperparameter tuning and feature selection using evolutionary algorithms. | 316 |