├── CONTRIBUTING.md
├── LICENSE
├── README.md
└── logo.jpg

--------------------------------------------------------------------------------
/CONTRIBUTING.md:
--------------------------------------------------------------------------------
# Contribution guidelines

1. Check whether the model is already in the list.
2. If it is not, add the model, filling in all available information.
3. Open a pull request.
--------------------------------------------------------------------------------
/LICENSE:
--------------------------------------------------------------------------------
MIT License

Copyright (c) 2020 Bala venkatesh

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
![Maintenance](https://img.shields.io/badge/Maintained%3F-YES-green.svg)
![GitHub](https://img.shields.io/badge/Release-PROD-yellow.svg)
![GitHub](https://img.shields.io/badge/Languages-MULTI-blue.svg)
![GitHub](https://img.shields.io/badge/License-MIT-lightgrey.svg)

# Hyper-parameter Tuning library

![CV logo](https://github.com/balavenkatesh3322/hyperparameter_tuning/blob/master/logo.jpg)

## What is Hyper-parameter Tuning?
Parameters that define the model architecture or control the training process (for example, the number of layers or the learning rate) are referred to as hyperparameters. Unlike ordinary model weights, they are set before training rather than learned from the data, and the process of searching for the configuration that yields the best model is referred to as hyperparameter tuning.

For instance:
- How many trees should I include in my random forest?
- How many neurons should I have in my neural network layer?
- How many layers should I have in my neural network?
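To make the first question concrete, here is a minimal grid-search sketch using scikit-learn (assumed to be installed; scikit-learn itself is not one of the tuning libraries listed below, and the candidate values are purely illustrative):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV

# Toy data set; in practice, substitute your own features and labels.
X, y = make_classification(n_samples=500, random_state=0)

# Candidate values for one hyperparameter: the number of trees.
param_grid = {"n_estimators": [50, 100, 200, 400]}

# Try every candidate with 5-fold cross-validation and keep the best.
search = GridSearchCV(RandomForestClassifier(random_state=0), param_grid, cv=5)
search.fit(X, y)

print(search.best_params_)  # e.g. {'n_estimators': 200}
```

The libraries below replace this kind of brute-force loop with smarter search strategies (random search, Bayesian optimization, evolutionary algorithms, and more), grouped here by the framework they target.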

| Library Name | Description | Framework |
| :---: | :---: | :---: |
| [Keras Tuner](https://github.com/keras-team/keras-tuner) | A hyperparameter tuner for Keras, specifically for tf.keras with TensorFlow 2.0. | `Keras` |
| [talos](https://github.com/autonomio/talos) | Talos radically changes the ordinary Keras workflow by fully automating hyperparameter tuning and model evaluation. | `Keras` |
| [hyperas](https://github.com/maxpumperla/hyperas) | A very simple convenience wrapper around hyperopt for fast prototyping with Keras models. Hyperas lets you use the power of hyperopt without having to learn its syntax. | `Keras` |

| Library Name | Description | Framework |
| :---: | :---: | :---: |
| [Auto-PyTorch](https://github.com/automl/Auto-PyTorch) | A very early pre-alpha version of the upcoming Auto-PyTorch. So far, Auto-PyTorch supports featurized data (classification, regression) and image data (classification). | `PyTorch` |
| [hypersearch](https://github.com/kevinzakka/hypersearch) | Tune the hyperparameters of your PyTorch models with HyperSearch. | `PyTorch` |
| [botorch](https://github.com/pytorch/botorch) | BoTorch is a library for Bayesian optimization built on PyTorch. | `PyTorch` |

| Library Name | Description | Framework |
| :---: | :---: | :---: |
| [tpot](https://github.com/EpistasisLab/tpot) | TPOT stands for Tree-based Pipeline Optimization Tool. Consider TPOT your data science assistant: a Python automated machine learning tool that optimizes machine learning pipelines using genetic programming. | `General` |
| [nni](https://github.com/microsoft/nni) | NNI (Neural Network Intelligence) is a lightweight but powerful toolkit to help users automate feature engineering, neural architecture search, hyperparameter tuning, and model compression. | `General` |
| [xcessiv](https://github.com/reiinakano/xcessiv) | Xcessiv holds your hand through all the implementation details of creating and optimizing stacked ensembles, so you're free to fully define only the things you care about. | `General` |
| [ray](https://github.com/ray-project/ray) | Ray is a fast and simple framework for building and running distributed applications. | `General` |
| [tune-sklearn](https://github.com/ray-project/tune-sklearn) | Tune-sklearn is a package that integrates Ray Tune's hyperparameter tuning with scikit-learn's models, allowing users to optimize hyperparameter searching for sklearn using Tune's schedulers. | `General` |
| [optuna](https://github.com/optuna/optuna) | Optuna is an automatic hyperparameter optimization software framework, particularly designed for machine learning. | `General` |
| [Hyperopt](https://github.com/hyperopt/hyperopt) | Hyperopt is a Python library for serial and parallel optimization over awkward search spaces, which may include real-valued, discrete, and conditional dimensions. | `General` |
| [scikit-optimize](https://github.com/scikit-optimize/scikit-optimize) | Scikit-Optimize, or skopt, is a simple and efficient library to minimize (very) expensive and noisy black-box functions. | `General` |
| [Ax](https://github.com/facebook/Ax) | Ax is an accessible, general-purpose platform for understanding, managing, deploying, and automating adaptive experiments. | `General` |
| [Spearmint](https://github.com/HIPS/Spearmint) | Spearmint is a software package to perform Bayesian optimization. | `General` |
| [hyperparameter_hunter](https://github.com/HunterMcGushion/hyperparameter_hunter) | Automatically saves and learns from experiment results, leading to long-term, persistent optimization that remembers all your tests. | `General` |
| [sherpa](https://github.com/sherpa-ai/sherpa) | A Python hyperparameter optimization library. | `General` |
| [auptimizer](https://github.com/LGE-ARC-AdvancedAI/auptimizer) | Auptimizer is an optimization tool for machine learning (ML) that automates many of the tedious parts of the model-building process. | `General` |
| [advisor](https://github.com/tobegit3hub/advisor) | Advisor is a hyperparameter tuning system for black-box optimization. | `General` |
| [test-tube](https://github.com/williamFalcon/test-tube) | Test tube is a Python library to track and parallelize hyperparameter search for deep learning and ML experiments. | `General` |
| [Determined](https://github.com/determined-ai/determined) | Determined helps deep learning teams train models more quickly, easily share GPU resources, and effectively collaborate. | `General` |
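
## Quick examples

For a sense of how these libraries are used, here are two brief sketches; treat both as illustrative rather than canonical usage. First, Keras Tuner (from the Keras table above), assuming `pip install keras-tuner` and a recent TensorFlow; the MNIST slice, search space, and trial count are arbitrary choices made for the sketch:

```python
import keras_tuner as kt
import tensorflow as tf

def build_model(hp):
    """Build a Keras model whose hidden-layer width is a tuned hyperparameter."""
    model = tf.keras.Sequential([
        tf.keras.layers.Flatten(input_shape=(28, 28)),
        # The question from the intro: how many neurons in this layer?
        tf.keras.layers.Dense(
            hp.Int("units", min_value=32, max_value=512, step=32),
            activation="relu"),
        tf.keras.layers.Dense(10, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

tuner = kt.RandomSearch(build_model, objective="val_accuracy", max_trials=5)

(x_train, y_train), _ = tf.keras.datasets.mnist.load_data()
x_train = x_train[:2000] / 255.0  # small slice to keep the sketch fast
y_train = y_train[:2000]

tuner.search(x_train, y_train, epochs=3, validation_split=0.2)
best_model = tuner.get_best_models(num_models=1)[0]
```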
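Second, Optuna (from the general-purpose table above), assuming a recent Optuna release where `suggest_float` is available; the quadratic objective is a hypothetical stand-in for "train a model with the suggested hyperparameters and return its validation loss":

```python
import optuna

def objective(trial):
    # One real-valued hyperparameter, sampled fresh for every trial.
    x = trial.suggest_float("x", -10.0, 10.0)
    return (x - 2) ** 2  # pretend this is a validation loss

study = optuna.create_study(direction="minimize")
study.optimize(objective, n_trials=100)

print(study.best_params)  # should land close to {'x': 2.0}
```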

## Contributions
Your contributions are always welcome!
Please have a look at `CONTRIBUTING.md`.

## License

[MIT License](LICENSE)
--------------------------------------------------------------------------------
/logo.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/balavenkatesh3322/hyperparameter_tuning/6a06c16f94dcc152a0e9ccdd32249a3c34da74a3/logo.jpg
--------------------------------------------------------------------------------