From 363197cef8c092240f39ef61b7b8f2883d5ebb0d Mon Sep 17 00:00:00 2001
From: Qingyun Wu
Date: Fri, 30 Apr 2021 23:19:49 -0400
Subject: [PATCH] Blendsearch documentation (#81)

* blendsearch documentation

Co-authored-by: Chi Wang
---
 README.md            | 38 ++++++++++++++++++++++----------------
 flaml/tune/README.md | 10 +++++++---
 2 files changed, 29 insertions(+), 19 deletions(-)

diff --git a/README.md b/README.md
index b15b9656d5..5304234e64 100644
--- a/README.md
+++ b/README.md
@@ -21,7 +21,22 @@ and learner selection method invented by Microsoft Research.
 FLAML leverages the structure of the search space to choose a search order optimized for both cost and error. For example, the system tends to propose cheap configurations at the beginning stage of the search, but quickly moves to configurations with high model complexity and large sample size when needed in the later stage of the search. For another example, it favors cheap learners in the beginning but penalizes them later if the error improvement is slow. The cost-bounded search and cost-based prioritization make a big difference in the search efficiency under budget constraints.
 
-FLAML is easy to use:
+## Installation
+
+FLAML requires **Python version >= 3.6**. It can be installed from pip:
+
+```bash
+pip install flaml
+```
+
+To run the [`notebook example`](https://github.com/microsoft/FLAML/tree/main/notebook),
+install flaml with the [notebook] option:
+
+```bash
+pip install flaml[notebook]
+```
+
+## Quickstart
 
 * With three lines of code, you can start using this economical and fast
 AutoML engine as a scikit-learn style estimator.
@@ -43,20 +58,11 @@ from flaml import tune
 tune.run(train_with_config, config={…}, low_cost_partial_config={…}, time_budget_s=3600)
 ```
 
-## Installation
-
-FLAML requires **Python version >= 3.6**. It can be installed from pip:
-
-```bash
-pip install flaml
-```
-
-To run the [`notebook example`](https://github.com/microsoft/FLAML/tree/main/notebook),
-install flaml with the [notebook] option:
+## Advantages
 
-```bash
-pip install flaml[notebook]
-```
+* For classification and regression tasks, find quality models with lower computational resources.
+* Users can choose their desired customizability: minimal customization (computational resource budget), medium customization (e.g., scikit-style learner, search space and metric), or full customization (arbitrary training and evaluation code).
+* Allow human guidance in hyperparameter tuning to respect prior knowledge of certain subspaces while still being able to explore other subspaces.
 
 ## Examples
 
@@ -121,7 +127,7 @@ And they can be used in distributed HPO frameworks such as ray tune or nni.
 
 For more technical details, please check our papers.
 
-* [FLAML: A Fast and Lightweight AutoML Library](https://www.microsoft.com/en-us/research/publication/flaml-a-fast-and-lightweight-automl-library/). Chi Wang, Qingyun Wu, Markus Weimer, Erkang Zhu. In MLSys, 2021.
+* [FLAML: A Fast and Lightweight AutoML Library](https://www.microsoft.com/en-us/research/publication/flaml-a-fast-and-lightweight-automl-library/). Chi Wang, Qingyun Wu, Markus Weimer, Erkang Zhu. MLSys, 2021.
 
 ```
 @inproceedings{wang2021flaml,
     title={FLAML: A Fast and Lightweight AutoML Library},
     author={Chi Wang and Qingyun Wu and Markus Weimer and Erkang Zhu},
     year={2021},
     booktitle={MLSys},
 }
 ```
 * [Frugal Optimization for Cost-related Hyperparameters](https://arxiv.org/abs/2005.01571). Qingyun Wu, Chi Wang, Silu Huang. AAAI 2021.
-* [Economical Hyperparameter Optimization With Blended Search Strategy](https://www.microsoft.com/en-us/research/publication/economical-hyperparameter-optimization-with-blended-search-strategy/). Chi Wang, Qingyun Wu, Silu Huang, Amin Saied. To appear in ICLR 2021.
+* [Economical Hyperparameter Optimization With Blended Search Strategy](https://www.microsoft.com/en-us/research/publication/economical-hyperparameter-optimization-with-blended-search-strategy/). Chi Wang, Qingyun Wu, Silu Huang, Amin Saied. ICLR 2021.
 
 ## Contributing
 
diff --git a/flaml/tune/README.md b/flaml/tune/README.md
index 45e55b0c5e..ad1c57214b 100644
--- a/flaml/tune/README.md
+++ b/flaml/tune/README.md
@@ -161,10 +161,14 @@ tune.run(...
 )
 ```
 
-Recommended scenario: cost-related hyperparameters exist, a low-cost
+- Recommended scenario: cost-related hyperparameters exist, a low-cost
 initial point is known, and the search space is complex such that local search
 is prone to getting stuck at local optima.
 
+
+- Suggestion about using a larger search space in BlendSearch:
+In hyperparameter optimization, a larger search space is desirable because it is more likely to include the optimal configuration (or one of the optimal configurations) in hindsight. However, the performance (especially the anytime performance) of most existing HPO methods degrades when the cost of the configurations in the search space varies widely. Thus, small hand-crafted search spaces (with relatively homogeneous cost) are often used in practice for these methods, a workaround that depends on idiosyncratic manual choices. BlendSearch combines the benefits of local search and global search, which enables it to decide economically where to explore even when the search space is larger than necessary. This allows users to specify a larger search space for BlendSearch, which is often easier and better practice than narrowing down the search space by hand.
+
 For more technical details, please check our papers.
 
 * [Frugal Optimization for Cost-related Hyperparameters](https://arxiv.org/abs/2005.01571). Qingyun Wu, Chi Wang, Silu Huang. AAAI 2021.
@@ -178,7 +182,7 @@ For more technical details, please check our papers.
 }
 ```
 
-* [Economical Hyperparameter Optimization With Blended Search Strategy](https://www.microsoft.com/en-us/research/publication/economical-hyperparameter-optimization-with-blended-search-strategy/). Chi Wang, Qingyun Wu, Silu Huang, Amin Saied. To appear in ICLR 2021.
+* [Economical Hyperparameter Optimization With Blended Search Strategy](https://www.microsoft.com/en-us/research/publication/economical-hyperparameter-optimization-with-blended-search-strategy/). Chi Wang, Qingyun Wu, Silu Huang, Amin Saied. ICLR 2021.
 
 ```
 @inproceedings{wang2021blendsearch,
     title={Economical hyperparameter optimization with blended search strategy},
     author={Chi Wang and Qingyun Wu and Silu Huang and Amin Saied},
     year={2021},
     booktitle={ICLR'21},
 }
-```
+```
\ No newline at end of file
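A note on the Quickstart bullet added to README.md above: the hunk context ends before the three-line example it refers to. Below is a minimal sketch of what such scikit-learn style usage could look like; the `AutoML` class name and the `task`/`time_budget` arguments are assumptions based on FLAML's scikit-learn style interface, not code taken from this patch.

```python
# Hypothetical quickstart sketch; verify the names against the installed FLAML version.
from sklearn.datasets import load_iris
from flaml import AutoML

X, y = load_iris(return_X_y=True)
automl = AutoML()
# Search learners and hyperparameters for up to 60 seconds of wall-clock time.
automl.fit(X, y, task="classification", time_budget=60)
print(automl.predict(X[:5]))
```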
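Similarly, the `tune.run(...)` call shown with `{…}` placeholders in the README.md hunk, together with the larger-search-space suggestion added to flaml/tune/README.md, can be illustrated with a toy example. This is a sketch under assumptions: the helpers `tune.uniform`, `tune.lograndint`, and `tune.report`, as well as the `metric`/`mode`/`num_samples` arguments, are presumed to follow flaml.tune's ray.tune-style API, and the objective function and search space are invented for illustration.

```python
import time
from flaml import tune

def train_with_config(config):
    # Toy evaluation function: hyperparameter "x" is cost-related, i.e. larger
    # values take longer to evaluate, mimicking model complexity or sample size.
    time.sleep(config["x"] / 100000)
    loss = (config["x"] - 80000) ** 2 * config["y"]
    tune.report(loss=loss)

analysis = tune.run(
    train_with_config,
    config={
        # A deliberately wide (larger-than-necessary) search space.
        "x": tune.lograndint(lower=1, upper=100000),
        "y": tune.uniform(lower=0.1, upper=10.0),
    },
    low_cost_partial_config={"x": 1},  # the known low-cost initial point
    metric="loss",
    mode="min",
    num_samples=-1,      # keep sampling until the time budget is exhausted
    time_budget_s=60,
)
print(analysis.best_config)
```

The wide range for `x` stands in for the larger search space discussed in the added suggestion, while `low_cost_partial_config` supplies the low-cost initial point that the recommended scenario assumes, so the search can start cheap and move toward expensive configurations only as needed.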