BlendSearch documentation (#81)
* BlendSearch documentation

Co-authored-by: Chi Wang <[email protected]>
qingyun-wu and sonichi authored May 1, 2021
1 parent e5123f5 commit 363197c
Showing 2 changed files with 29 additions and 19 deletions.
38 changes: 22 additions & 16 deletions README.md
@@ -21,7 +21,22 @@ and learner selection method invented by Microsoft Research.
FLAML leverages the structure of the search space to choose a search order optimized for both cost and error. For example, the system tends to propose cheap configurations at the beginning stage of the search,
but quickly moves to configurations with high model complexity and large sample size when needed in the later stage of the search. As another example, it favors cheap learners in the beginning but penalizes them later if their error improvement is slow. The cost-bounded search and cost-based prioritization make a big difference in search efficiency under budget constraints.

FLAML is easy to use:
## Installation

FLAML requires **Python version >= 3.6**. It can be installed from pip:

```bash
pip install flaml
```

To run the [`notebook example`](https://github.com/microsoft/FLAML/tree/main/notebook),
install flaml with the [notebook] option:

```bash
pip install flaml[notebook]
```

## Quickstart

* With three lines of code, you can start using this economical and fast
AutoML engine as a scikit-learn style estimator.
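As a minimal sketch of that three-line usage (the dataset, task, and 60-second budget are illustrative assumptions, not part of this diff):

```python
from sklearn.datasets import load_iris
from flaml import AutoML

# Any scikit-learn-compatible X, y works; iris is just a stand-in.
X_train, y_train = load_iris(return_X_y=True)

automl = AutoML()  # behaves like a scikit-learn estimator
automl.fit(X_train, y_train, task="classification", time_budget=60)
print(automl.predict(X_train)[:5])
```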
@@ -43,20 +58,11 @@ from flaml import tune
tune.run(train_with_config, config={…}, low_cost_partial_config={…}, time_budget_s=3600)
```
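A more complete, runnable sketch of the `tune.run` call above; the objective function, search space, metric name, and budget below are assumptions made up for illustration rather than FLAML's own example:

```python
import time
from flaml import tune

def train_with_config(config):
    # Stand-in for real training; cost grows with n_estimators.
    loss = (config["learning_rate"] - 0.1) ** 2 + 1.0 / config["n_estimators"]
    time.sleep(0.001 * config["n_estimators"])  # simulate training cost
    return {"loss": loss}

analysis = tune.run(
    train_with_config,
    config={
        "learning_rate": tune.loguniform(1e-4, 1.0),
        "n_estimators": tune.randint(4, 513),
    },
    low_cost_partial_config={"n_estimators": 4},  # cheap place to start
    metric="loss",
    mode="min",
    time_budget_s=10,  # seconds, analogous to the 3600 above
    num_samples=-1,    # let the time budget be the stopping criterion
)
print(analysis.best_config)
```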

## Installation

FLAML requires **Python version >= 3.6**. It can be installed from pip:

```bash
pip install flaml
```

To run the [`notebook example`](https://github.com/microsoft/FLAML/tree/main/notebook),
install flaml with the [notebook] option:
## Advantages

```bash
pip install flaml[notebook]
```
* For classification and regression tasks, find quality models with less computational cost.
* Users can choose their desired customizability: minimal customization (computational resource budget), medium customization (e.g., scikit-learn-style learner, search space, and metric), or full customization (arbitrary training and evaluation code).
* Allow human guidance in hyperparameter tuning to respect prior knowledge of certain subspaces while still being able to explore other subspaces (see the sketch after this list).
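As a hedged illustration of the last bullet, `flaml.tune` accepts `points_to_evaluate`, starting points that encode prior knowledge about promising subspaces while the search stays free to explore elsewhere; the toy objective and values below are assumptions:

```python
from flaml import tune

def evaluate(config):
    # Toy objective; a real use case would train and validate a model here.
    return {"score": -((config["x"] - 2.0) ** 2) - 0.1 * config["y"]}

analysis = tune.run(
    evaluate,
    config={"x": tune.uniform(-10, 10), "y": tune.uniform(0, 10)},
    points_to_evaluate=[{"x": 2.0, "y": 1.0}],  # prior guess at a good region
    metric="score",
    mode="max",
    num_samples=50,
)
print(analysis.best_config)
```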

## Examples

@@ -121,7 +127,7 @@ And they can be used in distributed HPO frameworks such as Ray Tune or NNI.

For more technical details, please check our papers.

* [FLAML: A Fast and Lightweight AutoML Library](https://www.microsoft.com/en-us/research/publication/flaml-a-fast-and-lightweight-automl-library/). Chi Wang, Qingyun Wu, Markus Weimer, Erkang Zhu. In MLSys, 2021.
* [FLAML: A Fast and Lightweight AutoML Library](https://www.microsoft.com/en-us/research/publication/flaml-a-fast-and-lightweight-automl-library/). Chi Wang, Qingyun Wu, Markus Weimer, Erkang Zhu. MLSys, 2021.
```
@inproceedings{wang2021flaml,
title={FLAML: A Fast and Lightweight AutoML Library},
@@ -131,7 +137,7 @@ For more technical details, please check our papers.
}
```
* [Frugal Optimization for Cost-related Hyperparameters](https://arxiv.org/abs/2005.01571). Qingyun Wu, Chi Wang, Silu Huang. AAAI 2021.
* [Economical Hyperparameter Optimization With Blended Search Strategy](https://www.microsoft.com/en-us/research/publication/economical-hyperparameter-optimization-with-blended-search-strategy/). Chi Wang, Qingyun Wu, Silu Huang, Amin Saied. To appear in ICLR 2021.
* [Economical Hyperparameter Optimization With Blended Search Strategy](https://www.microsoft.com/en-us/research/publication/economical-hyperparameter-optimization-with-blended-search-strategy/). Chi Wang, Qingyun Wu, Silu Huang, Amin Saied. ICLR 2021.

## Contributing

10 changes: 7 additions & 3 deletions flaml/tune/README.md
@@ -161,10 +161,14 @@ tune.run(...
)
```

Recommended scenario: cost-related hyperparameters exist, a low-cost
- Recommended scenario: cost-related hyperparameters exist, a low-cost
initial point is known, and the search space is complex such that local search
is prone to getting stuck in local optima.


- Suggestion about using a larger search space in BlendSearch:
In hyperparameter optimization, a larger search space is desirable because it is more likely to contain the optimal configuration (or one of the optimal configurations) in hindsight. However, the performance (especially the anytime performance) of most existing HPO methods degrades when the cost of the configurations in the search space varies widely. Hand-crafted small search spaces (with relatively homogeneous cost) are therefore often used in practice for these methods, which makes the outcome subject to the idiosyncrasy of the manual choices. BlendSearch combines the benefits of local search and global search, which enables a smart (economical) way of deciding where to explore even when the search space is larger than necessary. This allows users to specify a larger search space, which is often easier and better practice than narrowing it down by hand; a sketch follows below.
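A sketch of this advice, assuming `flaml.tune` uses BlendSearch as its default search algorithm; the toy objective and the deliberately wide `max_depth` range are illustrative assumptions, not from this README:

```python
from flaml import tune

def train(config):
    # Toy stand-in for model training: larger max_depth is assumed to be
    # more expensive, and the optimum sits well inside the wide range.
    return {"loss": (config["max_depth"] - 12) ** 2 + 0.01 * config["max_depth"]}

# Instead of hand-narrowing max_depth to, say, randint(6, 16), we give the
# search the wide range plus a cheap starting point and let it budget wisely.
analysis = tune.run(
    train,
    config={"max_depth": tune.randint(1, 1024)},  # deliberately large space
    low_cost_partial_config={"max_depth": 1},     # known low-cost initial point
    metric="loss",
    mode="min",
    num_samples=100,
)
print(analysis.best_config)
```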

For more technical details, please check our papers.

* [Frugal Optimization for Cost-related Hyperparameters](https://arxiv.org/abs/2005.01571). Qingyun Wu, Chi Wang, Silu Huang. AAAI 2021.
@@ -178,7 +182,7 @@ For more technical details, please check our papers.
}
```

* [Economical Hyperparameter Optimization With Blended Search Strategy](https://www.microsoft.com/en-us/research/publication/economical-hyperparameter-optimization-with-blended-search-strategy/). Chi Wang, Qingyun Wu, Silu Huang, Amin Saied. To appear in ICLR 2021.
* [Economical Hyperparameter Optimization With Blended Search Strategy](https://www.microsoft.com/en-us/research/publication/economical-hyperparameter-optimization-with-blended-search-strategy/). Chi Wang, Qingyun Wu, Silu Huang, Amin Saied. ICLR 2021.

```
@inproceedings{wang2021blendsearch,
@@ -187,4 +191,4 @@ For more technical details, please check our papers.
year={2021},
booktitle={ICLR'21},
}
```
