ChaCha documentation (#100)
* update readme

* naming
qingyun-wu authored Jun 4, 2021
1 parent f7cf2ea commit c8da829
Showing 4 changed files with 47 additions and 8 deletions.
10 changes: 5 additions & 5 deletions README.md
@@ -62,7 +62,11 @@ tune.run(train_with_config, config={…}, low_cost_partial_config={…}, time_bu

* For classification and regression tasks, find quality models with lower computational resources.
* Users can choose their desired customizability: minimal customization (computational resource budget), medium customization (e.g., scikit-style learner, search space and metric), full customization (arbitrary training and evaluation code).
* Allow human guidance in hyperparameter tuning to respect prior on certain subspaces but also able to explore other subspaces.
* Allow human guidance in hyperparameter tuning to respect prior on certain subspaces but also able to explore other subspaces. Read more about the
hyperparameter optimization methods
in FLAML [here](https://github.com/microsoft/FLAML/tree/main/flaml/tune). They can be used beyond the AutoML context.
And they can be used in distributed HPO frameworks such as ray tune or nni.
* Support online AutoML: automatic hyperparameter tuning for online learning algorithms. Read more about the online AutoML method in FLAML [here](https://github.com/microsoft/FLAML/tree/main/flaml/onlineml).

## Examples

@@ -122,10 +126,6 @@ Please find the API documentation [here](https://microsoft.github.io/FLAML/).

Please find demo and tutorials of FLAML [here](https://www.youtube.com/channel/UCfU0zfFXHXdAd5x-WvFBk5A)

Read more about the
hyperparameter optimization methods
in FLAML [here](https://github.com/microsoft/FLAML/tree/main/flaml/tune). They can be used beyond the AutoML context.
And they can be used in distributed HPO frameworks such as ray tune or nni.

For more technical details, please check our papers.

39 changes: 39 additions & 0 deletions flaml/onlineml/README.md
@@ -0,0 +1,39 @@
# ChaCha for Online AutoML

FLAML includes *ChaCha*, an automatic hyperparameter tuning solution for online machine learning. Online machine learning has the following properties: (1) data arrives in sequential order; and (2) the performance of the machine learning model is evaluated online, i.e., at every iteration. *ChaCha* performs online AutoML while respecting these properties of online learning, and at the same time respecting the following constraints: (1) only a small constant number of 'live' models are allowed to perform online learning at the same time; and (2) no model persistence or offline training is allowed, which means that once we decide to replace a 'live' model with a new one, the replaced model can no longer be retrieved.

For more technical details about *ChaCha*, please check our paper.

* ChaCha for online AutoML. Qingyun Wu, Chi Wang, John Langford, Paul Mineiro and Marco Rossi. To appear in ICML 2021.

## `AutoVW`

`flaml.AutoVW` is a realization of the *ChaCha* AutoML method with online learners from the open-source online machine learning library [Vowpal Wabbit](https://vowpalwabbit.org/). It can be used to tune both conventional numerical and categorical hyperparameters, such as the learning rate, and hyperparameters for featurization choices, such as namespace interactions (a namespace is a group of features) in Vowpal Wabbit.

An example of online namespace interactions tuning in VW:

```python
# require: pip install flaml[vw]
from flaml import AutoVW
# create an AutoVW instance for tuning namespace interactions
autovw = AutoVW(max_live_model_num=5, search_space={'interactions': AutoVW.AUTOMATIC})
```

An example of online tuning of both namespace interactions and learning rate in VW:

```python
# require: pip install flaml[vw]
from flaml import AutoVW
from flaml.tune import loguniform
# create an AutoVW instance for tuning namespace interactions and learning rate
# set up the search space and init config
search_space_nilr = {'interactions': AutoVW.AUTOMATIC, 'learning_rate': loguniform(lower=2e-10, upper=1.0)}
init_config_nilr = {'interactions': set(), 'learning_rate': 0.5}
# create an AutoVW instance
autovw = AutoVW(max_live_model_num=5, search_space=search_space_nilr, init_config=init_config_nilr)
```

A user can use the resulting AutoVW instance `autovw` in a similar way to a vanilla Vowpal Wabbit instance, i.e., `pyvw.vw`, to perform online learning by iteratively calling its `predict(data_example)` and `learn(data_example)` methods on each incoming data example.
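
Below is a minimal end-to-end sketch of such a loop on a synthetic data stream. It assumes, as in the AutoVW notebook, that data examples can be passed as VW-format strings; the toy regression data and the squared-loss bookkeeping are illustrative assumptions and not part of the AutoVW API.

```python
# require: pip install flaml[vw]
import random
from flaml import AutoVW

# create an AutoVW instance that tunes namespace interactions automatically
autovw = AutoVW(max_live_model_num=5, search_space={'interactions': AutoVW.AUTOMATIC})

cumulative_loss = 0.0
num_iter = 1000
for _ in range(num_iter):
    # build a toy regression example in VW text format: label |namespace feature:value
    x, y = random.random(), random.random()
    label = 2 * x + 3 * y
    data_example = f'{label} |a x:{x} |b y:{y}'
    # predict before learning, so the loss is measured online (progressive validation)
    prediction = autovw.predict(data_example)
    cumulative_loss += (prediction - label) ** 2
    autovw.learn(data_example)

print('average progressive loss:', cumulative_loss / num_iter)
```

Because each example is predicted on before it is learned from, the accumulated loss is a progressive-validation estimate of online performance.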

For more examples, please check out
[AutoVW notebook](https://github.com/microsoft/FLAML/blob/main/notebook/flaml_autovw.ipynb).
4 changes: 2 additions & 2 deletions flaml/onlineml/autovw.py
@@ -18,7 +18,7 @@ class AutoVW:
AUTO
"""
WARMSTART_NUM = 100
AUTO_STRING = '_auto'
AUTOMATIC = '_auto'
VW_INTERACTION_ARG_NAME = 'interactions'

def __init__(self,
@@ -94,7 +94,7 @@ def _setup_trial_runner(self, vw_example):
# setup the default search space for the namespace interaction hyperparameter
search_space = self._search_space.copy()
for k, v in self._search_space.items():
if k == self.VW_INTERACTION_ARG_NAME and v == self.AUTO_STRING:
if k == self.VW_INTERACTION_ARG_NAME and v == self.AUTOMATIC:
raw_namespaces = self.get_ns_feature_dim_from_vw_example(vw_example).keys()
search_space[k] = polynomial_expansion_set(init_monomials=set(raw_namespaces))
# setup the init config based on the input _init_config and search space
2 changes: 1 addition & 1 deletion notebook/flaml_autovw.ipynb
@@ -226,7 +226,7 @@
"from flaml import AutoVW\n",
"\n",
"'''create an AutoVW instance for tuning namespace interactions'''\n",
"autovw_ni = AutoVW(max_live_model_num=5, search_space={'interactions': AutoVW.AUTO_STRING})\n",
"autovw_ni = AutoVW(max_live_model_num=5, search_space={'interactions': AutoVW.AUTOMATIC})\n",
"\n",
"# online learning with AutoVW\n",
"loss_list_autovw_ni = online_learning_loop(max_iter_num, vw_examples, autovw_ni)\n",
