Enhance README.md #265 (merged Aug 1, 2024)
Optuna Examples
===============

This page lists example code written with Optuna.

<details open>
<summary>Simplest Codeblock</summary>

```python
import optuna


# The objective's body is elided in the diff; a simple quadratic stands in here.
def objective(trial):
    x = trial.suggest_float("x", -100, 100)
    return x**2


if __name__ == "__main__":
    study = optuna.create_study()
    study.optimize(objective, n_trials=1000, timeout=3)
    print(f"Best params is {study.best_params} with value {study.best_value}")
```
</details>

The examples below provide codeblocks similar to the example above for various different scenarios.
> [!NOTE]
> If you are interested in a quick start of [Optuna Dashboard](https://github.com/optuna/optuna-dashboard) with in-memory storage, please take a look at [this example](./dashboard/run_server_simple.py).
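In essence, that example creates a study on in-memory storage and then serves the dashboard for it. A minimal sketch of the idea, assuming `optuna-dashboard` is installed (`pip install optuna-dashboard`); the linked file is the authoritative version:

```python
import optuna
from optuna_dashboard import run_server

# Keep everything in memory; nothing is written to disk.
storage = optuna.storages.InMemoryStorage()
study = optuna.create_study(storage=storage)
study.optimize(lambda trial: trial.suggest_float("x", -1, 1) ** 2, n_trials=100)

# Serve the dashboard for this storage (by default at http://localhost:8080/).
run_server(storage)
```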

> [!TIP]
> Couldn't find your use case?
> The [FAQ](https://optuna.readthedocs.io/en/stable/faq.html) may help you implement what you need.
> This repository also contains examples for the following scenarios (a combined sketch follows this list):
> 1. [Objective function with additional arguments](./sklearn/sklearn_additional_args.py), which is useful when you would like to pass arguments besides `trial` to your objective function.
>
> 2. [Manually providing trials to a study](./enqueue_trial.py), which is useful when you would like to force certain parameter values to be evaluated.
>
> 3. [Callback to control the termination criterion of a study](./max_trials_callback.py), which is useful when you would like to define your own termination criterion other than `n_trials` or `timeout`.
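
A minimal combined sketch of the three scenarios above (the linked files are the authoritative versions; the quadratic objective here is only illustrative):

```python
import optuna
from optuna.study import MaxTrialsCallback
from optuna.trial import TrialState


# Scenario 1: an objective that takes an argument besides `trial`.
def objective(trial, offset):
    x = trial.suggest_float("x", -10, 10)
    return (x - offset) ** 2


if __name__ == "__main__":
    study = optuna.create_study()
    # Scenario 2: force x = 2.0 to be evaluated in the first trial.
    study.enqueue_trial({"x": 2.0})
    study.optimize(
        # Bind the extra argument with a lambda.
        lambda trial: objective(trial, offset=2.0),
        n_trials=100,
        # Scenario 3: terminate after 20 completed trials, whatever n_trials says.
        callbacks=[MaxTrialsCallback(20, states=(TrialState.COMPLETE,))],
    )
```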

## Examples for Diverse Problem Setups

Below are links to the example code for each setup.

<details open>
<summary>Simple Black-box Optimization</summary>

* [Quadratic Function](./quadratic_simple.py)
* [Quadratic Multi-Objective Function](./multi_objective/quadratic_simple.py)
* [Quadratic Function with Constraints](./quadratic_simple_constraint.py)
</details>

<details open>
<summary>Multi-Objective Optimization</summary>

* [Optimization with BoTorch](./multi_objective/botorch_simple.py)
* [Optimization of Multi-Layer Perceptron with PyTorch](./multi_objective/pytorch_simple.py)
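
A multi-objective study differs from the single-objective case mainly in declaring one `direction` per objective and returning a tuple of values; a minimal sketch (not the exact contents of the linked files):

```python
import optuna


def objective(trial):
    x = trial.suggest_float("x", 0, 5)
    y = trial.suggest_float("y", 0, 3)
    # Return one value per objective.
    return x**2 + y, (x - 2) ** 2 + (y - 1) ** 2


study = optuna.create_study(directions=["minimize", "minimize"])
study.optimize(objective, n_trials=100)
# `best_trials` holds the Pareto-optimal trials rather than a single best one.
print(f"Number of trials on the Pareto front: {len(study.best_trials)}")
```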
</details>

<details open>
<summary>Machine Learning (Incl. LightGBMTuner and OptunaSearchCV)</summary>

* [AllenNLP](./allennlp/allennlp_simple.py)
* [AllenNLP (Jsonnet)](./allennlp/allennlp_jsonnet.py)
* … *(additional library entries elided in this view)*
* [Tensorflow (eager)](./tensorflow/tensorflow_eager_simple.py)
* [XGBoost](./xgboost/xgboost_simple.py)

If you are looking for an example of reinforcement learning, please take a look at the following:
* [Optimization of Hyperparameters for Stable-Baselines Agent](./rl/sb3_simple.py)

</details>

<details open>
<summary>Pruning</summary>

The following example demonstrates how to implement pruning logic with Optuna.

* [Simple pruning (scikit-learn)](./simple_pruning.py)
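
The core of such pruning logic is small: report intermediate scores to the trial and abort when the pruner asks. A condensed sketch of the idea with a synthetic training loop (see the linked file for the full scikit-learn version):

```python
import optuna


def objective(trial):
    lr = trial.suggest_float("lr", 1e-5, 1e-1, log=True)
    score = 0.0
    for step in range(100):
        score += lr * (1.0 - score)  # synthetic stand-in for one training epoch
        trial.report(score, step)  # report the intermediate value
        if trial.should_prune():  # the pruner decides based on past trials
            raise optuna.TrialPruned()
    return score


study = optuna.create_study(direction="maximize", pruner=optuna.pruners.MedianPruner())
study.optimize(objective, n_trials=50)
```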

In addition, integration modules are available for the following libraries, providing simpler interfaces to utilize pruning.

* [Pruning with Catalyst Integration Module](./pytorch/catalyst_simple.py)
* [Pruning with CatBoost Integration Module](./catboost/catboost_pruning.py)
* [Pruning with Chainer Integration Module](./chainer/chainer_integration.py)
* [Pruning with ChainerMN Integration Module](./chainer/chainermn_integration.py)
* [Pruning with FastAI Integration Module](./fastai/fastai_simple.py)
* [Pruning with Keras Integration Module](./keras/keras_integration.py)
* [Pruning with LightGBM Integration Module](./lightgbm/lightgbm_integration.py)
* [Pruning with PyTorch Integration Module](./pytorch/pytorch_simple.py)
* [Pruning with PyTorch Ignite Integration Module](./pytorch/pytorch_ignite_simple.py)
* [Pruning with PyTorch Lightning Integration Module](./pytorch/pytorch_lightning_simple.py)
* [Pruning with PyTorch Lightning Integration Module (DDP)](./pytorch/pytorch_lightning_ddp.py)
* [Pruning with Tensorflow Integration Module](./tensorflow/tensorflow_estimator_integration.py)
* [Pruning with XGBoost Integration Module](./xgboost/xgboost_integration.py)
* [Pruning with XGBoost Integration Module (Cross Validation Version)](./xgboost/xgboost_cv_integration.py)
</details>

<details open>
<summary>Samplers</summary>

* [Warm Starting CMA-ES](./samplers/warm_starting_cma.py)


If you are interested in defining a user-defined sampler, here is an example:
* [SimulatedAnnealingSampler](./samplers/simulated_annealing_sampler.py)
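
The skeleton every user-defined sampler shares: subclass `optuna.samplers.BaseSampler` and implement its three abstract methods. A toy sketch that simply falls back to random sampling (the linked file implements actual simulated annealing):

```python
import optuna


class ToyFallbackSampler(optuna.samplers.BaseSampler):
    """Delegates every parameter to random sampling; replace with your own logic."""

    def __init__(self):
        self._fallback = optuna.samplers.RandomSampler()

    def infer_relative_search_space(self, study, trial):
        return {}  # no joint (relative) sampling in this toy example

    def sample_relative(self, study, trial, search_space):
        return {}

    def sample_independent(self, study, trial, param_name, param_distribution):
        # Your sampling strategy goes here; we just delegate.
        return self._fallback.sample_independent(
            study, trial, param_name, param_distribution
        )


study = optuna.create_study(sampler=ToyFallbackSampler())
```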
</details>

<details open>
<summary>Terminator</summary>

* [Optuna Terminator](./terminator/terminator_simple.py)
* [OptunaSearchCV with Terminator](./terminator/terminator_search_cv.py)
</details>


<details open>
<summary>Visualization</summary>

* [Visualizing Study](https://colab.research.google.com/github/optuna/optuna-examples/blob/main/visualization/plot_study.ipynb)
* [Visualizing Study with HiPlot](https://colab.research.google.com/github/optuna/optuna-examples/blob/main/hiplot/plot_study.ipynb)
</details>

<details open>
<summary>Distributed Optimization</summary>

* [Optimizing on a Dask Cluster](./dask/dask_simple.py)
* [Optimizing on Kubernetes](./kubernetes/README.md)
* [Optimizing with Ray's Joblib Backend](./ray/ray_joblib.py)
</details>

<details open>
<summary>MLOps Platform</summary>

* [Tracking Optimization Process with aim](./aim/aim_integration.py)
* [Tracking Optimization Process with MLflow](./mlflow/keras_mlflow.py)
* [Tracking Optimization Process with Weights & Biases](./wandb/wandb_integration.py)
* [Optimization with Hydra](./hydra/simple.py)
</details>

<details open>
<summary>Real-World Optuna Examples by External Projects</summary>

* [Hugging Face Trainer's Hyperparameter Search](https://huggingface.co/docs/transformers/main/main_classes/trainer#transformers.Trainer.hyperparameter_search)
* [Allegro Trains](https://github.com/allegroai/trains)
* [BBO-Rietveld: Automated Crystal Structure Refinement](https://github.com/quantumbeam/BBO-Rietveld)
* [Catalyst](https://github.com/catalyst-team/catalyst)
* [CuPy](https://github.com/cupy/cupy)
* [Hydra's Optuna Sweeper Plugin](https://hydra.cc/docs/next/plugins/optuna_sweeper/)
* [Mozilla Voice STT](https://github.com/mozilla/DeepSpeech)
* [neptune.ai](https://neptune.ai)
* [OptGBM: A scikit-learn Compatible LightGBM Estimator with Optuna](https://github.com/Y-oHr-N/OptGBM)
* [Optuna-distributed](https://github.com/xadrianzetx/optuna-distributed)
* [PyKEEN](https://github.com/pykeen/pykeen)
* [RL Baselines Zoo](https://github.com/DLR-RM/rl-baselines3-zoo)
* [Hyperparameter Optimization for Machine Learning, Code Repository for Online Course](https://github.com/solegalli/hyperparameter-optimization)
</details>

> [!IMPORTANT]
> PRs to add additional real-world examples or projects are welcome!

## Running with Optuna's Docker images?

Our Docker images for most of the examples are available with tags ending in `-dev`.
For example, [PyTorch Simple](./pytorch/pytorch_simple.py) can be run via:

```bash
$ docker run --rm -v $(pwd):/prj -w /prj optuna/optuna:py3.7-dev python pytorch/pytorch_simple.py
```

Our visualization example can also be run in Jupyter Notebook by opening `localhost:8888` in your browser after executing the following:

```bash
$ docker run -p 8888:8888 --rm optuna/optuna:py3.7-dev jupyter notebook --allow-root --no-browser --port 8888 --ip 0.0.0.0 --NotebookApp.token='' --NotebookApp.password=''
```